GB2495696A - Controlling the display levels of objects in a graphical user interface during movement actions


Publication number
GB2495696A
GB2495696A (application GB1116836.6A / GB201116836A)
Authority
GB
United Kingdom
Prior art keywords
display
text
displayed
display level
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1116836.6A
Other versions
GB201116836D0 (en)
Inventor
Nigel Pearce
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Promethean Ltd
Original Assignee
Promethean Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Promethean Ltd filed Critical Promethean Ltd
Priority to GB1116836.6A priority Critical patent/GB2495696A/en
Publication of GB201116836D0 publication Critical patent/GB201116836D0/en
Priority to PCT/EP2012/069377 priority patent/WO2013045708A1/en
Publication of GB2495696A publication Critical patent/GB2495696A/en
Legal status: Withdrawn (current)


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In a graphical user interface in which multiple objects are displayed, the display levels of objects that coincide as a result of movement of one of the objects may be controlled in order to meet preferences for those objects. If an object being moved passes across another, with the potential to interfere with the active use of that object, the object being moved may have its display level adjusted 114 so that it is lower than the one it may interfere with, and so passes beneath it. The effect of changing display level may depend on the movement operation being performed 102, the nature of the other displayed object 108, or the contact point of the input controlling the movement. The interface may be provided on a touchscreen or other interactive display. The arrangement allows the manipulation and movement of display objects without disturbing or obstructing other active objects in a collaborative environment.

Description

TRANSFORMING DISPLAYED OBJECTS ON A GUI
BACKGROUND TO THE INVENTION:
Field of the Invention:
The present invention relates to the control of the display of coinciding objects in a graphical user interface (GUI). The invention is particularly but not exclusively concerned with an interactive display surface arranged to detect an input at the surface for controlling displayed objects. Such a surface in use may be presented in a horizontal or vertical arrangement. The invention is particularly but not exclusively concerned with such surfaces being provided with touch inputs from a plurality of different sources.
Description of the Prior Art:
Interactive surfaces which are adapted to detect touch inputs are well-known in the art. Such an interactive surface may be arranged to have a display to display graphical information and/or images to a user. A user is able to touch the surface at a position at which an object is displayed in order to select the object, or move the touch contact across the surface in order to move the object. Similarly a touch contact may be used to draw or annotate on the display of the touch surface.
It is also known for the user interface and/or a software application running on a computer system to be controlled by the detection of predetermined gestures of an input at or near the interactive display surface.
Various implementations for such touch sensitive surfaces are well-known in the art, such as in handheld electronic devices such as mobile phones or personal data assistants (PDAs). On a larger scale, such touch surfaces are also known as part of interactive display systems, such as electronic whiteboards. More recently, touch sensitive display surfaces have been shown as being used for interactive tables, where the display surface is disposed in a horizontal plane as a table surface.
It is also known in the art of touch sensitive display surfaces to include such surfaces in a collaborative input system, to allow for multiple users to interact with the touch sensitive display simultaneously. In practice multiple inputs can be received from a single user, as well as from a plurality of users. The interactive touch sensitive surface is adapted to be responsive to touch inputs in general, and thus is responsive to a plurality of touch inputs.
With an interactive display surface arranged as a table adapted for collaborative working, the ability to detect multiple inputs allows multiple users to be positioned around the interactive display surface and work simultaneously. Each user may work in their own defined area or with their own defined application, but also users may work collaboratively, swapping information between them.
In such collaborative working environments it is advantageous for users to be able to share tools and/or content. This may be achieved by moving a displayed object about the interactive display surface between user positions. As there may be multiple displayed objects on an interactive display surface, associated with one or more users, as displayed objects are moved around they may intersect other displayed objects. This may create a problem in obscuring displayed objects which are being actively used.
It is an aim of the present invention to provide an improved graphical user interface. The invention may be advantageously, but not exclusively, implemented in an input system comprising an interactive display surface for detecting inputs for controlling displayed objects.
SUMMARY OF THE INVENTION:
The invention provides a method for controlling a graphical user interface in which multiple objects are displayed, the method comprising: in dependence on positional manipulation of a displayed object resulting in at least part of the displayed object coinciding with another displayed object, controlling a display level of one object to meet a display preference for the objects.
The display level of the displayed object being positionally manipulated may be controlled. The method may further comprise: determining if the object being positionally manipulated has a display level higher than the other object; and if so adjusting the display level of the object being positionally manipulated to a lower display level than the other object. The determining step may be enabled if the positional manipulation is a movement of the displayed object between positions. The determining step may be enabled if the movement is responsive to a throw gesture or a drag operation.
The method may further comprise: determining if the object being positionally manipulated has a display level lower than the other object; and if so adjusting the display level of the object being positionally manipulated to a higher display level than the other object. The determining step may be enabled if the positional manipulation is a resizing or rotating of the displayed object. The determining step may be enabled if the positional manipulation is a movement of the displayed object between positions under a drop operation.
The controlling step may be dependent upon a criterion.
The criteria may be whether the object being moved is coincident with any part of the other object. The criteria may be whether the contact point of an input controlling the movement of the object being moved coincides with any part of the other object.
The other displayed object may be a static object. The other displayed object may be actively selected.
The interactive display surface may be adapted to detect multiple inputs. The object being moved and the other object may be associated with different inputs. The object being moved and the other object may be currently selected by an input. Each input may be associated with a user.
The method may comprise adjusting the object being moved to the lowest display level on detection of movement of the object.
The method may further comprise the step, in dependence on the other object being a moving object, of maintaining the display of the objects in accordance with their respective default settings.
The method may further comprise disabling the controlling step on determination that an object being moved is no longer being moved.
A computer program comprising computer program code may be adapted, when run on a computer, to perform the method as defined.
A computer program product for storing computer program code may be adapted, when run on a computer, to perform the method as defined.
The invention provides a controller for a display in which multiple objects are displayed on a graphical user interface of the display, the controller being adapted, in dependence on an object being positionally manipulated to coincide with another displayed object, to control a display level of one object to meet a display preference for the objects.
The controller may be adapted to control the display level of the object being positionally manipulated. The controller may further be adapted to: determine if the object being positionally manipulated has a display level higher than the other object; and if so adapted to adjust the display level of the object being positionally manipulated to a lower display level than the other object.
The controller may be adapted to enable the determination if the positional manipulation is a movement of the displayed object between positions. The controller may be adapted to enable the determination if the movement is responsive to a throw gesture or a drag operation.
The controller may be further adapted to: determine if the object being positionally manipulated has a display level lower than the other object; and if so adapted to adjust the display level of the object being positionally manipulated to a higher display level than the other object.
The controller may be further adapted to enable the determination if the positional manipulation is a resizing or rotating of the displayed object.
The controller may be further adapted to enable the determination if the positional manipulation is a movement of the displayed object between positions under a drop operation.
The controller may be further adapted such that the controlling step is dependent upon a criterion.
The controller may be further adapted such that the criteria is to determine when the object being moved is coincident with any part of the other object.
The controller may be further adapted such that the criteria is to determine when the contact point of an input controlling the movement of the object being moved coincides with any part of the other object.
The controller may be further adapted such that the other displayed object is a static object.
The controller may be further adapted such that the other displayed object is actively selected.
The controller may be further adapted such that the interactive display surface is adapted to detect multiple inputs.
The controller may be further adapted such that the object being moved and the other object are associated with different inputs.
The controller may be further adapted such that the object being moved and the other object are currently selected by an input.
The controller may be further adapted such that each input is associated with a user.
The controller may be further adapted to adjust the object being moved to the lowest display level on detection of movement of the object.
The controller may be further adapted, in dependence on the other object being a moving object, to maintain the display of the objects in accordance with their respective settings.
The controller may be further adapted to disable the controlling step on determination that an object being moved is no longer being moved.
A computer system may include a display and a controller as defined.
An interactive display system may include a controller as defined.
A touch sensitive interactive display surface may include a horizontally oriented interactive display surface and a controller as defined.
The invention provides an apparatus for controlling a graphical user interface in which multiple objects are displayed, the apparatus comprising: in dependence on positional manipulation of a displayed object resulting in at least part of the displayed object coinciding with another displayed object, means for controlling a display level of one object to meet a display preference for the objects.
The display level of the displayed object being positionally manipulated may be controlled.
The apparatus may further comprise: means for determining if the object being positionally manipulated has a display level higher than the other object; and if so means for adjusting the display level of the object being positionally manipulated to a lower display level than the other object.
The means for determining may be enabled if the positional manipulation is a movement of the displayed object between positions.
The means for determining may be enabled if the movement is responsive to a throw gesture or a drag operation.
The apparatus may further comprise: means for determining if the object being positionally manipulated has a display level lower than the other object; and if so means for adjusting the display level of the object being positionally manipulated to a higher display level than the other object.
The means for determining may be enabled if the positional manipulation is a resizing or rotating of the displayed object.
The means for determining may be enabled if the positional manipulation is a movement of the displayed object between positions under a drop operation.
The controlling step may be dependent upon a criterion.
BRIEF DESCRIPTION OF THE FIGURES:
The invention is described by way of example with reference to the figures, in which:
Figure 1 illustrates an exemplary scenario in which embodiments of the invention may be implemented;
Figures 2(a) to 2(c) illustrate the movement of a displayed object on a graphical user interface (GUI) of an interactive display system;
Figure 3 illustrates a process in accordance with an embodiment of the invention;
Figure 4 illustrates a process in accordance with a further embodiment of the invention;
Figure 5 illustrates functional elements for implementing the methods in accordance with embodiments of the invention; and
Figure 6 illustrates an exemplary display system in which embodiments of the present invention may be utilised.
DESCRIPTION OF THE PREFERRED EMBODIMENTS:
The invention is described by way of reference to various examples, embodiments, and advantageous applications. One skilled in the art will appreciate that the invention is not limited to the details of any described example, embodiment or application. In particular the invention may be described with reference to exemplary interactive display systems. One skilled in the art will appreciate that the principles of the invention are not limited to any such described systems.
The invention is described by way of reference to examples of interactive display surfaces or systems. The invention is not however limited to interactive display systems or surfaces and in general is applicable to any system in which two or more objects are displayed on a GUI, and at least one displayed object can be positionally manipulated.
The term 'manipulate', and its derivatives, in the context of a displayed object refers to altering the position, size or angle (orientation) of the object by moving the object, resizing the object, or rotating the object. This object manipulation may be more specifically referred to as 'positional manipulation' of an object.
The invention is described herein with reference to a touch sensitive interactive display surface for collaborative working. The invention is particularly described in the context of such a surface provided as a horizontal, or 'table-top', surface. However the invention is not limited to any such specific arrangement.
The invention is not limited to touch sensitive interactive display surfaces, nor is it limited to horizontally disposed interactive display surfaces.
The invention is not limited to any particular type of touch sensitive technology, nor to any particular type of display technology. In examples, the display of the touch sensitive surface may be provided by a projector projecting images onto the touch sensitive surface. In other examples the display may be provided by the touch sensitive surface being an emissive surface. Various other options exist as will be understood by one skilled in the art. In general the surface is described herein as a touch sensitive surface, which may have images projected thereon (e.g. by a projector) or which may also be an emissive display surface.
Figure 1 shows an arrangement for collaborative working at a horizontal interactive display surface, providing an exemplary implementation scenario for embodiments of the present invention.
Multiple users 6, 8, 10 are positioned around a horizontally disposed interactive display surface 4 of a table-top type structure 2. Each user 6, 8, 10 has in front of them a working area 7, 9, 11 respectively, each comprising a displayed object. The working areas 7, 9, 11 may be defined by the display of an application on the interactive display surface. In general the working area is an area in which a user is able to provide inputs at the interactive display surface, and may be a physical area within which multiple displayed items such as displayed windows may be provided for interaction. For the purpose of the described embodiments it is assumed that the working areas 7, 9, 11 are windows associated with applications being used by the respective users.
In addition, as shown in Figure 1, other displayed objects 12, 14 may be positioned on the display surface, which may be associated with one or more of the users, or with an application running on a computer system with which the interactive display surface is associated.
In the example arrangement of Figure 1 the working areas may be defined by the display of an application on the interactive display surface. Some users may have multiple windows, associated with the different applications or different windows within a single application, or windows comprising tools and windows comprising content.
In collaborative tasks, for example, users may share or interchange content and/or tools. An example is illustrated in Figures 2(a) to 2(c). As shown in Figure 2(a), a displayed object 24 is positioned in a portion of the interactive display surface 4 in front of a user 20, such that the user may interact with the displayed object 24. A second user 22 is positioned distant from the first user, in the illustrated example diagonally opposite the first user. Two further displayed objects 26 and 28 are positioned on the interactive display surface.
In the described example, the first user 20 positionally manipulates the displayed object 24 by moving it to the second user 22. In accordance with known techniques, the displayed object 24 may be moved by the user 20 selecting the object 24 by placing a finger on the interactive display surface within the area occupied by the displayed object. The user 20 may then drag their finger along the surface until the displayed object 24 is in a new, desired position. Alternatively the user may move their finger in accordance with a predetermined gesture to 'throw' the displayed object in a particular direction. The mechanism by which a displayed object is controlled to be moved is not relevant to the invention, and the invention is not limited to any specific technique. All techniques for moving a displayed object, or more generally positionally manipulating a displayed object, are encompassed within the scope of the invention. However, as will be discussed further hereinbelow, particular embodiments of the invention may be associated with particular techniques.
As illustrated in Figure 2(a), the first user 20 selects the object 24 for moving to another area of the display surface adjacent the user 22. As illustrated in Figure 2(a), the user 20 touches the object at a point 25 and moves it in the direction of the arrow or vector 30 toward the second user 22.
As illustrated in Figure 2(b), as the displayed object 24 is moved toward the user 22, it may intersect with other displayed objects. Figure 2(b) shows the object 24 intersecting the displayed object 26 as it is moved in a direction toward user 22 as denoted by arrow 32. Point 27 denotes the point of contact of the object with the user's finger, in this embodiment, as the finger is moved across the surface.
As illustrated in Figure 2(c), the displayed object 24 is moved to its final position in front of the user 22. The user's finger is released at point 29, and the arrow and dashed line 34 denote the path through which the finger, and the displayed object, have travelled.
It is well-known in display systems that displayed objects have display levels, such that in the event that two or more displayed objects (or portions of two or more displayed objects) coincide, the object (or portion of an object) which is displayed is determined by which of the coinciding objects has the highest display level or priority. The object with the highest display level or priority is displayed, and the other object is obscured.
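The display-level behaviour described above can be sketched as follows. This is a minimal, illustrative Python model (the class and function names are assumptions, not taken from the patent), in which the coinciding object with the highest display level is the one shown at any given point:

```python
from dataclasses import dataclass

@dataclass
class DisplayObject:
    name: str
    x: float
    y: float
    w: float
    h: float
    display_level: int  # higher value = drawn on top

    def contains(self, px, py):
        # Axis-aligned bounding-box hit test
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def visible_object_at(objects, px, py):
    """Return the object shown at (px, py): of the coinciding objects,
    the one with the highest display level, or None if no object is there."""
    hits = [o for o in objects if o.contains(px, py)]
    return max(hits, key=lambda o: o.display_level) if hits else None

a = DisplayObject("static", 0, 0, 10, 10, display_level=1)
b = DisplayObject("moving", 5, 5, 10, 10, display_level=2)
print(visible_object_at([a, b], 7, 7).name)  # prints "moving": the higher level wins
```

Only the parts of the objects that directly coincide are affected: at a point covered by only one object, that object is shown regardless of level.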
Preferably only the parts of the coinciding objects which directly coincide are processed in this way.
In the arrangement as shown in Figure 2(b), as the displayed object 24 moves across the display surface 4 it becomes coincident with the display of other objects, such as the displayed object 26. If those other objects are in use, and the moving object has a higher display level, the moving object may obscure all or part of an in-use static object.
This is disadvantageous, as it is more likely that a static object is in use (being viewed) than a moving object. In general a moving object is being moved from one position to another and is not required to be viewed as it is moved.
The preferred arrangement of the invention provides a technique, an embodiment of which is exemplified by the flow process of Figure 3, in which a first object such as a moving object (e.g. object 24) is selectively prevented from obscuring another object, such as a static object (e.g. object 26). One object (e.g. a static object) is thereby selectively protected from being obscured by another object (e.g. a moving object), even when the display level or priority level of the moving object is higher than that of the static object.
With reference to Figure 3, a preferred process implementation is now described.
In a step 100, if an object is detected as moving, then a process in accordance with embodiments of the invention is enabled for that object. In a step 102 a current mode of operation is determined, if appropriate. In the described example, the implementation of the process is dependent upon a mode of operation, and therefore the implementation of the invention is selective. In other embodiments the process may be implemented in a fixed manner responsive to detection of a moving object.
In the described embodiment, the system may be in one of three modes of operation on detection of a moving object: (i) throw mode; (ii) drag mode; or (iii) drag and drop mode.
The mode may be selected by a user, or may be determined by the current state of an application with which the displayed object is associated. Other mechanisms may determine the mode of operation.
If it is determined in step 102 that the mode of operation is throw mode, then the process moves on to step 104. In a throw mode of operation, the movement of the displayed object is controlled by detection of a user input contact on the surface at a position where the displayed object is displayed to select the object. A gesture to throw the displayed object to another position on the interactive display surface is then made. This gesture may be a short movement of the finger on the surface in the direction of intended movement. The length of movement between the start and end points of the gesture, and/or the speed of movement between the start and end points, may determine the distance by which the displayed object is moved in the given direction.
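The relationship between gesture length, speed and throw distance described above might be sketched as follows; the formula and scale factor are illustrative assumptions, not prescribed by the patent:

```python
import math

def throw_target(start, end, duration_s, scale=0.5):
    """Given the start/end points of a short throw gesture and its duration,
    return the position the thrown object travels to. The travel distance
    grows with both the gesture's length and its speed."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0 or duration_s <= 0:
        return end  # degenerate gesture: no throw
    speed = length / duration_s        # e.g. pixels per second
    distance = length + scale * speed  # longer or faster gesture -> further
    ux, uy = dx / length, dy / length  # unit vector in the gesture direction
    return (end[0] + ux * distance, end[1] + uy * distance)

# A 40-pixel rightward flick made in 0.1 s carries the object well beyond
# the gesture's end point, in the same direction.
print(throw_target((0, 0), (40, 0), 0.1))  # prints (280.0, 0.0)
```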
If it is determined in step 102 that the mode of operation is drag mode, then the process moves on to step 104. In a drag mode of operation, the movement of the displayed object is controlled by detection of a user input contact on the surface at a position where the displayed object is displayed to select the object, followed by continuous movement of the finger along the surface. The movement of the object ends, and the object is positioned in a corresponding new location, at the position that the contact with the surface is released.
If it is determined in step 102 that the mode of operation is drag and drop mode, then the process moves on to step 106. In a drag and drop mode of operation, the movement of the displayed object is controlled by detection of a user input contact on the surface at a position where the displayed object is displayed to select the object, followed by continuous movement of the finger along the surface. The movement of the object ends, and the object is positioned in a corresponding new location, at the position that the contact with the surface is released, similar to a drag mode of operation. However in a drag and drop mode of operation, the object may be dragged over another object and the contact with the surface released by the user, such that the moved object is dropped onto that other object.
This may allow, for example, an object to be dropped into a container object.
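A drop into a container object on release of the contact could look like the following illustrative sketch (the names and data layout are assumptions):

```python
def on_release(dropped, containers, contact_point):
    """On release of the contact in drag and drop mode, return the container
    whose bounds contain the contact point, after adding the dropped object
    to it; return None if the object was not dropped onto any container."""
    px, py = contact_point
    for c in containers:
        rx, ry, rw, rh = c["bounds"]
        if rx <= px <= rx + rw and ry <= py <= ry + rh:
            c["children"].append(dropped)
            return c
    return None

tray = {"name": "tray", "bounds": (0, 0, 20, 20), "children": []}
result = on_release("object 24", [tray], contact_point=(5, 5))
print(result["name"], tray["children"])  # prints: tray ['object 24']
```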
The above-described operations relate to a touch sensitive interactive display surface where an object is selected and manipulated by a finger contact on the display surface. One skilled in the art will understand how an object is selected and manipulated in other types of input system, for example where inputs are received from a mouse.
Depending on which mode of operation is enabled, as denoted by steps 104 and 106 a different comparison step dependent upon a distinct criterion is made before the process continues in a common step 108.
In step 104 it is determined if the moving object is coincident with a further object. The result of this step will be affirmative if any portion of the moving object overlaps the other object. In dependence on an affirmative result, the process moves to step 108. Otherwise the process returns to step 100.
In step 106 it is determined if the contact point on the moving object associated with the input device is coincident with a further object. The result of this step will be affirmative only if the contact point itself overlaps the other object. In dependence on an affirmative result, the process moves to step 108. Otherwise the process returns to step 100.
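The two mode-specific coincidence tests of steps 104 and 106 can be sketched as follows. This is an illustrative sketch only: the `Rect` bounding-box representation, the function names and the use of axis-aligned rectangles are assumptions, not the actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned bounding box of a displayed object (hypothetical helper)."""
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other: "Rect") -> bool:
        # True if any portion of the two rectangles intersects (step 104 criteria).
        return (self.x < other.x + other.w and other.x < self.x + self.w and
                self.y < other.y + other.h and other.y < self.y + self.h)

    def contains(self, px: float, py: float) -> bool:
        # True if the point lies inside the rectangle (step 106 criteria).
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def is_coincident(mode: str, moving: Rect, contact: tuple, other: Rect) -> bool:
    """Apply the mode-specific coincidence test of steps 104/106."""
    if mode == "drag":
        return moving.overlaps(other)        # any overlap of the moving object counts
    if mode == "drag_and_drop":
        return other.contains(*contact)      # only the finger contact point counts
    raise ValueError(f"unknown mode: {mode}")
```

Thus in drag mode a partial overlap of the object bodies is sufficient, whereas in drag and drop mode the moving object may overlap another object without being "coincident" until the contact point itself passes over it.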
In step 108 it is then determined whether the further object, with which the moving object has been determined as being coincident in dependence upon the relevant criteria, is static. If it is determined in step 108 that the further object is moving then the process returns to step 100. In the event that both coincident objects are moving, then in accordance with this embodiment the display of neither object is given priority over the other, and the default display priorities for each object are not overridden.
If it is determined in step 108 that the further object is static, it is then optionally determined in step 110 whether the further object is active. The further object may be determined as active if it is associated with a currently active application.
If it is determined in optional step 110 that the further object is not active then the process returns to step 100. If the object with which the moving object is coincident is not active, then in an embodiment there may be no requirement to control the display priorities to prevent it being obscured.
If it is determined in step 110 that the further object is active, or if step 110 is not implemented, then it is determined in step 112 whether the display level of the moving object is greater than or equal to the display level of the other object.
In accordance with the invention, as denoted by step 114, if the display level of the moving object is greater than or equal to the display level of the further object, the display level of the moving object is changed to be less than the display level of the further object.
If in step 112 it is determined that the display level of the moving object is not greater than or equal to the display level of the further object, then the process reverts to step 100 in this embodiment.
In step 116, it is then determined if the object is still coincident with the further object in accordance with the appropriate criteria for the mode of operation, according to earlier steps 104 and 106.
If the appropriate criteria is still met, then in step 118 it is determined if the object is still moving, and as long as it is still moving the test of step 116 is maintained. As long as the moving object is coincident with the further object according to the relevant criteria of steps 104 and 106, the display level of the moving object is maintained at the lower level.
Once the moving object is determined to be no longer coincident with the further object according to the appropriate criteria, as determined in step 116, or if the moving object is detected as no longer moving, as determined by step 118, the process moves on to step 120.
In step 120 the display level of the moving object is returned to its correct level, according to the level associated with the moving object prior to step 114.
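Steps 112 to 120 amount to a temporary demotion of the moving object's display level, with the default level remembered so it can be restored. A minimal sketch, assuming objects are simple dictionaries with an integer `level` key where higher values are drawn on top (both assumptions for illustration):

```python
def manipulate_over(moving: dict, other: dict):
    """Sketch of steps 112-114: if the moving object is at or above the level
    of the static further object, force it below, remembering its default.
    Returns the saved default level, or None if no change was needed."""
    saved = None
    if moving["level"] >= other["level"]:       # step 112 comparison
        saved = moving["level"]                  # remember default level
        moving["level"] = other["level"] - 1     # step 114: drop below the other
    return saved

def end_manipulation(moving: dict, saved):
    """Step 120: return the display level to its pre-manipulation value."""
    if saved is not None:
        moving["level"] = saved
```

While the coincidence persists (the loop of steps 116 and 118), the lowered level is simply left in place; `end_manipulation` is called once coincidence or movement ends.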
It will be understood that in the described example with reference to Figure 3 the movement of an object between positions is an example of a positional manipulation of the object. Alternative positional manipulations of an object, which could result in it becoming coincident with another object, are resizing of the object and rotating of the object.
In the described embodiment, the aim is to prevent an object being positionally manipulated from obscuring or inhibiting the display of at least part of a further object.
In alternative arrangements the aim may be to prevent a static object from obscuring or inhibiting the display of an object being positionally manipulated. For example, if an object is being resized or rotated, it may be desirable to ensure it remains visible whilst the manipulation takes place, so that the manipulation can be performed accurately.
An alternative embodiment to the exemplary process of Figure 3 is described now with reference to the flow process of Figure 4.
In a step 150 it is determined if an object is moving.
Responsive to detection of a moving object, in step 152 the display level of the moving object is set to the lowest available display level.
In step 154 the state of the object is monitored to determine if it is moving. When it is detected in step 154 that the object is no longer moving, in step 156 the display level of the object is restored to its original level.
Thus in the embodiment of Figure 4 the display level of the moving object may be temporarily set to the lowest possible display level as soon as it is detected as moving, and restored to its default level when it is detected as having stopped moving.
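The simpler Figure 4 process can be sketched as a single motion-change handler. The dictionary representation, the `LOWEST_LEVEL` constant and the function name are illustrative assumptions:

```python
LOWEST_LEVEL = 0  # assumed bottom of the display-level range

def on_motion_change(obj: dict, moving: bool):
    """Figure 4 sketch: sink the object to the lowest available level while it
    is moving (step 152) and restore its stored default when it stops (step 156)."""
    if moving:
        obj.setdefault("default_level", obj["level"])  # remember original level once
        obj["level"] = LOWEST_LEVEL
    else:
        # pop returns the saved default if one was stored, else leaves level as-is
        obj["level"] = obj.pop("default_level", obj["level"])
```

Unlike the Figure 3 process, no comparison against other objects is needed: every moving object is unconditionally displayed beneath everything else.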
In an embodiment, if two objects having the same display level coincide, and only one of the objects is moving, then the static object is preferably displayed.
Such an embodiment may not be preferable in a drag and drop mode of operation, in which instance it may be preferable for the moving object to be displayed over the 'top' of any static object, to allow selection of the container into which the moving object is to be placed. In that case the display or priority level of the moving object may be set higher than that of the static object.
Thus the display level of the moving object set in step 114 may be higher or lower than that of the coincident static object. This may be in dependence on the type of positional manipulation (movement, re-sizing, rotation) or the sub-type of positional manipulation (e.g. drag or drag and drop).
One skilled in the art will appreciate that dependent upon the implementation, the moving object and the object with which it coincides, according to any criteria, may be controlled such that one or the other has its display level adapted to ensure the appropriate one of the objects is displayed for the implementation.
In an embodiment, in a throw mode of operation as described above a moving object is preferably always set at a lower display priority than a further object with which it coincides.
In an embodiment, in a drag mode of operation (or a drag and drop mode of operation) a moving object is preferably only set at a lower display priority than a further object with which it coincides if the contact point with which the object is being dragged coincides with that further object.
In an embodiment, during drag and drop operation, if a displayed object is dragged such that a finger making contact with the interactive display surface traces a path through the 'gaps' between other displayed objects, then the dragged object is preferably set at a display level lower than the others if part of it is coincident with the other objects. However if during the dragging operation the finger contact point moves over one of the other displayed objects, then the dragged displayed object preferably has its display setting changed such that it is displayed on top, suggesting a drop action.
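The drag and drop preference just described can be sketched as a small decision function. Integer display levels with higher values drawn on top are an assumed convention, and the function name is illustrative:

```python
def target_level(dragged_level: int, other_level: int,
                 contact_over_other: bool, part_over_other: bool) -> int:
    """Choose the dragged object's display level during a drag and drop:
    contact point over another object -> show on top (suggesting a drop);
    only a partial body overlap -> sink below so the other stays visible."""
    if contact_over_other:
        return max(dragged_level, other_level + 1)   # display above the drop target
    if part_over_other:
        return min(dragged_level, other_level - 1)   # keep the other object visible
    return dragged_level                              # no coincidence: unchanged
```

This captures the switch in preference: the same dragged object is demoted while skirting other objects, then promoted the moment its contact point enters a potential drop target.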
In an embodiment, an object being resized is preferably set at a higher display level than another object if at least part of it coincides with the other object.
In an embodiment, an object being rotated is preferably set at a higher display level than another object if at least part of it coincides with the other object.
Thus the enablement of the techniques in accordance with the invention and its embodiments is preferably selective, dependent upon an implementation.
Whilst in the preferred described embodiments the display level of the moving object is controlled to be set lower than an object with which it coincides, in other embodiments the display level of the moving object is controlled to be set higher than an object with which it coincides.
In an alternative arrangement the display setting of the further object with which the moving object coincides may be changed in order for the appropriate one of the objects to be displayed 'on top' in accordance with the implementation.
In general, in dependence upon positional manipulation of a displayed object resulting in at least part of the object coinciding with another displayed object, the display level of one or other of the objects is controlled to meet a display preference for the objects. The display preference may be that the object being positionally manipulated should be displayed in preference to the other object, or vice versa, as discussed above.
With reference to Figure 5 there is illustrated the functional components of a computer system associated with an interactive display system required to support the functionality associated with one or more embodiments of the invention. Only those components that are relevant to understanding the invention and its embodiments are illustrated. One skilled in the art will appreciate that additional functional elements will be required to fully implement an appropriate computing system. The described functional components may be implemented in a variety of ways, for example on a computer system associated with the interactive display surface, such as a personal computer, laptop computer or other portable or handheld computing device, or integrated with the interactive display surface itself.
The computing system includes a processor 516; an interactive display system (IDS) driver 506; an interactive display system (IDS) input detection circuit 508; a coordinate detection circuit 514; a display driver 510; a display object generator 512; a display level temporary store 520; a display level controller 518; a display level comparator 520; a first object data store 524; and a second object data store 526.
Each of the first and second object data stores 524 and 526 includes a respective displayed object identifier 528 and 530; and a respective characteristics data store 532 and 534, each of which includes a display level data store 536 and 538.
The interactive display surface driver 506 receives data representing detected inputs (such as touch inputs) at the interactive display surface, and forwards such inputs to the interactive display surface input detection circuit 508. The interactive display surface input detection circuit 508 provides the received input data to the coordinate detection circuit 514 to determine coordinate information for each input, such as each touch input.
The coordinate detection circuit 514 provides coordinate information associated with each detected input to the processor 516. The object data stores 524 and 526 represent the object data stores for two objects which are being compared, and there may be other object data stores associated with other objects displayed on the interactive display surface. The comparator 520 compares the current display level from the display level stores 536 and 538 of the two objects, and generates a result to the display level controller 518. In dependence thereon, dependent upon a criteria being met, the display level controller 518 controls the display level stores 536 and 538 to selectively adjust one of the display levels by reducing or increasing the default display level for that object. The default display level which is changed is then stored in display level temporary store 510 under the control of display level controller 518. Subsequently, when the display level of a given object is returned to its default state, the display level controller 518 retrieves the default display level from the store 510, and enters it into the appropriate display level store 536 or 538.
The displayed object generator 512 operates under control of the processor 516 to generate the appropriate information for the display driver 510 to display graphical representations of the objects on the display. The displayed object generator 512 thus receives object data from the object data stores 524 and 526, including the current display levels of the respective objects.
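The interplay between the comparator, the display level controller and the temporary store can be sketched as a small stateful class. The class and method names, and the dictionary object representation with `id` and `level` keys, are illustrative assumptions rather than the actual circuit implementation:

```python
class DisplayLevelController:
    """Sketch of the Figure 5 comparator/controller/temporary-store interplay."""

    def __init__(self):
        self._temporary_store = {}  # object id -> saved default display level

    def compare_and_adjust(self, manipulated: dict, other: dict):
        # Comparator role: compare the two display level stores; controller
        # role: lower the manipulated object if it is at or above the other,
        # saving its default level in the temporary store first.
        if manipulated["level"] >= other["level"]:
            self._temporary_store[manipulated["id"]] = manipulated["level"]
            manipulated["level"] = other["level"] - 1

    def restore(self, obj: dict):
        # Return the object's level to the default held in the temporary store.
        if obj["id"] in self._temporary_store:
            obj["level"] = self._temporary_store.pop(obj["id"])
```

The temporary store is what allows the default display level to survive while the object's working level is overridden, mirroring the role described for store 510/520 above.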
The invention and its embodiments are broadly applicable to any computer system including a display, and in which an object displayed on a graphical user interface on the display can be positionally manipulated under the control of one or more user inputs.
In Figure 6, there is illustrated an exemplary environment in which the principles of the present invention may be utilised in an interactive display system. Figure 6 illustrates an interactive display system generally denoted by reference numeral 400, including a touch-sensitive display system 401, and a computer 408.
The touch-sensitive display system 401 includes a touch-sensitive display surface 402, on which is displayed graphical elements such as displayed objects as described herein in various embodiments. Also illustrated in Figure 6 is a hand 404 of a user being used at the touch-sensitive display surface 402 to provide inputs.
The computer 408 is connected to the touch-sensitive display system 401 via a connection 416, which may be wireless or wired. Computer software, including applications running on the computer 408, controls the display of graphical information on the touch-sensitive display surface 402, and is controlled by inputs detected at the touch-sensitive display surface 402. The computer 408 may be provided with its own monitor 410, which may display the same information as displayed on the interactive display surface 402 or different information. As illustrated in Figure 6, the computer 408 may also be connected to other types of input device, such as a mouse input device 412 and a keyboard input device 414, and receive and process inputs from such devices.
An interactive display system 400 as illustrated in Figure 6 may be provided in a classroom environment for educational purposes.
The touch-sensitive display system 401 may be provided as a large vertical display surface, which is viewable by a large number of users in, for example, a classroom. The touch-sensitive display system 401 may alternatively be provided as a horizontal display surface comprising an interactive table, which is usable by a number of users in a classroom.
The touch-sensitive display system 401 provides for collaborative working by a number of users.
The provision of touch inputs at the touch-sensitive display surface 402 may be provided by a teacher or students in a classroom.
Figure 6 represents only one exemplary implementation of the invention.
The touch-sensitive display system 401 may also be provided as a small-scale device, such as a portable or hand-held device, with a touch-sensitive display surface.
The invention is not limited to any particular type of display technology. The touch-sensitive display system 401 of Figure 6 may, for example, comprise an emissive display surface. The touch-sensitive display system 401 of Figure 6 may comprise a surface onto which images are projected by a projector.
Further, and as noted hereinabove, the invention is not limited in its applicability to touch-sensitive display systems, and in general is applicable to any computer system associated with a display, whether an interactive display system or otherwise.
As will be clear to one skilled in the art, numerous embodiments of interactive display systems may be used to practice the present invention, e.g. to run the methods described herein as part of an interactive program stored on storage media such as computer memory storage or peripheral memory storage which is accessible by a computer, and the above description of an exemplary application in an interactive display system is not intended to limit the breadth of coverage provided by the invention. For example, the program software for practicing the present invention may be stored on a disk (such as a CD or other data storage media), may be downloaded into a computer's memory to be run, or may run from disk.
In general the invention may be implemented in software, the software comprising computer program code which, when executed on a computer, causes the computer to operate in accordance with methods as described herein. The computer program code may be stored on any type of computer readable medium.
In general the invention may be implemented in a computer system, by providing and/or adapting computer hardware to operate in accordance with functionality as described herein.
All examples and embodiments described herein may be combined in various combinations, and are not mutually exclusive.
The invention has been described herein by way of reference to particular examples and exemplary embodiments.
One skilled in the art will appreciate that the invention is not limited to the details of the specific examples and exemplary embodiments set forth. Numerous other embodiments may be envisaged without departing from the scope of the invention, which is defined by the appended claims.

Claims (1)

  1. <claim-text>1. A method for controlling a graphical user interface in which multiple objects are displayed, the method comprising: in dependence on positional manipulation of a displayed object resulting in at least part of the displayed object coinciding with another displayed object, controlling a display level of one object to meet a display preference for the objects.</claim-text> <claim-text>2. A method according to claim 1 wherein the display level of the displayed object being positionally manipulated is controlled.</claim-text> <claim-text>3. A method according to claim 2 further comprising: determining if the object being positionally manipulated has a display level higher than the other object; and if so adjusting the display level of the object being positionally manipulated to a lower display level than the other object.</claim-text> <claim-text>4. A method according to claim 3 wherein the determining step is enabled if the positional manipulation is a movement of the displayed object between positions.</claim-text> <claim-text>5. A method according to claim 4 wherein the determining step is enabled if the movement is responsive to a throw gesture of a drag operation.</claim-text> <claim-text>6. A method according to claim 2 further comprising: determining if the object being positionally manipulated has a display level lower than the other object; and if so adjusting the display level of the object being positionally manipulated to a higher display level than the other object.</claim-text> <claim-text>7. A method according to claim 6 wherein the determining step is enabled if the positional manipulation is a resizing or rotating of the displayed object.</claim-text> <claim-text>8. A method according to claim 6 wherein the determining step is enabled if the positional manipulation is a movement of the displayed object between positions under a drop operation.</claim-text> <claim-text>9. 
A method according to any preceding claim wherein the controlling step is dependent upon a criteria.</claim-text> <claim-text>10. A method according to claim 9 wherein the criteria is whether the object being moved is coincident with any part of the other object.</claim-text> <claim-text>11. A method according to claim 9 wherein the criteria is whether the contact point of an input controlling the movement of the object being moved coincides with any part of the other object.</claim-text> <claim-text>12. A method according to any one of claims 1 to 11 wherein the other displayed object is a static object.</claim-text> <claim-text>13. A method according to any one of claims 1 to 12 wherein the other displayed object is actively selected.</claim-text> <claim-text>14. A method according to any one of claims 1 to 13 wherein the interactive display surface is adapted to detect multiple inputs.</claim-text> <claim-text>15. A method according to claim 14 wherein the object being moved and the other object are associated with different inputs. 16. A method according to claim 15 wherein the object being moved and the other object are currently selected by an input.17. A method according to claim 16 wherein each input is associated with a user.18. A method according to any preceding claim further comprising adjusting the object being moved to the lowest display level on detection of movement of the object.19. A method according to any preceding claim further comprising the step, in dependence on the other object being a moving object, of maintaining the display of the objects in accordance with their respective default settings.20. A method according to any preceding claim further comprising disabling the controlling step on determination that an object being moved is no longer being moved.21. A computer program comprising computer program code adapted, when run on a computer, to perform the method of any one of claims 1 to 20.22. 
A computer program product for storing computer program code adapted, when run on a computer, to perform the method of any one of claims 1 to 20.23. A controller for a display in which multiple objects are displayed on a graphical user interface of the display, the controller being adapted, in dependence on an object being positionally manipulated to coincide with another displayed object, to control a display level of one object to meet a display preference for the objects.24. A controller according to claim 23 wherein the controller is adapted to control the display level of the object being positionally manipulated.25. A controller according to claim 24 further adapted to: determine if the object being positionally manipulated has a display level higher than the other object; and if so adapted to adjust the display level of the object being positionally manipulated to a lower display level than the other object.26. A controller according to claim 25 wherein the controller is adapted to enable the determination if the positional manipulation is a movement of the displayed object between points.27. A controller according to claim 26 wherein the controller is adapted to enable the determination if the movement is responsive to a throw gesture of a drag operation.28. A controller according to claim 24 further adapted to: determine if the object being positionally manipulated has a display level lower than the other object; and if so adapted to adjust the display level of the object being positionally manipulated to a higher display level than the other object.29. A controller according to claim 28 wherein the controller is further adapted to enable the determination if the positional manipulation is a resizing or rotating of the displayed object.30. A controller according to claim 28 wherein the controller is further adapted to enable the determination if the positional manipulation is a movement of the displayed object between positions under a drop operation.31. 
A controller according to any one of claims 22 to 30 further adapted such that the preventing step is dependent upon a criteria.32. A controller according to claim 31 further adapted such that the criteria is to determine when the object being moved is coincident with any part of the other object.33. A controller according to claim 20 further adapted such that the criteria is to determine when the contact point of an input controlling the movement of the object being moved coincides with any part of the other object.34. A controller according to any one of claims 23 to 33 further adapted such that the other displayed object is a static object.35. A controller according to any one of claims 23 to 34 further adapted such that the other displayed object is actively selected.36. A controller according to any one of claims 23 to 34 further adapted such that the interactive display surface is adapted to detect multiple inputs.37. A controller according to claim 36 further adapted such that the object being moved and the other object are associated with different inputs.38. A controller according to claim 37 further adapted such that the object being moved and the other object are currently selected by an input.39. A controller according to claim 38 further adapted such that each input is associated with a user.40. A controller according to any one of claims 23 to 39 further adapted to adjust the object being moved to the lowest display level on detection of movement of the object.41. A controller according to any one of claims 23 to 40 further adapted, in dependence on the other object being a moving object, to maintain the display of the objects in accordance with their respective settings.42. A controller according to any one of claims 23 to 41 further adapted to disable the preventing step on determination that an object being moved is no longer being moved.43. A computer system including a display and a controller according to any one of claims 23 to 42.44. 
An interactive display system including a controller according to any one of claims 23 to 42.45. A touch sensitive interactive display surface including a horizontally oriented interactive display surface and a controller according to any one of claims 23 to 42.46. An apparatus for controlling a graphical user interface in which multiple objects are displayed, the apparatus comprising: in dependence on positional manipulation of a displayed object resulting in at least part of the displayed object coinciding with another displayed object, means for controlling a display level of one object to meet a display preference for the objects.47. An apparatus according to claim 46 wherein the display level of the displayed object being positionally manipulated is controlled.48. An apparatus according to claim 47 further comprising: means for determining if the object being positionally manipulated has a display level higher than the other object; and if so means for adjusting the display level of the object being positionally manipulated to a lower display level than the other object.49. An apparatus according to claim 48 wherein the means for determining is enabled if the positional manipulation is a movement of the displayed object between positions.50. An apparatus according to claim 49 wherein the means for determining is enabled if the movement is responsive to a throw gesture of a drag operation.51. An apparatus according to claim 47 further comprising: means for determining if the object being positionally manipulated has a display level lower than the other object; and if so means for adjusting the display level of the object being positionally manipulated to a higher display level than the other object.52. An apparatus according to claim 51 wherein the means for determining is enabled if the positional manipulation is a resizing or rotating of the displayed object.53. 
An apparatus according to claim 51 wherein the means for determining is enabled if the positional manipulation is a movement of the displayed object between positions under a drop operation.54. An apparatus according to any one of claims 48 to 52 wherein the controlling step is dependent upon a criteria.</claim-text>
GB1116836.6A 2011-09-30 2011-09-30 Controlling the display levels of objects in a graphical user interface during movement actions Withdrawn GB2495696A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1116836.6A GB2495696A (en) 2011-09-30 2011-09-30 Controlling the display levels of objects in a graphical user interface during movement actions
PCT/EP2012/069377 WO2013045708A1 (en) 2011-09-30 2012-10-01 Transforming displayed objects on a gui

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1116836.6A GB2495696A (en) 2011-09-30 2011-09-30 Controlling the display levels of objects in a graphical user interface during movement actions

Publications (2)

Publication Number Publication Date
GB201116836D0 GB201116836D0 (en) 2011-11-09
GB2495696A true GB2495696A (en) 2013-04-24

Family

ID=44994216

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1116836.6A Withdrawn GB2495696A (en) 2011-09-30 2011-09-30 Controlling the display levels of objects in a graphical user interface during movement actions

Country Status (2)

Country Link
GB (1) GB2495696A (en)
WO (1) WO2013045708A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0514307A2 (en) * 1991-05-17 1992-11-19 International Business Machines Corporation Method and apparatus for selectively revealing obscured portions of a viewport during graphic user interface drag and drop operations
US20080165153A1 (en) * 2007-01-07 2008-07-10 Andrew Emilio Platzer Portable Multifunction Device, Method, and Graphical User Interface Supporting User Navigations of Graphical Objects on a Touch Screen Display
US20100125806A1 (en) * 2008-11-20 2010-05-20 Canon Kabushiki Kaisha Information processing apparatus and method of controlling the same

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5598524A (en) * 1993-03-03 1997-01-28 Apple Computer, Inc. Method and apparatus for improved manipulation of data between an application program and the files system on a computer-controlled display system
US6915490B1 (en) * 2000-09-29 2005-07-05 Apple Computer Inc. Method for dragging and dropping between multiple layered windows
US7055105B2 (en) * 2000-10-27 2006-05-30 Siemens Aktiengesellschaft Drop-enabled tabbed dialogs
JP2007058785A (en) * 2005-08-26 2007-03-08 Canon Inc Information processor, and operating method for drag object in the same


Also Published As

Publication number Publication date
WO2013045708A1 (en) 2013-04-04
GB201116836D0 (en) 2011-11-09

Similar Documents

Publication Publication Date Title
US9804761B2 (en) Gesture-based touch screen magnification
US8330733B2 (en) Bi-modal multiscreen interactivity
US10198163B2 (en) Electronic device and controlling method and program therefor
AU2013223015B2 (en) Method and apparatus for moving contents in terminal
US10061477B2 (en) Gesture controlled user interface
KR102137240B1 (en) Method for adjusting display area and an electronic device thereof
AU2014312481B2 (en) Display apparatus, portable device and screen display methods thereof
US8443302B2 (en) Systems and methods of touchless interaction
US9841890B2 (en) Information processing device and information processing method for improving operability in selecting graphical user interface by generating multiple virtual points of contact
EP2612220B1 (en) Method and apparatus for interfacing
US20120311472A1 (en) Apparatus and method for providing graphical user interface
US20130155108A1 (en) Augmented Reality User Interaction Methods, Computing Devices, And Articles Of Manufacture
EP2605119A2 (en) Touch input method and apparatus of portable terminal
EP2790096A2 (en) Object display method and apparatus of portable electronic device
US20130132889A1 (en) Information processing apparatus and information processing method to achieve efficient screen scrolling
JP2015508547A (en) Direction control using touch-sensitive devices
US10019148B2 (en) Method and apparatus for controlling virtual screen
KR20120100148A (en) Contents control method and device using touch, recording medium for the same and user terminal having it
CN104503697B (en) A kind of information processing method and electronic equipment
US20150100912A1 (en) Portable electronic device and method for controlling the same
GB2495696A (en) Controlling the display levels of objects in a graphical user interface during movement actions
JP2012164115A (en) Operation control device, operation control program and operation control method
EP2724223A1 (en) Exchanging content and tools between users
KR20130068700A (en) Method and apparatus for displaying a electronic book
US10831365B2 (en) Method for controlling a display device at the edge of an information element to be displayed

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20130919 AND 20130925

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)