WO2013000944A1 - Storing and applying optimum set-up data - Google Patents
Storing and applying optimum set-up data
- Publication number: WO2013000944A1
- Application number: PCT/EP2012/062429
- Authority: WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
Abstract
There is disclosed a method for an interactive display surface adapted to detect an input, the method comprising: allocating a setting to a displayed object associated with an input; and selectively applying the setting to the displayed object.
Description
STORING AND APPLYING OPTIMUM SET-UP DATA
BACKGROUND TO THE INVENTION:
Field of the Invention:
The present invention relates to interactive display surfaces arranged to detect multiple inputs, which surfaces in use may be presented in a horizontal or vertical arrangement. The invention is particularly but not exclusively concerned with such surfaces being provided with touch inputs from a plurality of different sources.
Description of the Related Art:
Interactive surfaces which are adapted to detect touch inputs are well-known in the art. Such an interactive surface may be arranged to have a display to display graphical information and/or images to a user. A user is able to touch the surface at a position at which an object is displayed in order to select the object, or move the touch contact across the surface in order to move the object. Similarly a touch contact may be used to draw or annotate on the display of the touch surface.
Various applications for such touch sensitive surfaces are well-known in the art, such as in handheld electronic devices such as mobile phones or personal data assistants (PDAs). On a larger scale, such touch surfaces are also known as part of interactive display systems, such as electronic whiteboards. More recently, touch sensitive display surfaces have been shown
as being used for interactive tables, where the display surface is disposed in a horizontal plane as a table surface.
It is also known in the art of touch sensitive display surfaces to include such surfaces in a collaborative input system, to allow for multiple users to interact with the touch sensitive display simultaneously. In practice multiple inputs can be received from a single user, as well as from a plurality of users. The interactive touch sensitive surface is adapted to be responsive to touch inputs in general, and thus is responsive to a plurality of touch inputs.
With an interactive display surface arranged as a table adapted for collaborative working, the ability to detect multiple inputs allows multiple users to be positioned around the interactive display surface and work simultaneously. Each user may work in their own defined area or with their own defined application, but also users may work collaboratively, swapping information between them.
Each user may have multiple applications, for example, active on the surface at any time.
It is an aim of the present invention to provide an improved user interface for a multiple input system comprising an interactive display surface for detecting multiple inputs, to facilitate working.
SUMMARY OF THE INVENTION:
In one aspect there is provided a method for an interactive display surface adapted to detect an input, the method comprising: allocating a setting to a displayed object associated with an input; and selectively applying the setting to the displayed object.
The input may be associated with a user, the setting comprising a user setting. The method may further comprise displaying a graphical user interface icon associated with said object, the step of selectively applying the setting to the displayed object being enabled on selection of said icon.
There may be provided a plurality of displayed objects associated with said input, the method further comprising determining the associated objects to which the step of selectively applying the setting applies. The step of selectively applying may apply to the object associated with the displayed icon.
The step of selectively applying may apply to all objects associated with the user with which the object with the displayed icon is associated.
The setting may comprise one or more of: the location of the object relative to an edge of the display; or the orientation of the object relative to an edge of the display.
The setting may comprise one or more of: the location of the object relative to a location of a user associated with the object; or the orientation of the object relative to a location of the user associated with the object.
The method may further comprise positioning a displayed object associated with a user, wherein the setting comprises one or more of: the location of the object relative to the position of the displayed object associated with the user; or the
orientation of the object relative to the position of the displayed object associated with the user. The displayed object associated with the user may be positioned by the user.
The setting may comprise one or more of: the size of the object; the orientation of the object relative to an application or content with which it is associated; the object opacity; or the object background colour.
The step of allocating the setting may comprise a configuration step. The configuration step may determine one or more default settings for the object.
The step of allocating the setting may comprise a modification step.
The modification step may adjust a setting for the object, wherein the step of selectively applying the setting to the object comprises applying the last known setting.
The setting may be the last known setting for modifying content of the type of the current content type.
The setting may be the last known setting for modifying the current content.
The input may be associated with a user, the displayed object being one of one or more displayed objects associated with the user, there further being displayed a graphical user interface icon associated with the user, wherein the step of selectively applying the setting to the displayed object is enabled responsive to selection of the graphical user interface icon associated with the user, the setting being applied to all displayed objects associated with the user.
There may be displayed a further graphical user interface icon associated with the graphical user interface icon associated with the user, the step of selectively applying being dependent on selection of the further icon.
In dependence on one or more settings being associated with a position of the user, the user's position may be determined by the position of the graphical user icon associated with the user.
The invention in a further aspect provides a computer system for an interactive display surface adapted to detect an input, and further adapted to: allocate a setting to a displayed object associated with an input; and selectively apply the setting to the displayed object.
The input may be associated with a user, the setting comprising a user setting.
The computer system may be further adapted to display a graphical user interface icon associated with said object, and enable the selective applying of the setting to the displayed object on selection of said icon.
The computer system may be further adapted to display a plurality of displayed objects associated with said input, and further adapted to determine the associated objects to which the step of selectively applying the setting applies.
The computer system may be further adapted to selectively apply to the object associated with the displayed icon.
The computer system may be further adapted to selectively apply to all objects associated with the user with which the object with the displayed icon is associated.
The setting may comprise one or more of: the location of the object relative to an edge of the display; or the orientation of the object relative to an edge of the display.
The setting may comprise one or more of: the location of the object relative to a location of a user associated with the object; or the orientation of the object relative to a location of the user associated with the object.
The computer system may further be adapted to position a displayed object associated with a user on the interactive display surface, wherein the setting comprises one or more of: the location of the object relative to the position of the displayed object associated with the user; or the orientation of the object relative to the position of the displayed object associated with the user. The displayed object associated with the user may be positioned by the user.
The setting may comprise one or more of: the size of the object; the orientation of the object relative to an application or content with which it is associated; the object opacity; or the object background colour.
Allocating the setting may comprise a configuration step. The configuration step may be adapted to determine one or more default settings for the object.
Allocating the setting may comprise a modification step.
The modification may be adapted to adjust a setting for the object, wherein the selective applying of the setting to the object comprises applying the last known setting.
The setting may be the last known setting for modifying content of the type of the current content type.
The setting may be the last known setting for modifying the current content.
The input may be associated with a user, the displayed object being one of one or more displayed objects associated with the user, there further being displayed a graphical user interface icon associated with the user, wherein selectively applying the setting to the displayed object is enabled responsive to selection of the graphical user interface icon associated with the user, the setting being applied to all displayed objects associated with the user.
There may be displayed a further graphical user interface icon associated with the graphical user interface icon associated with the user, selectively applying being dependent on selection of the further icon.
In dependence on one or more settings being associated with a position of the user, the user's position may be determined by the position of the graphical user icon associated with the user.
BRIEF DESCRIPTION OF THE DRAWINGS:
The invention will now be described by way of reference to the accompanying Figures in which:
Figure 1 illustrates an implementation scenario for embodiments of the invention;
Figure 2 illustrates a further implementation scenario for embodiments of the invention;
Figure 3 illustrates a further implementation scenario for embodiments of the invention;
Figures 4(a) to 4(c) illustrate a first embodiment of the invention;
Figure 5 illustrates a flow process associated with the embodiment of Figures 4(a) to 4(c);
Figures 6(a) to 6(c) illustrate a second embodiment of the invention;
Figure 7 illustrates a flow process associated with the embodiment of Figures 6(a) to 6(c);
Figures 8(a) to 8(c) illustrate a third embodiment of the invention;
Figures 9(a) to 9(d) illustrate an enhancement to embodiments of the invention; and
Figure 10 illustrates the functional elements of a computer system adapted to implement embodiments of the invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS:
The invention is now described by way of reference to various examples, embodiments, and advantageous applications. One skilled in the art will appreciate that the invention is not limited to the details of any described example, embodiment or
detail. In particular the invention may be described with reference to exemplary interactive display systems. One skilled in the art will appreciate that the principles of the invention are not limited to any such described systems.
The invention is described herein with reference to a touch sensitive interactive display surface for collaborative working. The invention is particularly described in the context of such a surface provided as a horizontal - or 'table-top' - surface, but is not limited to such a specific user arrangement.
The invention is not limited to any particular type of touch sensitive technology, nor to any particular type of display technology. In examples, the display of the touch sensitive surface may be provided by a projector projecting images onto the touch sensitive surface. In other examples the display may be provided by the touch sensitive surface being an emissive surface. Various other options exist as will be understood by one skilled in the art. In general the surface is described herein as a touch sensitive surface, which may have images projected thereon (e.g. by a projector) or which may also be an emissive display surface.
Figure 1 shows an arrangement for collaborative working at a horizontal interactive display surface, providing an exemplary implementation scenario for embodiments of the present invention .
Multiple users 6, 8, 10 are positioned around a horizontal interactive display surface 4 of a table-top type structure 2. Each user 6, 8, 10 has in front of them a working area 12, 14, 16. The working areas 12, 14, 16 may be defined by the display
of an application on the interactive display surface. In general the working area is an area in which a user is able to provide inputs at the interactive display surface, and may be a physical area within which multiple displayed items such as windows may be provided for interaction. For the purpose of the described embodiments it is assumed that the working areas 12, 14, 16 are windows associated with applications being used by the respective users.
Figure 2 shows a prior art arrangement. The user 10 has multiple work areas 16, 18, 20. This may be multiple windows associated with one or more applications, with the windows displayed on the interactive display surface 4 in front of the user 10. The user positions and orientates these individual windows 16, 18, 20 as they wish, such that the windows are displayed in various locations on the interactive display surface, overlapping or non-overlapping.
Figure 3 shows a prior art arrangement. As the user 10 carries out various tasks, the positions and sizes of the various windows 16, 18, 20 may be changed, as the user orders the windows for access.
Figures 4(a) to 4(c) show an adaptation in accordance with a first embodiment of the invention.
As shown in Figure 4(a), a displayed window 30 which the user 10 is using is adapted to include a displayed user interface icon 34 which may be represented as a button, either on or close to the window 30. The button may be displayed at an edge of the displayed window, for example in a border area 32 as shown in Figure 4(a).
In the scenario of Figure 4(a), it may be assumed that the location, orientation, and size of the window 30 are determined by some default arrangement. This may be the default settings for the window established by its associated application, or may be customized settings established by the user 10. These settings may be established on first opening or creating the window 30, or may be established, altered or saved at any point during operation.
As shown in Figure 4(b), the user 10 moves the window 30 or resizes it, so that it is moved, re-positioned or re-shaped relative to its original position.
The user then selects the button by touching the displayed user interface icon 34. As shown in Figure 4(c), the displayed window 30 then returns to its original position, orientation and size as in Figure 4(a).
There is thus provided a user interface icon which allows a user to apply default or previously determined settings to a displayed window, allowing its position, shape or orientation, or any other configurable setting associated with the window, especially a display setting, to be restored to an expected setting.
Figure 5 illustrates an exemplary flow process for a method associated with the illustrative embodiment of Figures 4(a) to 4(c).
In a step 100 a window is opened or established having a set of default display settings. These will determine, for example: the position of the window relative to a reference point on the display; the orientation of the window relative to
a reference point on the display; and the size of the window relative to a reference point.
In a step 102, at some subsequent time after opening or creating the window, it is determined whether a user selects an option to adjust the settings. If no such request is received, then in a step 108 the original default settings are set as custom settings.
If a request is received in step 102, in a step 104 the user adjusts the required settings, and then in a step 106 the user defined settings are stored. In step 108 these are then stored as the custom settings.
In a step 110, as long as the window is active, monitoring takes place to determine whether the user selects the option to apply the custom settings.
If a request is received, then in step 112 the custom settings are applied.
The process of Figure 5 is consistent with the example inventive arrangement of Figures 4(a) to 4(c).
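The flow of Figure 5 can be sketched in code. The following is an illustrative model only, not taken from the patent: the `Window` and `WindowSettings` names, and the split between a transient `move` and a custom-setting-storing `adjust`, are assumptions made for clarity.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class WindowSettings:
    # Position, orientation and size relative to a display reference point.
    x: float
    y: float
    angle: float  # orientation in degrees
    width: float
    height: float

class Window:
    def __init__(self, defaults: WindowSettings):
        # Step 100: window opened with a set of default display settings.
        self.current = defaults
        # Step 108: the defaults stand as the custom settings until adjusted.
        self.custom = defaults

    def adjust(self, **changes) -> None:
        # Steps 102-108: the user adjusts settings; store them as custom.
        self.current = replace(self.current, **changes)
        self.custom = self.current

    def move(self, x: float, y: float) -> None:
        # A transient reposition (Figure 4(b)) that does not alter the
        # stored custom settings.
        self.current = replace(self.current, x=x, y=y)

    def apply_custom(self) -> None:
        # Steps 110-112: on selection of the button, re-apply the
        # custom settings (Figure 4(c)).
        self.current = self.custom
```

Under this model, a window dragged away from its custom position snaps back to it when `apply_custom` is invoked, matching the button behaviour of Figures 4(b) and 4(c).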
Figures 6(a) to 6(c) show an adaptation of the arrangement of Figures 4(a) to 4(c) in accordance with a second embodiment of the invention.
As shown in Figure 6(a), the user is associated with, and using, active window 30 of Figures 4(a) to 4(c), and additionally a further window 36. The displayed window 36 is adapted similar to window 30 to include a displayed user interface icon 40 which may be represented as a button, either on or close to each of the windows. The button 40 may be
displayed at an edge of the displayed window 36, for example in a border area 38.
As shown in Figure 6(b), the user moves the windows 30 and 36 or resizes them, so that they are moved, re-positioned or re-shaped relative to their original position.
The user then selects the button 34 or 40 displayed on either one of the windows 30 or 36 by touching the displayed user interface icon.
As shown in Figure 6(c), the displayed windows 30 and 36 both then return to their original position, orientation and size.
In an arrangement such as Figures 6(a) to 6(c), as shown, each window may have a button associated therewith. The selection of the button in one window may result in all windows associated with the user with which that window is associated reverting to their original positions, orientation and size. This may require there to be a known association between each window and a user, or alternatively there could be provided a defined association between multiple windows, which may for example be defined by a user.
Figure 7 illustrates an exemplary flow process for a method associated with the illustrative embodiment of Figures 6(a) to 6(c).
In a step 200 a window is opened or established having a set of default display settings. These will determine, for example: the position of the window relative to a reference point on the display; the orientation of the window relative to
a reference point on the display; and the size of the window relative to a reference point.
In a step 202, at some subsequent time after opening or creating the window, it is determined whether a user selects an option to adjust the settings. If no such request is received, then in a step 208 the original default settings are set as custom settings.
If a request is received in step 202, in a step 204 the user adjusts the required settings, and then in a step 206 the user defined settings are stored. In step 208 these are then stored as the custom settings.
In a step 214 it is determined whether a user wishes to associate this window with any other window. Whilst in embodiments such an association may be made automatically based on an association between all windows associated with a given user, in this described arrangement a user is able to selectively make such associations.
If an association is to be made, then in a step 216 an identification is made of any other window, and in step 218 the association between those windows is stored.
After step 218, or if no association is required in step 214, then in a step 210, as long as the window is active, monitoring takes place to determine whether the user selects the option to apply the custom settings.
If a request is received, then in step 211 it is determined if the request is associated with a window that has an association with one or more other windows. If not, then in step 212 the custom settings are applied.
If an association is determined in step 211, then in step 220 the associations are retrieved, and in a step 222 the custom settings are applied to all associated windows. This will apply the custom settings for each individual window to that individual window.
The process of Figure 7 is consistent with the example inventive arrangement of Figures 6(a) to 6(c) .
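The association and group-apply logic of Figure 7 (steps 214 to 222) might be modelled as follows. This is a hypothetical sketch: the `WindowGroup` class, the string window identifiers, and the dictionary-based settings stores are all illustrative assumptions rather than anything specified by the patent.

```python
class WindowGroup:
    """User-selected associations between windows (steps 214-218)."""

    def __init__(self) -> None:
        self._links: dict[str, set[str]] = {}

    def associate(self, a: str, b: str) -> None:
        # Step 218: store the association symmetrically, so selecting the
        # button on either window reaches the other.
        self._links.setdefault(a, set()).add(b)
        self._links.setdefault(b, set()).add(a)

    def members(self, window_id: str) -> set[str]:
        # Step 220: the selected window plus everything associated with it.
        return self._links.get(window_id, set()) | {window_id}


def apply_custom_settings(group: WindowGroup, current: dict, custom: dict,
                          selected: str) -> None:
    # Steps 211 and 222: each associated window has its *own* custom
    # settings re-applied, not those of the selected window.
    for wid in group.members(selected):
        current[wid] = custom[wid]
```

A window with no stored association simply reverts alone, which corresponds to the step 212 branch of Figure 7.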
In an alternative arrangement where more than one window is associated with a user, only one of the windows may be associated with the display of a button. Thus a plurality of windows may be associated with a user through some mechanism, such as the user selecting the windows as 'belonging' to them or some mechanism for otherwise associating the user with a window, and then responsive to the selection of a button on one displayed window, all windows may have custom settings applied. This could be particularly advantageous, for example, where multiple windows are sub-windows of a main application window. In such an arrangement a process similar to that of Figure 7 applies, but only one window has an actual display of a button.
In a further alternative arrangement selection of a button on a window may result in only that window being repositioned, re-orientated or resized.
Figures 8(a) to 8(c) show an adaptation in accordance with a third embodiment of the invention.
As shown in Figure 8(a), the displayed window 30 which the user 10 is using is not adapted as in Figure 4(a), but an appropriate user interface icon 31, such as a button, is displayed proximate the window 30. Preferably, as shown in Figure 8(a), the user interface icon 31 is located proximate an edge of the interactive display surface 4, near to the user 10.
As shown in Figure 8(b), the user moves the window 30 or resizes it, so that it is moved, re-positioned or re-shaped relative to its original position. The displayed button 31, however, is not moved or changed responsive to movement of the window 30.
The user 10 then selects the button by touching the displayed user interface icon 31. As shown in Figure 8(c), the displayed window 30 then returns to its original position, orientation and size.
In this arrangement, the displayed button 31 may be a so-called token associated with a user. On selection of the displayed button, any displayed window associated with that user is reset to its default configurations for position, size and orientation.
A particular window may be associated with the displayed user icon 31 by an association process controlled by the user 10. Alternatively an association may be created by some other mechanism. For example an association may be formed by touching the displayed user interface icon 31 and the window (or a part of the window) 30 at the same time, or in a particular sequence. A menu may be associated with the displayed icon 31 to facilitate association.
The process for the arrangement of Figures 8(a) to 8(c) may be the same as the process of Figure 5. The difference between the arrangements of Figures 4(a) to 4(c) and Figures 8(a) to 8(c) is the location of the user interface icon which is used to select the function to apply the custom settings.
The custom configurations for position, size and orientation of a window may be set by a user, or may be configured automatically by an application on initialization.
In a preferred embodiment, a displayed user interface icon associated with a user is the reference or anchor point for the user's custom configurations.
With further reference to Figures 8(a) to 8(c), in an embodiment the user interface icon 31 is associated with the user 10 and its position on the interactive display surface 4 is controlled by the user 10. The user 10 positions the icon 31 to represent the user's position at the interactive display surface. The user 10 can move about the edge of a horizontal interactive display surface, for example, and move the displayed icon 31 by dragging it on the surface with a touch contact. Once the touch contact is released, the icon 31 is assumed to be in a position where the user 10 is positioned. The technique by which the icon is associated with the user 10 is outside the scope of the present invention. It is assumed, for the purpose of this application, that the icon has been given some association with the user 10. Alternatively the icon may be associated with an anonymous user, and it is assumed that any user that positions the icon 31 on the interactive display surface 4 currently 'owns' the icon 31.
In such an embodiment, various windows such as window 30 of Figures 8(a) to 8(c) are associated with the icon 31. A further enhancement to embodiments of the invention is shown in Figures 9(a) to 9(d).
In Figure 9(a), the icon 31 is associated with the window 30 and the window 36. As before each window has a respective
icon 34, 40 which when selected allows the user's custom configurations to be applied. As shown in Figure 9(a), the user 10 is associated with the icon 31.
As shown in Figure 9(a) the windows 30 and 36 have their custom configurations applied. After use by the user 10, as shown in Figure 9(b), the windows 30 and 36 are moved to alternative positions, sizing and orientation.
In a particular embodiment, the user 10 then moves to a new position relative to the edge of the interactive display surface, as shown in Figure 9(c), and drags the icon 31 with them to this position adjacent the edge of the interactive display surface 4. As denoted in Figure 9(c), the positioning of the windows 30 and 36 relative to the icon 31 is retained when the icon 31 is moved: the user only has to move the icon 31 and the windows associated therewith move, retaining their respective positioning and orientation relative to the icon 31.
At a subsequent time, the user 10 selects either the button 34 or the button 40, and the custom configuration settings are re-applied. As shown in Figure 9(d), the custom settings are re-applied relative to a reference point determined by the position of the user interface icon 31. The windows 30 and 36 are not returned to the positions on the display they occupied in Figure 9(a): they are returned to the positions they held in Figure 9(a) relative to the position of the icon 31 in Figure 9(a).
Thus, in a preferred embodiment, a user interface icon indicative of the position of a user, and preferably positioned by the user, is the reference point by which custom configurations are set. The location and orientation of the user interface icon 31 may be an origin point for a coordinate system
in which the positions and orientations of the associated windows are defined.
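The relative-positioning scheme described above can be illustrated with a short sketch. This is purely illustrative and not part of the original disclosure: the function name and parameters are hypothetical, but the idea follows the description, with each window storing its pose in the icon's coordinate frame, so that display coordinates are re-derived by a rotation and a translation whenever the icon is re-placed.

```python
import math

def to_display_coords(icon_x, icon_y, icon_angle_deg,
                      rel_x, rel_y, rel_angle_deg):
    """Map a window pose stored in the user icon's coordinate frame
    into absolute display coordinates: rotate the stored offset by
    the icon's orientation, then translate by the icon's position."""
    a = math.radians(icon_angle_deg)
    dx = rel_x * math.cos(a) - rel_y * math.sin(a)
    dy = rel_x * math.sin(a) + rel_y * math.cos(a)
    return icon_x + dx, icon_y + dy, icon_angle_deg + rel_angle_deg

# Icon at (100, 50), unrotated; a window stored 30 px to its right
# keeps that offset wherever the icon is subsequently placed.
x, y, angle = to_display_coords(100, 50, 0, 30, 0, 0)
```

Because only the icon pose changes between Figures 9(a) and 9(d), re-applying the stored relative poses through such a transform reproduces the original arrangement at the user's new position.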
In a further embodiment, the position of the user 10 relative to an edge of the interactive display surface 4 may be determined by some other means, and the custom configurations applied according to a determined position of the user 10.
The feature shown in Figure 9(c), where the windows move together with the user interface icon 31, is optional and may or may not be implemented in combination with the other features of Figures 9(a) to 9(d). In an alternative embodiment, the windows do not move when the icon 31 is moved, but the feature of positioning the windows relative to the icon 31 responsive to selection of the button to apply custom configurations may still be implemented.
Thus, for example, the user may move the icon 31 as shown in Figure 9(c) to a new location, but the windows remain in the same place as shown in Figure 9(b). The user then selects the button to apply custom configurations, and the windows are restored to positions relative to the icon 31 as shown in Figure 9(d).
With reference to Figure 10, there are illustrated the functional components of a computing system associated with an interactive display system, required to support the functionality associated with the embodiments of the invention. Only those components that are relevant to understanding the invention and its embodiments are illustrated. One skilled in the art will appreciate that additional functional elements will be required to fully implement an appropriate computing system. The described functional components may be implemented in a variety
of ways, for example on a computer system associated with the interactive display surface, such as a personal computer, laptop computer, or other portable or handheld computing device, or integrated with the interactive display surface itself.
The computing system includes a processor 302 including a default settings configuration module 320 and a default settings detection module 318; a memory 304 including a displayed object module 320 and a displayed object settings module 322; an interactive display surface driver 306; an interactive display surface detection circuit 308; a display driver 310; a displayed object generator 312; a coordinate detection circuit 314; and a mapping circuit 316.
The interactive display surface driver 306 receives data representing detected inputs (such as touch inputs) at the interactive display surface, and forwards such inputs to the interactive display surface detection circuit 308.
The interactive display surface input detection circuit 308 provides the received input data to the coordinate detection circuit 314 to determine coordinate information for each input, such as each touch input.
The coordinate detection circuit 314 provides coordinate information associated with each detected input to the interactive display surface detection circuit 308, and the mapping circuit 316.
The interactive display surface input detection circuit provides the detected input data together with coordinate information to the processor 302.
The memory 304 includes the displayed objects memory module 320 for storing data associated with objects displayed on the interactive display surface. This may include information
defining the location of a displayed object, the identity of a displayed object, and other information associated with a displayed object such as its display characteristics, including its shape, colour, size, etc.
The memory 304 also includes the displayed objects settings module 322 adapted to store the custom settings, and the default settings, of any displayed object.
The displayed objects module 320 is connected to the displayed objects generator 312 to provide information on objects to be displayed. The displayed objects generator 312 is connected to the display driver 310, which generates display drive signals to the interactive display surface for displaying active objects.
The displayed objects settings module 322 stores the custom settings for any object as required.
The mapping circuit 316 receives signals from the coordinate detection circuit 314 and the displayed objects module 320 to provide information to the processor 302 identifying when a received input coincides with a displayed object.
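The role described for the mapping circuit, deciding whether a detected input coincides with a displayed object, amounts to a hit test of the input coordinates against the stored object geometry. A minimal sketch follows; it is an illustration only, and the data-structure fields and function name are hypothetical rather than taken from the disclosure.

```python
def hit_test(touch_x, touch_y, objects):
    """Return the first displayed object whose bounding box contains
    the touch point, or None if the input hits empty surface."""
    for obj in objects:
        x, y, w, h = obj["x"], obj["y"], obj["w"], obj["h"]
        if x <= touch_x <= x + w and y <= touch_y <= y + h:
            return obj
    return None

# Two windows, loosely modelled on windows 30 and 36 of Figure 9.
windows = [{"id": 30, "x": 10, "y": 10, "w": 200, "h": 150},
           {"id": 36, "x": 250, "y": 40, "w": 120, "h": 90}]
hit = hit_test(50, 60, windows)
```

A production implementation would also account for window orientation and stacking order, but the coincidence decision reported to the processor 302 reduces to a containment test of this kind.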
The displayed object custom settings are set under the control of the default settings configuration module 320 of the processor 302. Responsive to the processor receiving an input signal from the interactive display surface detection circuit 308 that a received input is associated with configuration of custom settings for a displayed object, the input is directed by the processor to the default settings configuration module 320, and the displayed object settings module 322 is updated under control of the module 320.
The settings are accessed under the control of the default settings detection module 318 of the processor 302. Responsive
to the processor 302 receiving an input signal from the interactive display surface detection circuit 308 that a received input is associated with a request for custom settings for a displayed object, the input is directed by the processor to the default settings detection module 318, and the displayed object settings module 322 is enabled under control of the module 318 to provide the custom settings to the displayed objects module 320.
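The configure-and-apply flow across the settings modules can be sketched as a simple per-object settings store. This is a hypothetical illustration, and the class and method names are the editor's own; it only loosely mirrors the displayed object settings module 322 of Figure 10, in which custom settings are recorded at configuration time and later retrieved, overriding any defaults.

```python
class DisplayedObjectSettings:
    """Per-object store of default and custom settings, loosely
    modelling the displayed object settings module of Figure 10."""

    def __init__(self):
        self._defaults = {}
        self._custom = {}

    def configure(self, obj_id, **settings):
        # Configuration path: record the user's custom set-up.
        self._custom.setdefault(obj_id, {}).update(settings)

    def apply(self, obj_id):
        # Retrieval path: custom settings override the defaults.
        merged = dict(self._defaults.get(obj_id, {}))
        merged.update(self._custom.get(obj_id, {}))
        return merged

store = DisplayedObjectSettings()
store.configure(30, x=10, y=20, orientation=90)
settings = store.apply(30)
```

On this sketch, selecting the icon 34 corresponds to calling `apply`, whose result is handed to the displayed objects module to re-position and re-orient the window.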
One skilled in the art will appreciate that the arrangement of Figure 10 is exemplary, and a computer system may be modified, controlled or adapted in various ways to support the functionality of the present invention.
All examples and embodiments described herein may be combined in various combinations, and are not mutually exclusive.
The invention has been described herein by way of reference to particular examples and exemplary embodiments. One skilled in the art will appreciate that the invention is not limited to the details of the specific examples and exemplary embodiments set forth. Numerous other embodiments may be envisaged without departing from the scope of the invention, which is defined by the appended claims.
Claims
1. A method for an interactive display surface adapted to detect an input, the method comprising: allocating a setting to a displayed object associated with an input; and selectively applying the setting to the displayed object.
2. A method according to claim 1 wherein the input is associated with a user, the setting comprising a user setting.
3. A method according to claim 1 or claim 2 further comprising displaying a graphical user interface icon associated with said object, the step of selectively applying the setting to the displayed object being enabled on selection of said icon.
4. A method according to any one of claims 1 to 3 wherein there is provided a plurality of displayed objects associated with said input, the method further comprising determining the associated objects to which the step of selectively applying the setting applies.
5. A method according to claim 4 when dependent upon claim 3 wherein the step of selectively applying applies to the object associated with the displayed icon.
6. A method according to claim 4 when dependent upon claim 3 wherein the step of selectively applying applies to all objects associated with the user with which the object with the displayed icon is associated.
7. A method according to any preceding claim wherein the setting comprises one or more of: the location of the object relative to an edge of the display; or the orientation of the object relative to an edge of the display.
8. A method according to any preceding claim wherein the setting comprises one or more of: the location of the object relative to a location of a user associated with the object; or the orientation of the object relative to a location of the user associated with the object.
9. A method according to any preceding claim further comprising positioning a displayed object associated with a user, wherein the setting comprises one or more of: the location of the object relative to the position of the displayed object associated with the user; or the orientation of the object relative to the position of the displayed object associated with the user.
10. A method according to claim 9 wherein the displayed object associated with the user is positioned by the user.
11. A method according to any preceding claim wherein the setting comprises one or more of: the size of the object; the orientation of the object relative to an application or content with which it is associated; the object opacity; or the object background colour.
12. A method according to any preceding claim wherein the step of allocating the setting comprises a configuration step.
13. A method according to claim 12 wherein the configuration step determines one or more default settings for the object.
14. A method according to any one of claims 1 to 11 wherein the step of allocating the setting comprises a modification step.
15. A method according to claim 14 wherein the modification step adjusts a setting for the object, wherein the step of
selectively applying the setting to the object comprises applying the last known setting.
16. A method according to claim 15 wherein the setting is the last known setting for modifying content of the type of the current content type.
17. A method according to claim 15 wherein the setting is the last known setting for modifying the current content.
18. A method according to any preceding claim in which the input is associated with a user, the displayed object being one of one or more displayed objects associated with the user, there further being displayed a graphical user interface icon associated with the user, wherein the step of selectively applying the setting to the displayed object is enabled responsive to selection of the graphical user interface icon associated with the user, the setting being applied to all displayed objects associated with the user.
19. A method according to claim 18 wherein there is displayed a further graphical user interface icon associated with the graphical user interface icon associated with the user, the step of selectively applying being dependent on selection of the further icon.
20. A method according to claim 18 or 19, wherein, in dependence on one or more settings being associated with a position of the user, the user's position is determined by the position of the graphical user interface icon associated with the user.
21. A computer system for an interactive display surface adapted to detect an input, and further adapted to: allocate a
setting to a displayed object associated with an input; and selectively apply the setting to the displayed object.
22. A computer system for an interactive display surface according to claim 21 wherein the input is associated with a user, the setting comprising a user setting.
23. A computer system for an interactive display surface according to claim 21 or claim 22 further adapted to display a graphical user interface icon associated with said object, and enable the selective applying of the setting to the displayed object on selection of said icon.
24. A computer system for an interactive display surface according to any one of claims 21 to 23 adapted to display a plurality of displayed objects associated with said input, and further adapted to determine the associated objects to which the step of selectively applying the setting applies.
25. A computer system for an interactive display surface according to claim 24 when dependent upon claim 23 adapted to selectively apply to the object associated with the displayed icon.
26. A computer system for an interactive display surface according to claim 24 when dependent upon claim 23 adapted to selectively apply to all objects associated with the user with which the object with the displayed icon is associated.
27. A computer system for an interactive display surface according to any one of claims 21 to 26 wherein the setting comprises one or more of: the location of the object relative to an edge of the display; or the orientation of the object relative to an edge of the display.
28. A computer system for an interactive display surface according to any one of claims 21 to 27 wherein the setting comprises one or more of: the location of the object relative to a location of a user associated with the object; or the orientation of the object relative to a location of the user associated with the object.
29. A computer system for an interactive display surface according to any one of claims 21 to 28 further adapted to position a displayed object associated with a user on the interactive display surface, wherein the setting comprises one or more of: the location of the object relative to the position of the displayed object associated with the user; or the orientation of the object relative to the position of the displayed object associated with the user.
30. A computer system for an interactive display surface according to claim 29 wherein the displayed object associated with the user is positioned by the user.
31. A computer system for an interactive display surface according to any one of claims 21 to 30 wherein the setting comprises one or more of: the size of the object; the orientation of the object relative to an application or content with which it is associated; the object opacity; or the object background colour.
32. A computer system for an interactive display surface according to any one of claims 21 to 31 wherein allocating the setting comprises a configuration step.
33. A computer system for an interactive display surface according to claim 32 wherein the configuration step is
adapted to determine one or more default settings for the object.
34. A computer system for an interactive display surface according to any one of claims 21 to 33 wherein allocating the setting comprises a modification step.
35. A computer system for an interactive display surface according to claim 34 wherein the modification is adapted to adjust a setting for the object, wherein selectively applying of the setting to the object is adapted to comprise applying the last known setting.
36. A computer system for an interactive display surface according to claim 35 wherein the setting is the last known setting for modifying content of the type of the current content type.
37. A computer system for an interactive display surface according to claim 35 wherein the setting is the last known setting for modifying the current content.
38. A computer system for an interactive display surface according to any one of claims 21 to 37 in which the input is associated with a user, the displayed object being one of one or more displayed objects associated with the user, there further being displayed a graphical user interface icon associated with the user, wherein selectively applying the setting to the displayed object is enabled responsive to selection of the graphical user interface icon associated with the user, the setting being applied to all displayed objects associated with the user.
39. A computer system for an interactive display surface according to claim 38 wherein there is displayed a further graphical user interface icon associated with the graphical user interface icon associated with the user, selectively applying being dependent on selection of the further icon.
40. A computer system for an interactive display surface according to claim 38 or 39, wherein, in dependence on one or more settings being associated with a position of the user, the user's position is determined by the position of the graphical user interface icon associated with the user.
41. A computer program for performing the method of any one of claims 1 to 20.
42. A computer program product for storing computer program code which, when run on a computer, performs the method of any one of claims 1 to 20.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12733032.2A EP2724224A1 (en) | 2011-06-27 | 2012-06-27 | Storing and applying optimum set-up data |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB201110842A GB2492945A (en) | 2011-06-27 | 2011-06-27 | Applying settings on an interactive display surface |
GB1110842.0 | 2011-06-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013000944A1 true WO2013000944A1 (en) | 2013-01-03 |
Family
ID=44485190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2012/062429 WO2013000944A1 (en) | 2011-06-27 | 2012-06-27 | Storing and applying optimum set-up data |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2724224A1 (en) |
GB (1) | GB2492945A (en) |
WO (1) | WO2013000944A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070064004A1 (en) * | 2005-09-21 | 2007-03-22 | Hewlett-Packard Development Company, L.P. | Moving a graphic element |
US20090199128A1 (en) * | 2008-02-01 | 2009-08-06 | Microsoft Corporation | Arranging display areas utilizing enhanced window states |
US20090225040A1 (en) * | 2008-03-04 | 2009-09-10 | Microsoft Corporation | Central resource for variable orientation user interface |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003131779A (en) * | 2001-10-26 | 2003-05-09 | Hitachi Ltd | Multiwindow managing method |
US20100205557A1 (en) * | 2009-02-09 | 2010-08-12 | Harold Lee Peterson | System, method and computer-readable medium for clean up of visually displayed icons |
US8487888B2 (en) * | 2009-12-04 | 2013-07-16 | Microsoft Corporation | Multi-modal interaction on multi-touch display |
2011
- 2011-06-27 GB GB201110842A patent/GB2492945A/en not_active Withdrawn
2012
- 2012-06-27 WO PCT/EP2012/062429 patent/WO2013000944A1/en active Application Filing
- 2012-06-27 EP EP12733032.2A patent/EP2724224A1/en not_active Withdrawn
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3865990A1 (en) * | 2020-02-17 | 2021-08-18 | Fujitsu Limited | Information processing apparatus, information processing program, and information processing system |
US11520453B2 (en) | 2020-02-17 | 2022-12-06 | Fujitsu Limited | Information processing apparatus, program, and system for a display capable of determining continuous operation and range determination of multiple operators operating multiple objects |
Also Published As
Publication number | Publication date |
---|---|
GB201110842D0 (en) | 2011-08-10 |
GB2492945A (en) | 2013-01-23 |
EP2724224A1 (en) | 2014-04-30 |
Legal Events
Date | Code | Title | Description
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12733032; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | WWE | Wipo information: entry into national phase | Ref document number: 2012733032; Country of ref document: EP |