GB2451274A - A Touchscreen and One or More Conventional Displays - Google Patents

A Touchscreen and One or More Conventional Displays

Info

Publication number
GB2451274A
Authority
GB
United Kingdom
Prior art keywords
display
display device
touchscreen
processing device
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0714543A
Other versions
GB2451274B (en)
GB0714543D0 (en)
Inventor
Peter Burgers
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DisplayLink UK Ltd
Original Assignee
DisplayLink UK Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DisplayLink UK Ltd filed Critical DisplayLink UK Ltd
Priority to GB0714543.6A priority Critical patent/GB2451274B/en
Publication of GB0714543D0 publication Critical patent/GB0714543D0/en
Priority to PCT/GB2008/002536 priority patent/WO2009013499A2/en
Publication of GB2451274A publication Critical patent/GB2451274A/en
Application granted granted Critical
Publication of GB2451274B publication Critical patent/GB2451274B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1431Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08Cursor circuits
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0464Positioning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/04Display device controller operating with a plurality of display units
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/06Consumer Electronics Control, i.e. control of another device by a display or vice versa
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Position Input By Displaying (AREA)

Abstract

Two or more display devices (12, 14), at least one of which is a touchscreen display device (12), are controlled by a processing device (16), with a connection network (18) connecting the display devices to the processing device. The processing device controls the touchscreen display device to display an object, whereby when the touchscreen receives a defined user input for the displayed object, the processing device is arranged to control the touchscreen display device to no longer display the object, and to control a different display device (14) to display the object.

Description

DESCRIPTION
A SYSTEM COMPRISING A TOUCHSCREEN
AND ONE OR MORE CONVENTIONAL DISPLAYS
This invention relates to a system and to a method of operating a system. The system can provide a touchscreen with multiple displays.
In a conventional computing environment, a user is provided with a display device, a processing device, and one or more input devices such as a keyboard and mouse, which are all interconnected. However, it is becoming increasingly common for users to be provided with multiple display devices, for example two or three displays side-by-side. This increases the user's ease of working and allows the virtual desktop of the computer to be enlarged. As the cost of conventional display devices decreases and the processing power of PCs increases, the efficiency achieved by a user in such situations means that the use of a plurality of display devices in computing set-ups is increasingly available and desirable.
For example, International Patent Application Publication WO 2007/020408 discloses a display system that comprises a plurality of display devices, each respectively displaying an image, a data processing device connected to each display device and controlling the image displayed by each display device, and a user interface device connected to the data processing device. The system is arranged, following a defined user input at the user interface, to move at least a portion of the image displayed by at least two of the display devices to a different display device. When carrying out the moving of the at least a portion of the image to a different display device, the system is arranged to scroll the portion of the image between the two display devices.
It is also known to provide touchscreens, which are display devices that also include a user interface based upon position and/or pressure.
Touchscreens are used in many applications, such as ticket machines in railway stations and airports, and in user interfaces in public spaces such as museums. A technology researcher, Jeff Han, has done work in the field of multi-touch sensing, as can be seen, for example, in the video available from YouTube (www.youtube.com/watch?v=zwGAKUForhM). Products such as Apple's iPhone have a multi-touch sensitive screen, and Microsoft has demonstrated a multi-touch platform called "Surface"; see www.microsoft.com/surface. Jeff Han's work shows a virtual desktop being used to move windows (photographs in the video) around by touching and dragging the screen. Multi-touch is important in a single-touchscreen environment to provide a more intuitive interface, for example, moving two fingers apart to zoom in, or moving one finger around another to rotate. However, there is no known solution to the problem of integrating touchscreens with conventional display devices in such a manner that the user can access the greatest functionality from the connected display devices.
It is therefore an object of the invention to improve upon the known art. According to a first aspect of the invention, there is provided a system comprising two or more display devices, at least one display device comprising a touchscreen display device, a processing device arranged to control the output of the display devices, and a connection network connecting the display devices to the processing device, wherein the processing device is arranged to control the touchscreen display device to display an object, the touchscreen display device is arranged to receive a defined user input for the displayed object, the processing device is arranged to control the touchscreen display device to no longer display the object, and the processing device is arranged to control a different display device to display the object.
According to a second aspect of the invention, there is provided a method of operating a system, said system comprising two or more display devices, at least one display device comprising a touchscreen display device, a processing device arranged to control the output of the display devices, and a connection network connecting the display devices to the processing device, the method comprising the steps of controlling the touchscreen display device to display an object, receiving a defined user input for the displayed object, controlling the touchscreen display device to no longer display the object, and controlling a different display device to display the object.
Owing to the invention, it is possible to provide a multi-touch touchscreen technology that provides an intuitive user interface for a variety of applications. Adding additional standard displays to a touchscreen-enabled system provides a much larger work surface and a more efficient work environment. Documents, icons and objects can be "thrown" and "grabbed" between the touchscreen console and the additional displays in an intuitive and efficient manner. In a preferred embodiment, the display devices comprise a single touchscreen display device and one or more non-touchscreen display devices. Using a multiple-display working environment allows more information to be displayed simultaneously, and reduces "context switching" between different sets of information. Using more than one touchscreen display is an expensive and unwieldy solution to the above problem, as they all must be located within easy reach of the operator's hands. An alternative is to add additional standard displays, which are less expensive and can be positioned more conveniently.
Multi-touch touchscreen technologies (touchscreen consoles that can register multiple touch points simultaneously) allow unique solutions for working with multi-display systems. The novelty of this idea is the combination of a multi-touch touchscreen console, additional standard displays, and an intuitive method of controlling the flow of information between the touchscreen console and the other displays.
The invention provides a much larger virtual desktop area, keeping cost down by only using one touchscreen and allowing objects such as windows to be moved on or off the main (touch) screen onto external/peripheral normal screens. The user does not need to move from the main screen, and can look over to the external screens while always working on the main screen, giving rise to productivity improvements compared with the use of a single screen.
The invention is particularly useful for technical fields such as desktop publishing, software development, and video editing. The system provides an intuitive interface without incurring the cost of a very large touchscreen or multiple smaller touchscreens. This also allows the user to use a larger virtual workspace than the area presently in reach. Multi-touch screens are not necessary to implement this, thereby allowing the use of a cheaper single-touch screen, but a multi-touch screen can be used to provide more intuitive control methods than known in the art.
The system can also be used in other applications away from the standard desktop computing environment. For example, the system can be used in the retail environment for advertising purposes, for example to control a large area (such as a whole wall) from a touchscreen at hand height. The system also has application in mobile wireless solutions; for example, it can also be used with a mobile touch-sensitive device to move objects such as windows onto screens located anywhere else: in the same room, in different rooms, in different buildings, countries, etc. This could be useful for doctors carrying portable wireless touchscreen displays to demonstrate to patients or for teaching purposes.
Preferably, the processing device is arranged to control the touchscreen display device to display one or more hotspots, the or each hotspot linked to a different display device. The system embodies the idea of arranging a touchscreen in the user's most easily accessible working space, and arranging normal screens around the touchscreen as desired. In one embodiment, the software run by the processing device is configured to have hotspots, preferably at the edges of the touchscreen, in locations near the normal screens coupled to each hotspot. The defined user input comprises moving the object to a hotspot, and the processing device is arranged to control the respective different display device to display the object. Thereby, a defined user input such as dragging an object (for example a window) into a hotspot triggers the control software to move the window from the touchscreen to the normal screen associated with the hotspot.
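The hotspot dispatch described above can be sketched in outline. The patent contains no code, so the following Python fragment is purely illustrative: the `Hotspot` rectangle, its fields, and the `on_drag_release` function are all assumptions about one possible realisation.

```python
from dataclasses import dataclass

@dataclass
class Hotspot:
    # Rectangle on the touchscreen, in touchscreen pixel coordinates.
    x: int
    y: int
    w: int
    h: int
    # Identifier of the normal screen linked to this hotspot.
    display_id: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def on_drag_release(hotspots, px, py):
    """When a drag ends at (px, py), return the linked display the dragged
    object should move to, or None if the drop was outside every hotspot."""
    for hs in hotspots:
        if hs.contains(px, py):
            return hs.display_id
    return None
```

In this sketch the control software would, on a non-None result, stop displaying the object on the touchscreen and show it on the returned display.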
Advantageously, the processing device is arranged to control the touchscreen display device to display the or each hotspot comprising one or more icons, where the or each icon is linked to an object displayed by the respective display device. The hotspots can show icons representing the objects such as windows and open documents etc., and these can be used on the touchscreen to transfer a specific object back from a normal display device associated with the specific hotspot to the touchscreen. For example, if there are three objects on the associated normal display, then these will be represented by three icons in the hotspot. If the number of objects exceeds a specified amount such that there are too many objects to be displayed as icons, then the hotspot could be configured to bring up a menu when the user touches in the area of the hotspot, and the user can select from this the icon they wish to drag away from the hotspot, thereby bringing the linked object to the touchscreen.
Other configurations of the hotspot are possible. If there are a number of objects on an external (non-touchscreen) display, other possibilities exist for selecting one of these to move back to the touchscreen.
These might be a last-in-first-out algorithm that requires a drag from anywhere in the touchscreen, or a predefined gesture (for example a tap, double-tap, or circular gesture) inside the hotspot, which displays a menu or set of icons or thumbnails for each of the objects on the external display. The user can then select an object by dragging it away from the hotspot. These two have the advantage that the hotspots can be kept relatively small and uncluttered (or invisible) until they need to be used. Indeed, a combination of two or more of the above options is also a useful solution.
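The last-in-first-out option mentioned above could be modelled as a simple stack per external display. This is an illustrative sketch only; the class and method names are assumptions, not from the patent.

```python
class ExternalDisplay:
    """Tracks the objects that have been moved onto one normal screen,
    so that a drag from its hotspot retrieves the most recent one first."""

    def __init__(self):
        self._stack = []

    def push(self, obj):
        # Called when an object is "thrown" onto this display.
        self._stack.append(obj)

    def pop_most_recent(self):
        # Called when the user drags from the hotspot: last in, first out.
        return self._stack.pop() if self._stack else None
```

A menu-based configuration would instead present the whole of `_stack` as icons or thumbnails and let the user pick any entry.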
One possibility is to maximise the object window on the normal screen, so that there is only ever one window on a normal screen at a time, thereby making it easy to move that window back onto the touchscreen. In this document an "object" can be anything that is visually represented as an entity, such as windows, icons or other virtual objects that can be shown: basically anything which can be dragged. To grab a window back from a normal screen, the user simply uses a dragging motion from the hotspot back into the main part of the touchscreen. In this case, the touchscreen display device is arranged to receive a second defined user input for a displayed object, the processing device is arranged to control a display device to no longer display the object, and the processing device is arranged to control the touchscreen display device to display the object. The defined second user input can comprise moving an icon from a hotspot, and the processing device is arranged to control the touchscreen display device to display the respective object.
Additionally, a concept of momentum could be provided to distinguish between moving objects to and from other screens, and moving objects into that area of the touch screen. For example, slow dragging operations could be used to drag the object into the sides of the touchscreen, and fast dragging operations used to activate the hotspot and cause the object to move to the associated normal screen.
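The momentum concept above could, for instance, be reduced to a drag-speed threshold. This sketch is illustrative only; the function name and the threshold value are assumptions, not taken from the patent.

```python
def classify_drag(dx: float, dy: float, dt: float,
                  fast_threshold: float = 1500.0) -> str:
    """Distinguish a slow drag (position the object near the touchscreen
    edge) from a fast 'throw' (activate the hotspot and move the object
    to the associated normal screen). dx, dy in pixels; dt in seconds;
    fast_threshold in pixels/second is an illustrative value."""
    speed = ((dx * dx + dy * dy) ** 0.5) / dt
    return "throw" if speed >= fast_threshold else "position"
```

In practice the threshold would be tuned (or made user-configurable), since comfortable drag speeds vary between users and screen sizes.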
The user interface configuration of the touchscreen can also be such that double taps could be used to switch windows, i.e. move the window having focus in the touchscreen to the normal screen associated with the tapped hotspot, and move the window in that normal screen back to the touch screen. Double taps would prevent accidental activation of this feature, although single taps would also be possible.
More than one window could be moved into a normal screen, either using a last in first out algorithm and dividing up the normal screen to fit each new window as it is added (e.g. tile horizontally, vertically, or on a grid), or maximising each window on the normal screen over the top of any existing windows (e.g. a stack of windows). In one embodiment, the most recently added window might appear larger on the desktop, with the less recent windows appearing smaller. Other alternatives will be apparent to the skilled person. Effectively, the processing device is arranged to control the different display device to display the moved object as the primary object on the new display device.
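The "most recently added window appears larger" layout described above could be realised along these lines. All names and fractions here are illustrative assumptions; the patent does not prescribe a specific algorithm.

```python
def layout_with_primary(screen_w: int, screen_h: int, n: int,
                        primary_frac: float = 0.7):
    """Return n window rectangles (x, y, w, h) for an external screen.
    The most recent window fills primary_frac of the width; the n - 1
    older windows share a column tiled vertically on the right."""
    if n == 1:
        return [(0, 0, screen_w, screen_h)]
    pw = int(screen_w * primary_frac)
    rest_h = screen_h // (n - 1)
    rects = [(0, 0, pw, screen_h)]  # primary (most recent) window
    rects += [(pw, i * rest_h, screen_w - pw, rest_h) for i in range(n - 1)]
    return rects
```

The plain horizontal-tiling and stack-of-maximised-windows alternatives mentioned above would simply substitute a different rectangle calculation.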
A user can access the interface to configure the hotspots, for example to move them, resize them, or change their shape. This can be supported by the use of a multi-touch screen, allowing resizing with a multi-touch gesture, or changing shape (e.g. tracing the outline of a hotspot, or defining the two foci of an elliptical hotspot). An interlock for preventing hotspot reconfiguration happening accidentally can be provided.
Visually, the hotspots can advantageously be implemented as shapes in the background pattern, or as more transparent or less transparent areas (for example using the Windows Vista opacity control), or completely invisible so as to maintain a normal appearance for the main touchscreen. The hotspots could also be invisible during normal operation but appear when the control method detects a hotspot operation (for example a faster drag than normal, or a double tap).
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a system including multiple display devices,
Figure 2 is a schematic diagram of a touchscreen display device of the system of Figure 1,
Figure 3 is a further schematic diagram of the touchscreen display device,
Figure 4 is a schematic diagram of a conventional display device of the system of Figure 1, and
Figure 5 is a flowchart of a method of operating the system of Figure 1.
An example embodiment of a system is shown in Figure 1. The system comprises several display devices 12 and 14, one of which is a touchscreen display device 12, the remainder comprising standard LCD displays 14, which do not have any touchscreen functionality.
The touchscreen display device 12 is a device that, in addition to having normal display functionality, can also receive user input from one or more concurrent touches to its screen, as a form of user input. The system 10 also includes a processing device 16 arranged to control the output of the display devices 12 and 14, and a connection network 18 (using a common connection standard such as USB or Ethernet) connecting the display devices 12 and 14 to the processing device 16.
The system 10 is designed to harness the functionality of the display and touchscreen components in such a way that the user can easily manipulate the components that are being displayed by any of the display devices 12 and 14, without having to use complex interactions through a keyboard and/or a mouse. In the example of Figure 1, the user has access to the touchscreen display 12 and three further conventional display devices 14, but the system 10 could be configured with just a single touchscreen 12 and a single display 14, or with a larger number of displays 14.
The connection configuration used to join the components is not material to the system. The touchscreen 12 could have the processing component 16 in-built and connect to the conventional displays 14 directly via standards such as VGA, DVI, HDMI or a combination of these, or using general purpose local networks such as Ethernet, or indirectly via wide area networks such as the Internet. Indeed the display devices 14 need not be located within the same room as the touchscreen 12, but the example of Figure 1 is based around the conventional desktop computing environment familiar to all computer users.
As the user interacts with the system 10, a variety of objects will be shown by the individual display devices 12 and 14. These objects will be such things as windows and icons, as is currently conventional in desktop computing. The term object is used in a very general sense to describe any element that can be displayed and perceived by the user. As the display area of conventional display devices increases it is common for users to have multiple objects shown on a display device at one time, but this can lead to screen clutter and difficulty in maintaining efficiency, especially in situations where a user wishes to view two full-screen sized objects at the same time.
This is the principle behind using multiple display devices 12 and 14, as shown in Figure 1.
The system 10 provides a process for handling the objects displayed by the touchscreen display device 12. The processing device 16 is arranged to control the touchscreen display device 12 to display an object. When the touchscreen display device 12 receives a defined user input for the displayed object, the processing device 16 is arranged to control the touchscreen display device 12 to no longer display the object, and the processing device 16 is arranged to control a different display device 14 to display the object.
This concept is illustrated in more detail, with reference to Figure 2. The processing device 16 is arranged to control the touchscreen display device 12 to display one or more hotspots 20, each hotspot 20 linked to a different display device 14. The touchscreen display device 12 is displaying an object 22. A single object 22 is shown for clarity purposes, but it will be appreciated that in practical implementations of the system 10, the touchscreen display device 12 will be displaying multiple objects 22, some of which may be overlapping.
The user's hand and finger are shown schematically as the pointer 24 in the Figure. The defined user input comprises moving the object 22 to a hotspot 20, and the processing device 16 is arranged to control the respective different display device 14 to display the object 22. For the three-external-display setup of Figure 1, the touchscreen display device 12 shows an example of how the hotspots 20 are configured to correspond to the positions of the three display devices 14. In this example, the user is moving an object 22 to the right-hand display 14 by dragging it with their finger. Once the user has completed the dragging motion, the touchscreen display device 12 will no longer display the object 22, and it will now be displayed by the respective display 14. The sizing and position of the object 22 on the new display 14 can be configured in many different ways. One simple methodology is for the new display 14 to replicate the relative size and position of the object 22 as it was displayed by the touchscreen display device 12, before the dragging operation was performed by the user.
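The simple methodology of replicating relative size and position could be sketched as a coordinate scaling between the two screens. This fragment is illustrative only; the function name and rounding behaviour are assumptions.

```python
def map_relative(obj_rect, src_size, dst_size):
    """Scale an object's rectangle (x, y, w, h) from the source display's
    pixel space into the destination display's pixel space, preserving its
    relative position and relative size."""
    x, y, w, h = obj_rect
    sw, sh = src_size
    dw, dh = dst_size
    return (round(x * dw / sw), round(y * dh / sh),
            round(w * dw / sw), round(h * dh / sh))
```

For example, an object at (100, 100) sized 200 x 150 on a 1000 x 800 touchscreen would land at (200, 200) sized 400 x 300 on a 2000 x 1600 external display.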
The reverse operation can also be performed, as shown in Figure 3.
The touchscreen display device 12 is arranged to receive a second defined user input for a displayed object 22, in response to which the processing device 16 is arranged to control the respective display device 14 to no longer display the object 22, and to control the touchscreen display device 12 to display the object 22. In this example, the user is moving the object 22 from the middle external display 14 by dragging their finger 24 from the hotspot 20 into the main area of the multi-touch panel 12. This action causes the object 22 currently displayed by the display 14 to disappear from the external screen 14 and move with the user's gesture to a position on the multi-touch panel 12.
Many different implementations of the reverse operation (of bringing an object 22 back from an external display 14 to the touchscreen 12) are possible. In Figure 3, to assist the user, the processing device 16 is arranged to control the touchscreen display device 12 to display within the hotspot 20 one or more icons 26, where each icon 26 is linked to an object 22 being displayed by the respective display device 14. The defined second user input comprises moving the icon 26 from the hotspot 20, and the processing device 16 is arranged to control the touchscreen display device 12 to display the respective object 22. The icon 26 will disappear when this operation is completed.
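The icon-based reverse operation can be sketched as follows. Again this is an illustrative model, not the patent's implementation: the `Hotspot` class, `icons` and `drag_icon_off` names are assumptions.

```python
# Sketch of the reverse operation: each hotspot shows one icon per
# object on its linked display; dragging an icon off the hotspot
# returns the object to the touchscreen, and the icon disappears
# because it is derived from the linked display's object list.

class Hotspot:
    def __init__(self, display_objects, touchscreen_objects):
        self.display_objects = display_objects        # on linked display
        self.touchscreen_objects = touchscreen_objects

    def icons(self):
        # One icon per object currently on the linked display.
        return list(self.display_objects)

    def drag_icon_off(self, obj):
        """Second defined user input: move the icon out of the hotspot,
        bringing the object back to the touchscreen."""
        self.display_objects.remove(obj)
        self.touchscreen_objects.append(obj)

ext_objs, ts_objs = ["doc"], []
hs = Hotspot(ext_objs, ts_objs)
hs.drag_icon_off("doc")
print(hs.icons(), ts_objs)  # → [] ['doc']
```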
As mentioned above, when an object 22 is "sent" by a user from the touchscreen 12 to a different display 14, then the handling of the object 22 on that new display 14 must be determined. In one embodiment, the processing device 16 is arranged to control the different display device 14 to display the object 22 as the primary object 22, on the new display 14. This is shown in Figure 4.
In this diagram there is shown an example layout of an external "normal" screen 14. The user has dragged three objects 22 and 28 onto the screen 14. This example display algorithm sizes the most recently added window much larger than the others. The object 22 is the most recent object that the user has brought to this display device 14, and is sized accordingly as the primary object 22 on the display 14. If the user were to send a further object to the display 14, then the processing device 16 would resize the current objects displayed and present the newer object as the primary object on the screen.
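One possible form of this display algorithm is sketched below. The 70%/30% split and the column tiling are illustrative choices, not taken from the patent; only the rule that the most recently added object becomes the larger primary object comes from the text above.

```python
# Sketch of a primary-object layout: the newest object is sized much
# larger than the rest. Here it gets the left 70% of the screen and
# the older objects are tiled in the remaining right-hand column.

def layout(objects, width, height):
    """objects is ordered oldest-first; returns {name: (x, y, w, h)}."""
    if not objects:
        return {}
    primary, rest = objects[-1], objects[:-1]
    rects = {primary: (0, 0, int(width * 0.7), height)}
    if rest:
        col_x = int(width * 0.7)
        col_w = width - col_x
        row_h = height // len(rest)
        for i, name in enumerate(rest):
            rects[name] = (col_x, i * row_h, col_w, row_h)
    return rects

r = layout(["a", "b", "c"], 1000, 600)
print(r["c"])  # → (0, 0, 700, 600)
```

Sending a further object is then just re-running `layout` with the new object appended, which resizes the current objects and promotes the newcomer to primary, as the passage above describes.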
The method of operating the system 10 is summarised in Figure 5. The method comprises the steps of: firstly, at step S1, controlling the touchscreen display device 12 to display the object 22; receiving (step S2) the defined user input for the displayed object 22; controlling (step S3) the touchscreen display device 12 to no longer display the object 22; and finally (step S4) controlling a different display device 14 to display the object 22.

Claims (16)

1. A system comprising * two or more display devices, at least one display device comprising a touchscreen display device, * a processing device arranged to control the output of the display devices, and * a connection network connecting the display devices to the processing device, wherein the processing device is arranged to control the touchscreen display device to display an object, the touchscreen display device is arranged to receive a defined user input for the displayed object, the processing device is arranged to control the touchscreen display device to no longer display the object, and the processing device is arranged to control a different display device to display the object.
2. A system according to claim 1, wherein said display devices comprise a single touchscreen display device and one or more non-touchscreen display devices.
3. A system according to claim 1 or 2, wherein the processing device is arranged to control the different display device to display the object as the primary object.
4. A system according to any preceding claim, wherein the processing device is arranged to control the touchscreen display device to display one or more hotspots, the or each hotspot linked to a different display device.
5. A system according to claim 4, wherein the defined user input comprises moving the object to a hotspot, and the processing device is arranged to control the respective different display device to display the object.
6. A system according to claim 4 or 5, wherein the processing device is arranged to control the touchscreen display device to display the or each hotspot comprising one or more icons, the or each icon linked to an object displayed by the respective display device.
7. A system according to any preceding claim, wherein the touchscreen display device is arranged to receive a second defined user input for a displayed object, the processing device is arranged to control a display device to no longer display the object, and the processing device is arranged to control the touchscreen display device to display the object.
8. A system according to claims 6 and 7, wherein the defined second user input comprises moving an icon from a hotspot, and the processing device is arranged to control the touchscreen display device to display the respective object.
9. A method of operating a system, said system comprising two or more display devices, at least one display device comprising a touchscreen display device, a processing device arranged to control the output of the display devices, and a connection network connecting the display devices to the processing device, the method comprising the steps of * controlling the touchscreen display device to display an object, * receiving a defined user input for the displayed object, * controlling the touchscreen display device to no longer display the object, and * controlling a different display device to display the object.
10. A method according to claim 9, wherein said display devices comprise a single touchscreen display device and one or more non-touchscreen display devices.
11. A method according to claim 9 or 10, and further comprising controlling the different display device to display the object as the primary object.
12. A method according to claim 9, 10 or 11, and further comprising controlling the touchscreen display device to display one or more hotspots, the or each hotspot linked to a different display device.
13. A method according to claim 12, wherein the defined user input comprises moving the object to a hotspot, and controlling the respective different display device to display the object.
14. A method according to claim 12 or 13, and further comprising controlling the touchscreen display device to display the or each hotspot comprising one or more icons, the or each icon linked to an object displayed by the respective display device.
15. A method according to any one of claims 9 to 14, and further comprising receiving a second defined user input for a displayed object, controlling a display device to no longer display the object, and controlling the touchscreen display device to display the object.
16. A method according to claims 14 and 15, wherein the defined second user input comprises moving an icon from a hotspot, and controlling the touchscreen display device to display the respective object.
GB0714543.6A 2007-07-26 2007-07-26 A system comprising a touchscreen and one or more conventional display devices Expired - Fee Related GB2451274B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0714543.6A GB2451274B (en) 2007-07-26 2007-07-26 A system comprising a touchscreen and one or more conventional display devices
PCT/GB2008/002536 WO2009013499A2 (en) 2007-07-26 2008-07-23 A system comprising a touchscreen and one or more conventional displays

Publications (3)

Publication Number Publication Date
GB0714543D0 GB0714543D0 (en) 2007-09-05
GB2451274A true GB2451274A (en) 2009-01-28
GB2451274B GB2451274B (en) 2013-03-13

Family

ID=38512880

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0714543.6A Expired - Fee Related GB2451274B (en) 2007-07-26 2007-07-26 A system comprising a touchscreen and one or more conventional display devices

Country Status (2)

Country Link
GB (1) GB2451274B (en)
WO (1) WO2009013499A2 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9182937B2 (en) 2010-10-01 2015-11-10 Z124 Desktop reveal by moving a logical display stack with gestures
US8930846B2 (en) 2010-10-01 2015-01-06 Z124 Repositioning applications in a stack
JP5792424B2 (en) 2009-07-03 2015-10-14 ソニー株式会社 MAP INFORMATION DISPLAY DEVICE, MAP INFORMATION DISPLAY METHOD, AND PROGRAM
US8749484B2 (en) 2010-10-01 2014-06-10 Z124 Multi-screen user interface with orientation based control
US20130132885A1 (en) * 2011-11-17 2013-05-23 Lenovo (Singapore) Pte. Ltd. Systems and methods for using touch input to move objects to an external display and interact with objects on an external display
US10768803B2 (en) 2015-09-21 2020-09-08 Motorola Solutions, Inc. User interface system with active and passive display spaces
EP3472806A4 (en) 2016-06-17 2020-02-26 Immersive Robotics Pty Ltd Image compression method and apparatus
US11429337B2 (en) 2017-02-08 2022-08-30 Immersive Robotics Pty Ltd Displaying content to users in a multiplayer venue
AU2018280337B2 (en) 2017-06-05 2023-01-19 Immersive Robotics Pty Ltd Digital content stream compression
TW201935927A (en) 2017-11-21 2019-09-01 澳大利亞商伊門斯機器人控股有限公司 Frequency component selection for image compression
CN111699693A (en) 2017-11-21 2020-09-22 因默希弗机器人私人有限公司 Image compression for digital reality
CN114327315A (en) * 2020-09-28 2022-04-12 北京小米移动软件有限公司 Display data transmission system, method, electronic device, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6573913B1 (en) * 1997-01-27 2003-06-03 Microsoft Corporation Repositioning and displaying an object in a multiple monitor environment
US20030227423A1 (en) * 2002-06-07 2003-12-11 Nec-Mitsubishi Electronic Visual Systems Corporation Multi-display control system and image display apparatus
WO2007020408A1 (en) * 2005-08-13 2007-02-22 Displaylink (Uk) Limited A display system and method of operating a display system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6545669B1 (en) * 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US7231609B2 (en) * 2003-02-03 2007-06-12 Microsoft Corporation System and method for accessing remote screen content

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110004821A1 (en) * 2009-07-02 2011-01-06 Sony Corporation Information processing apparatus and information processing method
CN102754057A (en) * 2009-12-23 2012-10-24 诺基亚公司 Method and apparatus for display device
EP2517091A1 (en) * 2009-12-23 2012-10-31 Nokia Corp. Method and apparatus for display device
EP2517091A4 (en) * 2009-12-23 2013-11-06 Nokia Corp Method and apparatus for display device
US9588673B2 (en) 2011-03-31 2017-03-07 Smart Technologies Ulc Method for manipulating a graphical object and an interactive input system employing the same
EP2691836A1 (en) * 2011-03-31 2014-02-05 SMART Technologies ULC Manipulating graphical objects in a multi-touch interactive system
EP2691836A4 (en) * 2011-03-31 2014-10-22 Smart Technologies Ulc Manipulating graphical objects in a multi-touch interactive system
EP2880518A4 (en) * 2012-08-01 2016-03-02 Google Inc Sharing a digital object
FR2996912A1 (en) * 2012-10-17 2014-04-18 Airbus Operations Sas DEVICE AND METHOD FOR REMOTE INTERACTION WITH A DISPLAY SYSTEM
US9652127B2 (en) 2012-10-17 2017-05-16 Airbus Operations (S.A.S.) Device and method for remote interaction with a display system
EP2738669A1 (en) * 2012-11-29 2014-06-04 BlackBerry Limited System and method for graphic object management in a large display area computing device
US9513795B2 (en) 2012-11-29 2016-12-06 Blackberry Limited System and method for graphic object management in a large-display area computing device
EP3731068A4 (en) * 2017-12-19 2021-05-12 Sony Corporation Information processing system, information processing method, and program
US11112961B2 (en) 2017-12-19 2021-09-07 Sony Corporation Information processing system, information processing method, and program for object transfer between devices

Also Published As

Publication number Publication date
GB2451274B (en) 2013-03-13
WO2009013499A2 (en) 2009-01-29
GB0714543D0 (en) 2007-09-05
WO2009013499A3 (en) 2009-03-12


Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20210726