GB2536090A - Multi-user information sharing system

Multi-user information sharing system

Info

Publication number
GB2536090A
GB2536090A GB1519274.3A GB201519274A
Authority
GB
United Kingdom
Prior art keywords
user
touch
platform
users
touch platform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1519274.3A
Other versions
GB201519274D0 (en)
Inventor
Panjalingam Pillay Venkateshwara
Maroothynaden Jason
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Collaboration Platform Services Pte Ltd
Original Assignee
Collaboration Platform Services Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Collaboration Platform Services Pte Ltd filed Critical Collaboration Platform Services Pte Ltd
Publication of GB201519274D0 publication Critical patent/GB201519274D0/en
Publication of GB2536090A publication Critical patent/GB2536090A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/34User authentication involving the use of external additional devices, e.g. dongles or smart cards
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/103Workflow collaboration or project management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A multi-touch platform 102A comprises a touchscreen 108A configured to display a common workspace 109A for use by multiple users 114 to collaborate on one or more data items in a communication session. The multi-touch platform displays, for each of the users, a user interface 140 in the common workspace, the user interface indicating where each user is positioned at the platform and comprising selectable options for each user to access and work on at least one of the data items. The multi-touch platform detects, in the common workspace, one or more touchscreen gestures associated with performing one or more operations on at least one of the data items. The multi-touch platform processes the data items according to the operations and displays those data items in the common workspace based on the touchscreen gestures. The touchscreen may be embedded in a table top to form a touchscreen table. The personal user interfaces 140 may include a menu interface allowing access to applications or data items. Remote users may access the communication session via a mobile device external to the multi-touch platform.

Description

MULTI-USER INFORMATION SHARING SYSTEM
Technical Field
[0001] The present invention relates to a multi-user information sharing system. In particular, the present invention relates to a multi-user information sharing system comprising a multi-touch platform for displaying a common workspace used by multiple users at the same time for collaborating on a plurality of data items.
Background
[0002] Conventional personal computers, workstations, mobile devices, monitors, multi-touch tables and multi-touch wall-mounted displays are designed for operation by only one main user. Data is presented in an orientation that is readable by a single user, which is typically inconvenient for others.
[0003] For multi-touch tables, conventional display screens are rectangular, such that the surrounding structure ("wrap-around" or periphery of the table) that accommodates the multi-touch table is also rectangular. This leads users to convene around one edge (usually the longest) and also limits the platform to one user per edge.
[0004] Collaboration is the act of working with someone to produce something together. Collaborative working is being augmented by the use of personal computers, mobile devices to create "blended interaction" spaces. In these spaces, digital computing is blended with natural work practices and collaboration. Examples of blended interaction range from digital pen & paper and pen & multi-touch interaction to tangible displays in rooms of mixed or augmented reality.
[0005] Typically, collaborative working is a problem solving activity involving two or more persons or users. However, any person who uses their device to provide an explanation to other people or users is in effect collaborating exclusively with their own device, while the other people or users remain passive observers. By their very nature, such working environments are not collaborative enough, which is reflected in poor team performance and inefficient team-based problem solving.
[0006] With the emergence of big data, it is no longer possible for an individual to digest the amount of data presented. Greater emphasis is being placed on productive teamwork, but current personal and even mobile computing devices have not been designed to support this activity. In such scenarios, team members again interact with their own respective screens (or a part thereof, in the case of large wall-mounted displays), which does not encourage team members to collaborate with each other.
[0007] With the invention of multi-touch mobile devices, there has been a desire to find new ways of communicating in boardrooms and meeting rooms to inspire, motivate and improve team performance.
[0008] WO2011/082477 describes a standalone multi-touch table with a common work zone that runs one application (e.g. a lesson) for students to enter answers and confirm placement of objects when playing puzzles, answering questions, etc. However, this standalone table does not encourage collaboration between users of the table and users of external devices who wish to collaborate with those at the table.
[0009] US2014/0223334 describes a collaboration server for managing security and data concurrency for a digital whiteboard system that facilitates multiple simultaneous users having access to global collaboration data stored on a whiteboard database. The digital whiteboard system includes several walls with multi-touch devices that accept touch gestures as user input during a session of collaboration. Even though the whiteboard system allows collaboration between users, each user faces their own wall or "drawing region" while working, which restricts the flexibility of expression and collaboration in multi-user scenarios.
[0010] Over the past few years, there has been considerable interest in enhancing collaboration between teams and in allowing users to collaborate with each other efficiently and naturally using collaborative platforms and systems; this remains an ongoing problem.
[0011] The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known collaborative platforms and systems.
Summary
[0012] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
[0013] Methods, multi-touch platforms, mobile devices and servers are provided for a plurality of users to collaborate on a plurality of data items or a set of data items in a communication session with a multi-touch platform. The set of data items are uploaded from a server to the multi-touch platform. The multi-touch platform includes a processor and memory connected to a touch screen that is configured to display a common workspace for use by multiple users positioned at the multi-touch platform. The multi-touch platform displays, for each of the multiple users, a user interface indicating where each user is positioned at the platform and comprising selectable options for each user to access and work on at least one of the data items.
[0014] The multi-touch platform detects, from a touch screen associated with the multi-touch platform, one or more touch screen gestures in the common workspace from users positioned at the multi-touch platform, the one or more gestures associated with performing one or more operations on at least one of the data items. The multi-touch platform processes the data items according to the one or more operations based on the touch screen gestures, and displays the data items in the common workspace based on the corresponding touch screen gesture. The multiple users use the common workspace to perform further work on the set of data items.
[0015] According to a first aspect of the invention, there is provided a method for a plurality of users to collaborate on a set of data items in a communication session of a multi-touch platform, the multi-touch platform comprising a touch screen configured to display a common workspace for use by multiple users at the multi-touch platform. The method, performed by the platform, comprises: displaying, for each of the users at the multi-touch platform, a user interface in the common workspace, the user interface indicating where each user is positioned at the platform and comprising selectable options for each user to access and work on at least one of the data items; receiving, by the touch screen, one or more touch screen gestures in the common workspace from users positioned at the multi-touch platform, wherein the one or more touch screen gestures are associated with performing one or more operations on at least one of the data items; processing each of the one or more data items according to the operations of each associated received touch screen gesture; and displaying the data items in the common workspace based on the corresponding touch screen gesture, wherein the one or more users use the common workspace to perform further work on the set of data items.
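By way of illustration only, the following Python sketch models the method of the first aspect at a very high level: one user interface per user positioned in the common workspace, gesture-driven operations applied to data items, and the updated items made available for display. The class names, fields and set of operations are assumptions made purely for this example and do not represent the claimed implementation.

```python
# Non-limiting sketch: a minimal model of the first-aspect method. Class names,
# fields and the set of operations are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class DataItem:
    name: str
    content: str = ""

@dataclass
class UserInterface:
    user: str
    position: tuple                                  # where the user is positioned at the platform
    options: tuple = ("open", "edit", "transfer")    # selectable options

class MultiTouchPlatform:
    def __init__(self, data_items):
        # Common workspace shared by all users: item name -> DataItem
        self.workspace = {item.name: item for item in data_items}
        self.interfaces = {}                         # user -> UserInterface displayed in the workspace

    def display_user_interface(self, user, position):
        """Display a user interface for a user at their position in the common workspace."""
        self.interfaces[user] = UserInterface(user, position)

    def handle_gesture(self, user, operation, item_name, text=""):
        """Process a touch screen gesture as an operation on a data item and redisplay it."""
        item = self.workspace.get(item_name)
        if item is None or user not in self.interfaces:
            return None
        if operation == "edit":
            item.content += text
        # ...other operations (open, transfer, etc.) would be handled here...
        return item                                  # the updated item is redisplayed in the workspace

platform = MultiTouchPlatform([DataItem("agenda.txt")])
platform.display_user_interface("alice", position=(0.5, 1.0))
platform.display_user_interface("bob", position=(1.0, 0.5))
print(platform.handle_gesture("alice", "edit", "agenda.txt", text="1. Budget review"))
```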
[0016] Preferably, the method includes, in response to a corresponding touch screen gesture associated with one or more data items, transferring one or more data items from the multi-touch platform to a device associated with a user of the plurality of users.
[0017] Preferably, the method further comprises receiving, by the multi-touch platform from a device associated with one of the plurality of users, one or more further data items for the one or more users to work on in the common workspace.
[0018] Preferably, the plurality of users may include one or more local users positioned at the multi-touch platform and one or more remote users remotely accessing the communication session via one or more devices.
[0019] Preferably, the device associated with the user is a mobile device of the user, wherein the mobile device is configured to access the communication session and comprises a mobile touch screen configured for displaying a user workspace associated with the communication session, the user workspace configured for the user to perform one or more further operations on one or more of the data items.
[0020] Preferably, the device associated with the user is a further multi-touch platform associated with the user, the further multi-touch platform including a touch screen configured to display a common workspace and a user interface associated with that user in the common workspace of the further multi-touch platform.
[0021] Preferably, at least one of the selectable options of the user interface is for executing an application for accessing at least one of the data items, the method further comprising detecting one or more users selecting said at least one selectable option and executing the application associated with said at least one selectable option in the common workspace.
[0022] Preferably, the method includes displaying, for each of the users that are remote to the multi-touch platform, a user interface in the common workspace, the user interface for the remote user indicating the status of the remote user.
[0023] Preferably, the method further includes detecting, on the touch screen associated with the multi-touch platform, one of the touch screen gestures to be a touch screen gesture for transferring one or more of the data items from a first user of the multiple users to a second user of the multiple users; determining a direction of the detected touch screen gesture; identifying that the direction of the touch screen gesture is a direction associated with the second user; and transferring the one or more data items to the second user during the communication session.
[0024] Preferably, detecting the touch screen gesture for transferring the one or more data items further comprises detecting the first user touching the touch screen with an object of the first user in the vicinity of said one or more data items and, prior to the first user removing the object from the touch screen, detecting the object moving in a direction associated with the second user.
[0025] Preferably, determining the direction associated with the second user comprises determining, from the touch screen gesture, a vector comprising a direction pointing towards the second user, wherein the second user is located at the multi-touch platform.
[0026] Preferably, determining the direction associated with the second user further comprises determining the direction of the touch screen gesture is towards an indicator associated with the second user.
[0027] Preferably, the direction associated with the second user may be based on releasing the object from the touch screen when the object is in the vicinity of an indicator associated with the second user.
[0028] Preferably, the object intersects the indicator associated with the second user. As an option, the indicator associated with the second user is a user interface on the touch screen representing the second user.
[0029] Preferably, the multi-touch platform comprises a communication interface for communicating with a device associated with the second user, wherein transferring the one or more data items to the second user further comprises transferring the one or more data items via the communication interface to the device associated with the second user.
[0030] Preferably, where the first user is also the second user and has a device external to the multi-touch platform, determining the direction of the detected touch screen gesture comprises determining that the direction is a direction associated with the first user, and transferring the one or more data items further comprises transferring the one or more data items to the device of the first user.
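Purely as an illustration of how the direction of a transfer gesture might be matched to the second user, the following sketch computes a unit vector from the drag and compares it, by cosine similarity, with the directions towards the users positioned around the platform. The coordinate convention, threshold and names are assumptions for this example only.

```python
# Non-limiting sketch: resolving a drag gesture's direction to the nearest user
# positioned around the platform. Coordinates, threshold and names are assumptions.
import math

def unit_vector(start, end):
    """Unit vector pointing from 'start' to 'end'."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy) or 1.0
    return (dx / length, dy / length)

def resolve_target_user(start, end, user_positions, min_cosine=0.8):
    """Return the user whose direction best matches the gesture direction, if any."""
    direction = unit_vector(start, end)
    best_user, best_score = None, min_cosine
    for user, position in user_positions.items():
        towards_user = unit_vector(start, position)            # direction from drag start to the user
        score = direction[0] * towards_user[0] + direction[1] * towards_user[1]  # cosine similarity
        if score > best_score:
            best_user, best_score = user, score
    return best_user

# Example: users positioned at the top and right edges of a unit-square workspace;
# a flick from the centre towards the right edge is resolved to "bob".
users = {"alice": (0.5, 1.0), "bob": (1.0, 0.5)}
print(resolve_target_user((0.5, 0.5), (0.8, 0.5), users))      # -> bob
```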
[0031] Preferably, the method may include receiving a request from a user to join the communication session, authenticating the user prior to joining the communication session, and, on authenticating the user, allocating and displaying a user interface associated with the user on the common workspace.
[0032] Preferably, the touch screen is further configured to display a common graphical object, wherein authenticating further comprises: detecting a user using a touch screen gesture to move the common graphical object to a position on the common workspace; authenticating the user via said common workspace; allocating a user interface to the authenticated user for the communication session; and displaying the user interface in the common workspace.
[0033] Preferably, the multi-touch platform may include a plurality of designated locations spaced around the periphery of the multi-touch platform, the method further comprising: determining, from the request, that the authenticated user requests to be placed at a designated location, allocating one of the designated locations to the authenticated user, and displaying the user interface at a position in the common workspace in the vicinity of the designated location.
[0034] Preferably, the method further including: determining, from the request, that the authenticated user selects a location by moving the common graphical object to the position in the common workspace, allocating a user interface to the authenticated user, and displaying the user interface at the vicinity of the selected position in the common workspace.
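By way of example only, the sketch below outlines one possible join flow in which a user is authenticated and then allocated a user interface either at one of the designated locations spaced around the platform's periphery or at a location the user selected by moving the common graphical object. The credential check and the location labels are placeholders, not features of the claimed system.

```python
# Non-limiting sketch: authentication followed by allocation of a user interface
# location. The credential check and location labels are placeholders only.
class SessionManager:
    DESIGNATED_LOCATIONS = {"north": (0.5, 1.0), "east": (1.0, 0.5),
                            "south": (0.5, 0.0), "west": (0.0, 0.5)}

    def __init__(self, credentials):
        self.credentials = credentials                 # user -> secret (placeholder check)
        self.free_locations = dict(self.DESIGNATED_LOCATIONS)
        self.interfaces = {}                           # user -> position of their interface

    def join(self, user, secret, requested=None, selected_position=None):
        """Authenticate the user, then allocate and display their user interface."""
        if self.credentials.get(user) != secret:
            raise PermissionError("authentication failed")
        if selected_position is not None:              # user dragged the common object here
            position = selected_position
        else:                                          # otherwise use a designated location
            key = requested if requested in self.free_locations else next(iter(self.free_locations))
            position = self.free_locations.pop(key)
        self.interfaces[user] = position               # interface displayed near this point
        return position

manager = SessionManager({"alice": "secret", "bob": "pass"})
print(manager.join("alice", "secret", requested="east"))          # -> (1.0, 0.5)
print(manager.join("bob", "pass", selected_position=(0.2, 0.9)))  # -> (0.2, 0.9)
```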
[0035] Preferably, the method further including: determining the user requests to join the communication session via a mobile device external to the multi-touch platform; associating the mobile device address with the authenticated user and the user interface associated with the authenticated user; and displaying the user interface for the user of the mobile device in the common workspace, wherein the user interface for the user of the mobile device indicates the status of the remote user.
[0036] Preferably, the method further including: determining the user requests to join the communication session via another multi-touch platform; associating the address of the other multi-touch platform with the authenticated user and the user interface associated with the user; and displaying the user interface in the common workspace, wherein the user interface for the user indicates the status of the users at the other multi-touch platform.
[0037] Preferably, the method further including: initiating the communication session on request by a first user of the plurality of users for starting the collaboration using the multi-touch platform; and providing users of the communication session access to the set of data items via the multi-touch platform, wherein the set of data items are associated with the first user.
[0038] Preferably, the multi-touch platform may be configured to wirelessly communicate with one or more devices. Preferably, at least one of the plurality of users has a mobile device, the method further including: detecting that the mobile device of said at least one user is in the presence of the multi-touch platform; and connecting to the mobile device of said at least one user for authenticating said at least one user and to log the at least one user into the multi-touch platform for joining the communication session. Preferably, the mobile device may be a wearable device and the multi-touch platform wirelessly communicates with the mobile device using near-field communication (NFC) signals.
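The following fragment is only a schematic illustration of such proximity-based login: when the platform detects that a known wearable or mobile device is nearby, the associated user is authenticated and logged into the communication session. No particular short-range radio technology or API is implied, and the detection callback and token check are assumptions for this example.

```python
# Non-limiting sketch: automatic login when a known device is detected near the
# platform. The detection callback and token check are placeholders only.
class ProximityLogin:
    def __init__(self, known_devices, authenticate):
        self.known_devices = known_devices   # device id -> user name
        self.authenticate = authenticate     # callable(user, token) -> bool

    def on_device_detected(self, device_id, token):
        """Called when a device comes into range of the multi-touch platform."""
        user = self.known_devices.get(device_id)
        if user is not None and self.authenticate(user, token):
            return f"{user} joined the communication session"
        return None                          # unknown device or failed authentication

login = ProximityLogin({"watch-01": "alice"}, lambda user, token: token == "valid-token")
print(login.on_device_detected("watch-01", "valid-token"))   # -> alice joined the communication session
```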
[0039] Preferably, the multi-touch platform further comprises a mechanism for rotating the touch screen associated with the multi-touch platform around an axis in the plane of the touch screen from a first position to a second position, the method further comprising: detecting an input from a user to move the touch screen associated with the multi-touch platform from the first position to the second position; rotating the touch screen from the first position to the second position around the axis.
[0040] Preferably, the axis in the plane of the touch screen divides the touch screen into a first and a second portion, wherein the mechanism for rotating the touch screen associated with the multi-touch platform rotates the first portion of the touch screen from a first position to a second position around the axis.
[0041] Preferably, the method further including: detecting whether multiple users of the plurality of users are to work on a data item of the set of data items substantially simultaneously; providing each of the multiple users with a copy of the data item; receiving one or more of the copies of the data item worked on by the multiple users; determining the changes between each copy of the data item and the original data item; generating a temporary data item by merging the original data item with the changes made to the one or more copies of the data item; and verifying the changes are acceptable to the multiple users prior to finalising the temporary data item.
[0042] Preferably, the method further comprising: detecting whether multiple users of the plurality of users are to work on a data item of the set of data items substantially simultaneously; allowing each of the multiple users to work on the data item serially; and verifying the changes are acceptable to the multiple users prior to finalising the data item.
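As a purely illustrative example of the copy, compare, merge and verify steps described above, the sketch below performs a simple line-based merge of the copies worked on by multiple users and only finalises the result if every user approves it. The line-based diff is an assumption chosen for brevity; any suitable merge strategy could be used.

```python
# Non-limiting sketch: merge copies of a data item edited by several users and
# finalise the result only when the changes are acceptable to all of them.
import difflib

def merge_copies(original, copies):
    """Collect lines added in each user's copy and append them to the original."""
    merged = original.splitlines()
    for copy in copies:
        diff = difflib.ndiff(original.splitlines(), copy.splitlines())
        merged += [line[2:] for line in diff if line.startswith("+ ")]
    return "\n".join(merged)

def finalise(original, copies, users, approve):
    """Generate a temporary merged item; keep it only if every user approves it."""
    temporary = merge_copies(original, copies)
    return temporary if all(approve(user, temporary) for user in users) else original

original = "item 1\nitem 2"
copies = ["item 1\nitem 2\nitem 3",      # first user's copy
          "item 1\nitem 2\nitem 4"]      # second user's copy
print(finalise(original, copies, ["alice", "bob"], lambda user, text: True))
```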
[0043] According to a second aspect of the invention, there is provided a multi-touch platform including a processor, a touch screen, a communications interface and a memory, the processor being connected to the touch screen, the communications interface and the memory, wherein the memory comprises computer instructions stored thereon which, when executed by the processor, cause the processor to perform the method as described herein.
[0044] Preferably, the multi-touch platform may include the touch screen embedded in a table top.
[0045] Preferably, the platform may further comprise a plurality of designated locations spaced around the periphery of the table top for placing one or more users, wherein each of the plurality of designated locations is identified by an indicator.
[0046] Preferably, the table top, from a plan view, comprises a rectangular shape with chamfered corners.
[0047] According to a third aspect of the invention there is provided a computer readable medium including computer instructions stored thereon, which when executed by a processor, causes the processor to perform the method(s) and/or process(es) as described herein.
[0048] According to a fourth aspect of the invention, there is provided a method for operating a server apparatus in communication with a multi-touch platform as herein described, where the server apparatus comprises storage for storing the set of data items.
[0049] Preferably, the method further includes transferring the one or more data items from a first user to a second user during the communication session.
[0050] Preferably, the second user may be one of the users with a mobile device external to the multi-touch platform, and transferring the one or more data items to the second user further comprises transferring the one or more data items to the mobile device of the second user.
[0051] Preferably, the first user may be one of the users with a mobile device external to the multi-touch platform, wherein transferring the one or more data items further comprises transferring the one or more data items to the mobile device of the first user.
[0052] Preferably, the method may further include authenticating one or more users of the plurality of users requesting to join the communication session.
[0053] Preferably, the method may further include: receiving a request to upload the set of data items to the multi-touch platform for initiating the communication session on request by a first user of the plurality of users; and uploading to the multi-touch platform the set of data items, wherein the set of data items are associated with the first user.
[0054] According to a fifth aspect of the invention, there is provided a server including a processor, a communications interface and a memory, the processor being connected to the communications interface and the memory, wherein the memory comprises computer instructions stored thereon which, when executed by the processor, cause the processor to perform the method(s) and/or process(es) as described herein.
[0055] According to a sixth aspect of the invention, there is provided a method for a mobile device to enable a user to collaborate with a plurality of users on a set of data items in a communication session with a multi-touch platform as described herein, the mobile device including a touch screen configured to display a user workspace and a communication interface for communicating with the multi-touch platform. The method, performed by the mobile device, includes: detecting a touch screen gesture on the user workspace associated with one or more of the data items from the user, wherein the touch screen gesture is for transferring said one or more data items from the user to one of the plurality of users or the multi-touch platform; determining a direction of the detected touch screen gesture; identifying that the direction of the touch screen gesture is associated with one of the plurality of users or the multi-touch platform; and transferring the one or more data items from the mobile device to one of the plurality of users or the multi-touch platform during the communication session.
[0056] Preferably, detecting the touch screen gesture for transferring one or more data items may further include detecting the first user touching the touch screen with an object of the first user in the vicinity of said one or more data items and, prior to the first user releasing the object from the touch screen, detecting the object moving in a direction associated with one of the plurality of users or the multi-touch platform.
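As an illustration only, the following sketch shows how a mobile device might package such a transfer gesture into a message for the multi-touch platform, leaving it to the platform (or a server) to resolve the recipient from the gesture direction. The message fields and JSON framing are assumptions made for this example and do not define any particular protocol.

```python
# Non-limiting sketch: packaging a directional transfer gesture detected on the
# mobile device as a message to the multi-touch platform. Field names are assumed.
import json

def build_transfer_message(session_id, sender, item_ids, drag_start, drag_end):
    """Describe the dragged items and the gesture; the recipient is resolved remotely."""
    return json.dumps({
        "type": "transfer_request",
        "session": session_id,
        "from": sender,
        "items": list(item_ids),
        "gesture": {"start": list(drag_start), "end": list(drag_end)},
    })

message = build_transfer_message("session-42", "alice", ["report.docx"],
                                 drag_start=(0.2, 0.5), drag_end=(0.9, 0.5))
print(message)   # this string would be sent over the communication interface
```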
[0057] Preferably, the method may further include sending a request to the multi-touch platform requesting the user joins the communication session.
[0058] Preferably, the method may further include: notifying the multi-touch platform that the user of the mobile device is in the presence of the multi-touch platform; connecting to the multi-touch platform for authenticating the user of the mobile device and to log the user into the multi-touch platform to join the communication session.
[0059] Preferably, the mobile device wirelessly communicates with the multi-touch platform using near-field communication (NFC) signals.
[0060] Preferably, the method may further include: receiving a copy of a data item in response to the multi-touch platform detecting whether multiple users of the plurality of users are to work on a data item of the plurality of data items substantially simultaneously; and transmitting the copy of the data item worked on by the user of the mobile device to the multi-touch platform for generating a data item comprising the merged content of the original data item and the copy of the data item worked on by the multiple users.
[0061] Preferably, the method may further include: receiving a copy of a data item in response to the multi-touch platform detecting whether multiple users of the plurality of users are to work on a data item of the plurality of data items substantially simultaneously, wherein each of the multiple users is allowed to work on the data item serially; transmitting the copy of the data item to the multi-touch platform after the user has worked on the copy of the data item; and verifying the changes are acceptable to the user and notifying the multiple users prior to the multi-touch platform finalising the data item.
[0062] According to a seventh aspect of the invention, there is provided a mobile device including a processor, a communications interface, a memory and a touch screen, the processor being connected to the communications interface, the memory and the touch screen, wherein the memory includes computer instructions stored thereon which, when executed by the processor, cause the processor to perform the method(s) and/or process(es) as described herein.
[0063] The methods described herein may be performed by software in machine readable form on a tangible storage medium or tangible computer readable medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible (or non-transitory) storage media include disks, thumb drives, memory cards etc. and do not include propagated signals. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
[0064] This acknowledges that firmware and software can be valuable, separately tradable commodities. It is intended to encompass software, which runs on or controls "dumb" or standard hardware, to carry out the desired functions. It is also intended to encompass software which "describes" or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
[0065] The preferred features may be combined as appropriate, as would be apparent to a skilled person, and may be combined with any of the aspects of the invention.
Brief Description of the Drawings
[0066] Embodiments of the invention will be described, by way of example, with reference to the following drawings, in which:
[0067] Figure 1a is a schematic diagram showing an example multi-user information sharing system with a multi-touch platform and common workspace according to the invention;
[0068] Figure 1b(i) is a schematic diagram showing the example multi-touch platform with user interfaces for each user in the common workspace according to the invention for use with the multi-user information sharing system of figure 1a;
[0069] Figure 1b(ii) is a schematic diagram showing an example user logon process for the multi-touch platform of figure 1b(i);
[0070] Figure 1b(iii) is a schematic diagram showing an example process for changing user positions at the multi-touch platform of figure 1b(i);
[0071] Figure 1b(iv) is a schematic diagram showing another example of a user interface of the multi-touch platform of figure 1b(i);
[0072] Figure 1b(v) is a schematic diagram showing another example multi-touch platform;
[0073] Figure 1b(vi) is a schematic diagram showing a further example multi-touch platform;
[0074] Figure 1c is a schematic diagram showing an example user interface for the multi-touch platform according to the invention;
[0075] Figures 1d(i) and 1d(ii) are schematic diagrams showing further example multi-touch platforms according to the invention for use with the multi-user information sharing system of figures 1a-1c;
[0076] Figures 1d(iii), 1d(iv), 1d(v) and 1d(vi) are schematic diagrams showing example touch screen gestures for multi-touch platforms and mobile devices of figures 1a-1d(ii);
[0077] Figure 1e is a schematic diagram showing another example of the multi-user information sharing system of figure 1a;
[0078] Figure 1f(i) is a schematic diagram showing an example multi-touch platform control system for use in the example multi-user information sharing system of figures 1a-1e;
[0079] Figure 1f(ii) is a schematic diagram showing an example multi-touch platform control system for use with the multi-touch platform of figure 1b(i);
[0080] Figures 1g(i)-(iv) are schematic diagrams showing the example multi-touch platform of figure 1a(ii) with different numbers of users;
[0081] Figure 2a is a schematic diagram showing an example multi-touch platform according to the invention;
[0082] Figure 2b(i) is a flow diagram showing an example method for operating the multi-touch platforms of figures 1a-2a;
[0083] Figure 2b(ii) is a flow diagram showing an example method for transferring data items between mobile devices and a multi-touch platform according to any of figures 1a-2a;
[0084] Figure 2b(iii) is a flow diagram showing an example method for detecting a touch screen gesture for transferring one or more data items from an example multi-touch platform according to any of figures 1a-2a;
[0085] Figure 2c is a flow diagram showing an example method for initiating a communication session using any of the multi-touch platforms of figures 1a-2a;
[0086] Figure 2d is a flow diagram showing an example method of working on a file using any of the multi-touch platforms of figures 1a-2a;
[0087] Figure 2e is a flow diagram showing an example method of working on a file using any of the multi-touch platforms of figures 1a-2a;
[0088] Figure 3 is a schematic diagram of an example server according to the invention for use with the multi-user information sharing system;
[0089] Figure 4 is a schematic diagram of an example mobile device according to the invention for use with the multi-user information sharing system;
[0090] Figure 5 is a schematic diagram showing another example multi-user information sharing system according to the present invention;
[0091] Figures 6a(i) and (ii) are schematic diagrams showing examples of software clients for a "multi-client multi-server" based multi-user information sharing system according to the present invention;
[0092] Figure 6b is a flow diagram showing an example mobile client interfacing with an example common client of the multi-user information sharing system of figures 6a(i) and (ii);
[0093] Figure 6c(i) is a schematic diagram showing an example multi-touch platform according to the invention;
[0094] Figure 6c(ii) is a schematic diagram showing another example multi-touch platform according to the invention;
[0095] Figure 6d is a flow diagram showing an example common client of the multi-user information sharing system of figures 6a(i) and (ii);
[0096] Figure 7 is a flow/schematic diagram showing another example "multi-client multi-server" multi-user information sharing system according to the present invention;
[0097] Figures 8a(i) and (ii) are schematic diagrams showing an example adjustable multi-touch platform according to the invention;
[0098] Figures 8b(i) and (ii) are schematic diagrams showing another example adjustable multi-touch platform according to the invention;
[0099] Figure 8c is a schematic diagram showing the components of the control mechanism for adjusting the multi-touch platform of figures 8a(i) to 8b(ii);
[00100] Figure 9 is a schematic diagram of a further example multi-user information sharing system according to the invention; and
[00101] Figure 10 is a schematic diagram of another example multi-user information sharing system according to the invention.
[00102] Common reference numerals are used throughout the figures to indicate similar features.
Detailed Description
[00103] Embodiments of the present invention are described below by way of example only. These examples represent the best ways of putting the invention into practice that are currently known to the Applicant although they are not the only ways in which this could be achieved. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
[00104] The inventors have found that it is possible to enhance collaboration between team members or users by providing multiple user management, control, transfer and manipulation of data items such as file(s) using a server-client multi-sharing environment that includes a common workspace for multiple users to work on the data items. Figure 1a is a schematic diagram showing an example multi-user information sharing system 100 allowing users to effectively and efficiently collaborate together.
[00105] The multi-user information sharing system 100 includes multi-touch platforms 102a-102b (e.g. multi-sharing touch screen tables, or touch screen mobile devices, etc.), a server 104, one or more mobile devices 106a, one or more wearable devices 106b, and/or one or more input devices such as one or more styluses 106c. The multi-touch platforms 102a-102b are arranged to be in communication over a communication network (not shown) with the server 104, the mobile device(s) 106a, the wearable device(s) 106b, and/or the one or more styluses 106c.
[00106] The multi-touch platform 102a comprises a display device 108a (e.g. a touch screen, or multi-touch screen and the like) configured for displaying a common workspace 109a and for receiving input from one or more users of a plurality of users that may be spaced around the periphery of the common workspace 109a of the multi-touch platform 102a. The common workspace 109a is shown in figure 1a(i) to be displayed on the surface of the display device 108a, which is preferably a touch screen interface or touch screen. The common workspace 109a is shown in figure 1a(i) to cover a large proportion of the displayable and/or usable surface of the display device 108a. For example, the entire usable and/or displayable/viewable surface of display device 108a is used for displaying the common workspace 109a. Although the common workspace 109a is shown in the following examples to cover a large proportion of the displayable surface of the display device 108a, it is to be appreciated by a person skilled in the art that the common workspace 109a may be of any suitable size that allows multiple users to work on data items in the common workspace via the display device 108a. Using the common workspace 109a, multiple users of the plurality of users may collaborate and work on a set of data items (e.g. a plurality of files) during a communication session with the multi-touch platform 102a. The set of data items may include at least one data item or a plurality of data items that may be stored in local storage at the multi-touch platform 102a and/or stored on one or more servers, distributed storage and/or cloud storage solutions and accessible at the multi-touch platform 102a via the communication network.
[00107] The multi-touch platform 102a may also include a communication interface for communicating over the communication network with at least one of the plurality of users with a mobile device 106a, wearable device 106b or stylus 106c external to the multi-touch platform 102a. In addition, the multi-touch platform 102a may be in communication with one or more users of the plurality of users that are located at another multi-touch platform 102b, which also includes a display device 108b such as a touch screen interface that displays a common workspace 109b in a similar manner as the common workspace 109a of the multi-touch platform 102a in which the users located at the multi-touch platform 102a can collaboratively work on the set of data items in the communication session.
[00108] In order to collaboratively work on at least one data item from the set of data items, each user may be allocated a user interface (not shown) by the multi-touch platform 102a for, among other things, accessing applications and the set of data items. The applications may be used simultaneously by the users in the common workspace 109a when working on one or more data items of the set of data items. A user interface for each user provides the advantage of the common workspace 109a being simultaneously used by the users around the multi-touch platform 102a while they are working on at least one data item from the set of data items. Each user does not have to wait on another user to complete a task in the common workspace 109a when working on one or more data items.
[00109] The user interface is displayed in the common workspace 109a at a designated location in the common workspace 109a, e.g. the locations may be spaced around the common workspace 109a allowing the users to be positioned around the multi-touch platform 102a. The multi-touch platform 102a may control the allocation of a designated location or position in the common workspace 109a during or after the user has logged onto the platform 102a. Alternatively or additionally, users may select a location or position in the common workspace 109a from which they would like to work on at least one data item from the set of data items. This allows users to position or re-position themselves around the multi-touch platform 102a closer to one or more other users to form a sub-group of users (e.g. a "huddle" of users) at the multi-touch platform 102a while simultaneously using the common workspace 109a to work on similar or the same one or more data items.
[00110] In any event, a user interface for each user positioned around the multi-touch platform 102a is displayed in the common workspace 109a. The user interface is allocated to the user and displayed in the common workspace 109a at the designated or selected location at the time the user logs onto the multi-touch platform 102a. The user interface may include a menu interface for allowing the user to access a set of applications and/or third party software and/or at least one data item from the set of data items for working on said at least one data item simultaneously using an application in the common workspace 109a. The users can simultaneously work in the common workspace 109a and simultaneously access one or more applications and/or file managers for accessing and working on at least one data item in the common workspace 109a. The user interface may be a minimalist interface that is displayed in a position in the common workspace 109a indicating where the user should be positioned at the multi-touch platform 102a and provides access to applications and at least one data item. In other examples, the user interface may include a dashboard style interface or one or more notification icons or images for notifying the user of new data items, messages, emails, conference calls, instant messages, etc. In these examples, the user interface may be a "stripped-down" interface that still allows users to access applications and one or more data items but requires the users to simultaneously use the common workspace 109a for working on one or more data items.
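By way of illustration only, the sketch below captures one possible minimal representation of such a per-user interface: a position in the common workspace, a menu giving access to applications and data items, and a list of notification icons. The field names and menu entries are assumptions made for this example.

```python
# Non-limiting sketch: a minimal per-user interface displayed in the common
# workspace. Field names and menu entries are assumptions for illustration.
from dataclasses import dataclass, field

@dataclass
class UserInterfaceState:
    user: str
    position: tuple                                        # location in the common workspace
    menu: list = field(default_factory=lambda: ["file manager", "applications", "data items"])
    notifications: list = field(default_factory=list)      # e.g. new messages or shared data items

    def notify(self, text):
        """Add a notification icon/message without opening a separate personal workspace."""
        self.notifications.append(text)

ui = UserInterfaceState("alice", position=(0.1, 0.9))
ui.notify("New data item shared by bob")
print(ui.menu, ui.notifications)
```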
[00111] The common workspace 109a provides the advantage that all users around the multi-touch platform 102a can simultaneously use the common workspace 109a while working on or with at least one data item and/or on one or more data items from the set of data items. This leads to a more natural method for users to interact with electronic data and allows the users around the multi-touch platform 102a to efficiently collaborate and rapidly share information associated with the set of data items with each other and/or remote users.
Restricting the users positioned at the multi-touch platform 102a to all work simultaneously within a common workspace 109a reduces the burden and frustration that can be caused by users working in separate user workspaces. This also forces users to collaborate and prevents users from focussing on separate or "personalised" user workspaces and thus not contributing to the overall effort or decision making process at the multi-touch platform 102a.
The multi-touch platform 102a with the common workspace 109a provides an electronic collaboration forum that is an open public space that encourages users to collaborate effectively and efficiently.
[00112] A communication network may comprise or represent any one or more network(s) used for enabling communication between one or more multi-touch platform(s), one or more mobile device(s), one or more wearable device(s), one or more server(s) and/or other computing system(s), device(s) and platform(s) and the like that may be connected and/or have access to the network. The communication network infrastructure may also comprise or represent any one or more network(s), access points, one or more network nodes, entities, elements, application servers, servers, base stations, ground stations, gateway earth stations, satellites, gateway servers, policy nodes, network infrastructure equipment or other network devices or elements that are linked, coupled or connected to form a communication network. The coupling or links between network nodes, access points and other network infrastructure equipment may be wired or wireless (for example, Wi-Fi communication links, radio communications links, optical fibre, etc.). The communications network and network infrastructure may include any suitable combination of core network(s) and radio access network(s) including network nodes or entities, base stations, ground stations, gateway earth stations, satellites, gateway servers, access points, etc. that enable communications between the one or more multi-touch platform(s), mobile device(s), wearable device(s), server(s) and/or other computing system(s), device(s) and platform(s) and the like, network nodes of the communication network and network infrastructure, and/or other devices connecting and/or accessing the network.
[00113] Examples of a communication network or network that may be used in certain embodiments of the described apparatus, methods and systems may be, by way of example but not limited to, at least one communication network or combination thereof including, but not limited to, one or more wired and/or wireless communication network(s), one or more telecommunications network(s), one or more core network(s), one or more radio access network(s), one or more terrestrial communication network(s), one or more satellite communication network(s), one or more computer networks, one or more data communication network(s), the Internet, the telephone network, wireless network(s) such as Worldwide Interoperability for Microwave Access (WiMAX), near-field communication (NFC) systems, wireless local area networks (WLAN) based on the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards e.g. Wi-Fi networks, or Internet Protocol (IP) networks, packet-switched networks or enhanced packet-switched networks, IP Multimedia Subsystem (IMS) networks, or communications networks based on, by way of example but not limited to, wireless, cellular or satellite technologies such as mobile networks, Global System for Mobile Communications (GSM), GPRS networks, Wideband Code Division Multiple Access (W-CDMA), CDMA2000 or Long Term Evolution (LTE)/LTE Advanced networks or any 2nd, 3rd or 4th Generation and beyond type communication networks and the like. The terms communication network and network as described herein may be used interchangeably.
[00114] A mobile device 106a may comprise or represent any device or equipment used for communications over a communications network. Examples of a mobile device that may be used in certain embodiments of the described apparatus, method(s), system(s), network(s), multi-user information sharing system(s), and/or multi-touch platform(s) are wireless devices such as mobile phones, terminals, smart phones, portable computing devices such as laptops, handheld devices, tablets, netbooks, computers, personal digital assistants and other wireless devices that may connect to a communications network, and/or wired or fixed communication devices such as telephones, computing devices such as desktop computers, set-top boxes, televisions, or other devices that may connect to a communications network.
[00115] Some example mobile devices described herein are mobile devices 106a that include a processor, memory, communication interface and touch-screen display or mobile touch screen connected to each other and configured to communicate over a network with a multi-touch platform for allowing a user to join and collaborate in a communication session at a multi-touch platform 102a.
[00116] The mobile device may be configured to access the communication session in which the touch screen display is configured for displaying a user workspace associated with the communication session. The user workspace may be configured to receive user input for the user to perform further work on one or more of the data items of the set of data items received at the mobile device or further data items. It is to be appreciated by the skilled person that any device capable of receiving user input, displaying/outputting a communication session and connecting to a multi-touch platform 102a over a communication network may be suitable and used as the mobile device 106a.
[00117] The inventors have found that it is possible to enhance collaboration between users by having a multi-touch platform 102a that uses a common workspace 109a rather than separate "personalised" workspaces for each user when working on data items during a communication session. The common workspace 109a promotes a more natural manner for users to exchange information and collaborate and/or work together on one or more data items and to control the transfer of data items between users. This provides the advantage of an efficient and improved method of information transfer between users of the system and efficient transfer of one or more data items between users, mobile devices and/or multi-touch platforms.
[00118] For example, the multi-touch platform 102a is configured to simultaneously receive via the display device 108a and common workspace 109a touch screen gestures from multiple users of the multi-touch platform 102a working on one or more data items and to display changes to the data items in the common workspace. This allows users of the multi-touch platform 102a to view the data items and/or changes to data items, further access the data items, and make further changes etc., and thus achieve a common result faster and more efficiently.
[00119] The multi-touch platform 102a may be configured to detect or receive, via the display device 108a, one or more touch screen gestures in the common workspace 109a from one or more users at the multi-touch platform 102a working on the set of data items, where the one or more touch screen gestures are associated with at least one of the data items. Each of the data items is processed based on the corresponding touch screen gestures associated with the data item, whereby the multi-touch platform 102a displays the data items in the common workspace based on the corresponding touch screen gesture. The one or more users use the common workspace to perform further work on the set of data items.
[00120] For example, in response to a corresponding touch screen gesture associated with transferring one or more data items, the one or more data items (or copies thereof) are transferred or transmitted from the multi-touch platform 102a to a device 102b or 106a associated with a user of the plurality of users. This allows users spaced around the multi-touch platform 102a with external devices to work on the transmitted data items via their devices. This also allows remote users of the plurality of users that have remotely joined the communication session via an associated device 102b or 106a to work on the transmitted data items and collaborate with the one or more users around the periphery of the multi-touch platform 102a. In addition, the multi-touch platform 102a is configured to receive one or more data items from a device 102b or 106a associated with one of the plurality of users (e.g. a user at multi-touch platform 102a with an external device 106a, a remote user at multi-touch platform 102b or a remote user using a mobile device 106a) for working on in the common workspace by one or more users at multi-touch platform 102a.
[00121] The plurality of users may include one or more local users positioned at the multi-touch platform 102a, one or more local users positioned at the multi-touch platform 102a with external devices 105a for accessing the communication session, and/or one or more remote users remotely accessing the communication session at the multi-touch platform 102a via one or more other devices 102b or 106a. For example, the device 102b or 106a may be a multi-touch platform 102b or a mobile device 106a, respectively.
[00122] The mobile device 106a may be configured to access the communication session at the multi-touch platform 102a and includes a mobile touch screen configured for displaying a user workspace associated with the communication session. The user workspace may be configured for a user to perform further work on one or more of the data items using the mobile device 106a. Additionally or alternatively, the device may be another multi-touch platform 102b with a touch screen display device 108b displaying a common workspace 109b around which one or more of the remote users are spaced and can access the communication session at the multi-touch platform 102a and work on one or more of the data items via the communication network.
[00123] In another example, the data items may be, by way of example only but not limited to, one or more files or any data representative of digital data. A first user of the plurality of users may perform a touch screen gesture associated with transferring one or more of the data items or a selection of one or more data items from the first user to a second user of the plurality of users. The multi-touch platform 102a determines the direction of the detected touch screen gesture, and identifies that the direction of the touch screen gesture is associated with the second user. Based on this association, the one or more data items are transferred to the second user during the communication session.
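A minimal sketch, assuming screen coordinates for the gesture origin and for each user interface, of how the direction of a drag or "swish" gesture might be resolved to the second user. The angular tolerance and the data structures are illustrative only.

```python
import math
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class UserInterface:
    user_id: str
    x: float
    y: float

def resolve_transfer_target(origin: Tuple[float, float],
                            velocity: Tuple[float, float],
                            interfaces: List[UserInterface],
                            max_angle_deg: float = 30.0) -> Optional[str]:
    """Return the user whose interface lies closest to the gesture direction."""
    gesture_angle = math.atan2(velocity[1], velocity[0])
    best_user, best_diff = None, math.radians(max_angle_deg)
    for ui in interfaces:
        angle_to_ui = math.atan2(ui.y - origin[1], ui.x - origin[0])
        # Smallest angular difference, normalised to [0, pi].
        diff = abs(math.atan2(math.sin(angle_to_ui - gesture_angle),
                              math.cos(angle_to_ui - gesture_angle)))
        if diff < best_diff:
            best_user, best_diff = ui.user_id, diff
    return best_user  # None if no interface falls within the angular tolerance
```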
[00124] Transferring the one or more data items between users may comprise or represent any method or process of communicating said data items from one user to another user. Examples of transferring data items between users that may be used in certain embodiments of the described apparatus, method(s), system(s), network(s), multi-user information sharing system(s), and/or multi-touch platform(s) include, by way of example only but not limited to, copying, moving or transferring said data from data storage accessible by a first user to data storage accessible by a second user; sending from a first user a link or shortcut pointing to said data in a data storage accessible by the first and/or second user to enable the second user to access said data; the data storage accessible by the first and/or second users may be associated with one or more multi-touch platforms 102a or 102b, one or more servers 104, one or more mobile devices 106b and may be accessible by the first and/or second users over a communication network.
[00125] The multi-touch platform 102a is configured to track the location of users around the multi-touch platform 102a or the address of a mobile device 106a or other multi-touch platform 102b of each of the plurality of users in the communication session from the time each user joins the communication session. Each user joins the communication session by logging into the multi-touch platform 102a by providing login credentials allowing the multi-touch platform 102a to authenticate each user to ensure that user is permitted to join the communication session. The multi-touch platform 102a, on authenticating each user, allocates a user interface to each user positioned at the multi-touch platform 102a, which allows each user to access one or more applications and at least one data item of the set of data items for working on in the common workspace 109a.
[00126] The multi-touch platform 102a may then determine or allocate and store a user identity based on the login credentials of each user and associate the user identity with the location of the user and associated user interface around the multi-touch platform 102a and/or the address of the mobile device 106a or other multi-touch platform 102b to enable the multi-touch platform 102a to communicate with each user, allow each user to work on one or more data items (e.g. files or data) associated with the communication session, communicate with users in the communication session, and/or allow users to transfer one or more data items to other users that have joined the communication session at the multi-touch platform either locally or remotely.
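The following is a simplified, hypothetical sketch of a session registry that stores a user identity together with either a workspace location (for local users) or a device address (for remote users). The authenticate() stub stands in for whatever credential check an actual platform would use; all names are invented for this example.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

def authenticate(credentials: Dict[str, str]) -> str:
    # Stub only: a real platform would verify credentials against an identity store.
    if not credentials.get("username") or not credentials.get("password"):
        raise PermissionError("invalid login credentials")
    return credentials["username"]

@dataclass
class SessionUser:
    user_id: str
    location: Optional[Tuple[float, float]] = None   # workspace position of the user interface
    address: Optional[str] = None                    # IP address / subscriber number for remote users

class CommunicationSession:
    def __init__(self) -> None:
        self.users: Dict[str, SessionUser] = {}

    def join_local(self, credentials: Dict[str, str],
                   location: Tuple[float, float]) -> SessionUser:
        user = SessionUser(authenticate(credentials), location=location)
        self.users[user.user_id] = user
        return user

    def join_remote(self, credentials: Dict[str, str], address: str) -> SessionUser:
        user = SessionUser(authenticate(credentials), address=address)
        self.users[user.user_id] = user
        return user
```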
[00127] A user may log into the multi-touch platform 102a either a) manually using the touch screen 108a via the common workspace 109a associated with the multi-touch platform 102a; b) remotely via a wearable device 106c (e.g. using a Near-far communications reader); c) via a mobile device 106a that can access the multi-touch platform 102a over a communications network (e.g. Wi-Fi, Internet, Intranet, telecommunications network(s), or other communication network); and/or d) via another multi-touch platform 102b that is connected to the multi-touch platform 102a and that can access the multi-touch platform 102a over a communications network (e.g. Wi-Fi, Internet, Intranet, telecommunications network(s), or other communication network).
[00128] The multi-user information sharing system 100, multi-touch platform 102a, and/or multi-touch platform 102b and/or associated mobile devices 106a may be applied in many situations and for a variety of scenarios from, by way of example only but not limited to, real-time collaboration between users and remote users on various projects, project or operations management through to command and control center(s) for monitoring and controlling response teams such as remote response teams or personnel in the field (e.g. emergency response, search and rescue, maintenance teams, operations management, project management etc.).
[00129] For example, a multi-user information sharing system 100 may include one or more multi-touch platform(s) 102a for use in a maritime environment as command and control center(s) with local and/or remote users. A shipping operator may have a command and control center with a multi-touch platform 102a for monitoring and displaying the status of a fleet of ships via the common workspace 109a in which a plurality of mobile devices 106a is distributed amongst the fleet of ships. The ships may be displayed on the common workspace 109a, for example, a map of the world or region may be displayed with symbols representing the locations of each ship in the ship fleet. An operator may use the touch screen 108a to select a ship to determine the status of the ship, communicate with the ship or perform any other operation associated with the ship etc. Each ship may have one or more remote mobile devices 106a. Each ship may also have one or more remote multi-touch platform(s) 102b around which the command crew of the ship may communicate with the command and control center via the corresponding common workspace 109b over, by way of example but not limited to, a satellite communication network when the ship is in transit or in port and/or terrestrial communication network when the ship is in port.
[00130] One or more local operators of the multi-touch platform 102a of the command and control center may be using the multi-touch platform 102a to monitor the progress of various ships in port or in transit between ports. The ships may transmit, by way of example only but not limited to, actionable data, intelligence data, progress reports, or incident reports or any other data required by the command and control center from the multi-touch platform(s) 102b and/or mobile devices 106a on board the ships. Given the challenges for shipping operators of operating ships in the maritime environment, ranging from, by way of example but not limited to, weather, weather related disasters, and the status and well-being of the crew through to piracy, it is important for all personnel on ships and/or in ports to be able to collaborate on actionable data or intelligence (e.g. data items) in real-time to make informed and actionable decisions.
[00131] For example, in a piracy scenario a crew member of a ship may spot an unidentified vessel moving towards their ship. As a remote user, the crew member may use their remote mobile device 106a to capture an image (e.g. a photo) of the unidentified vessel and transmit a message containing the image as a data item to the remote multi-touch platform 102b operated by the command crew of their ship. The message and image may be displayed as an alert in the common workspace 109b of the remote multi-touch platform 102b allowing the command crew to immediately assess the image in real-time. The command crew may then send the alert as a message to the command and control center multi-touch platform 102a of the shipping operator for further analysis or request for assistance. The command and control center multi-touch platform 102a may display the alert or message linked to the symbol of the ship displayed on the map, and which, depending on the priority of the message, e.g. an alert, may display the message and image of the unidentified vessel, and other information, immediately. This allows one or more operators of the command and control center multi-touch platform 102a to analyse and respond to the message/alert in real time.
[00132] The operator may identify the unidentified vessel as a hostile vessel (e.g. a pirate vessel) and send a response message to the command crew of the ship via the multi-touch platform 102a (e.g. "we'll send help"). The operator may then send a request for assistance and the image of the unidentified vessel to another command and control center multi-touch platform 102b operated by the coast guard or military personnel near where the ship is located. This allows the coast guard or military personnel to mobilise their craft (e.g. ships, helicopters or aircraft) to assist the ship against the unidentified vessel. Thus, in this example, the multi-user information sharing system 100 allows real-time communication of actionable intelligence that enables actionable decisions to be made to assist remote personnel in real-time.
[00133] In another scenario, the multi-user information sharing system 100 may be used by various organisations that have distributed assets and/or remote resources such as, by way of example only, oil and gas organisations that have various oil and gas equipment, platforms, and/or pipelines distributed across the world. For example, the multi-touch platform 102a may be used as a command and control center by personnel at the headquarters of the oil and gas organisation, which allows personnel to be informed on all aspects of the organisation/business such as, by way of example only but not limited to, project management, operations management, maintenance management and/or emergency response.
[00134] For example, in an emergency/maintenance response scenario, an alert may be raised by an oil pressure drop at one of the platforms or pipelines associated with the organisation and displayed on the common workspace 109a to the command and control personnel. An operator may identify the platform or pipeline and send a message to the remote users in the field such as maintenance crew on the platform/pipeline to investigate the possible leak. This message may be received by the platform/pipeline crew on their mobile devices 106a and prompt them to investigate in real time.
[00135] A platform/pipeline crew member of the platform/pipeline may spot the leak and capture an image and/or report of the leak (e.g. actionable data/intelligence). This may be sent via their mobile device 106a over a communication network (e.g. a satellite or terrestrial communication network) to the command and control multi-touch platform 102a. The image and/or report may be assessed by the command and control personnel at the multi-touch platform 102a and, in response, the command and control personnel may deploy assistance (e.g. emergency leak response and/or maintenance teams) to assist the platform/pipeline crew in fixing and cleaning up the leak. This allows one or more personnel of the command and control center multi-touch platform 102a to analyse and respond appropriately to the messages/alerts/status reports in real time.
[00136] Although two example scenarios have been described, it is to be appreciated by the skilled person that the multi-user information sharing system 100 and associated multi-touch platform(s) 102a, 102b, and mobile devices 106a may be deployed and used in many other organisations such as by way of example only but not limited to, research organisations and/or universities, conference organisations, emergency search and rescue and/or response organisations, government and/or civil service organisations, police organisations, defence and/or military organisations, security organisations, private organisations or companies, oil and gas organisations, shipping and distribution operators, vehicle hire fleet operators, and/or any other organisation that requires local and remote users to collaborate in real-time, and further used in many situations and scenarios where real-time collaboration between local users (e.g. command and control center users) and remote users (e.g. users in the field or remotely located to the command and control center users) allows actionable decisions and responses to be made efficiently and effectively. This further optimises the collaboration and deployment of personnel and/or resources.
[00137] Figure 1b(i) is a schematic diagram illustrating the multi-touch platform 102a in which multiple users 114a-114g have logged into the multi-touch platform 102a and have each been allocated a user interface 140a-140g displayed in the common workspace 109a. In this example there are six local users 114a-114f and one remote user 114g. The remote user 114g may have joined the communication session via another multi-touch platform 102b or via a mobile device 106a in communication with multi-touch platform 102a. Each user 114a-114f at the multi-touch platform 102a may be allocated a user interface 140a-140f by the multi-touch platform 102a for, among other things, accessing applications and the set of data items. The applications may be used simultaneously by the users in the common workspace 109a when working on one or more data items of the set of data items.
[00138] For the users 114a-114f positioned around the multi-touch platform 102a, the user interfaces 140a-140f allow each user to access applications and data items, and work on and process data items from a set of data items associated with the communication session.
The user interface 140a-140f may comprise a menu interface and/or a graphical icon representing the user. For the remote user 114g, the user interface 140g that is displayed on the common workspace 109a may comprise a notification interface and/or a graphical icon representing the user 114g, in which the notification interface informs the other users 114a-114f of the current status (e.g. connected, not connected, do not disturb, busy, away, in meeting etc.) of user 114g. For example, an image of each user may be displayed in each user interface 140a-140f allowing the other users 114a-114f to visually authenticate each other at the multi-touch platform 102a. An image of the remote user 114g may be displayed in user interface 140g allowing the other users 114a-114f to visually authenticate the remote user during a video conference.
[00139] For each user 114a-114f manually logging into the multi-touch platform 102a via the touch screen 108a, the multi-touch platform 102a may determine and store a user identity associated with the user 114a-114f based on the user's login credentials. The multi-touch platform 102a may also associate or register the user identity with the position or location of the user interface 140a-140f in the common workspace 109a for each user 114a-114f positioned around the multi-touch platform 102a.
[00140] For example, the position of the user interface 140a-140f that is allocated to that user 114a-114f at the time the user logs into the multi-touch platform 102a may be selected by the user 114a-114f. Additionally or alternatively, the multi-touch platform 102a may allocate each user 114a-114f a position spaced around the common workspace 109a, which displays the user interface 140a-140f in the position where the corresponding user 114a-114f should stand when working in the common workspace 109a at the multi-touch platform 102a. The positions of the user interfaces 140a-140f may be changed by the user 114a-114f by dragging the user interface 140a-140f to a new location in the common workspace 109a.
[00141] Figure 1b(ii) is a schematic diagram of the multi-touch platform 102a of figure 1b(i) showing an example login procedure for a first user 114a logging onto the multi-touch platform 102a. The first user 114a may be starting or initiating a communication session on the multi-touch platform 102a and may upload a set of data items from the server 104 that are then accessible via the user interface 140a in the common workspace 109a of the multi-touch platform 102a. As well, the first user may also upload user preferences for configuring the common workspace 109a according to the tasks envisaged by the first user and/or based on the context of the uploaded set of data items etc.
[00142] The data items can be displayed in the common workspace 109a on the display device 108a for manipulation (e.g. sharing, opening, reading, annotating etc.) during the communication session. Other users (not shown) may join the communication session by logging into the multi-touch platform 102a in which the multi-touch platform 102a allocates corresponding user interfaces 140b-140f. These allow the users to manipulate and work on (e.g. open, view, read, and/or annotate and save/write) one or more of the associated data items in the set of data items already uploaded. The other users may also upload further data items to the multi-touch platform 102a at the beginning or during the communication session.
[00143] In this example, the multi-touch platform 102a includes a login token 115 displayed as a graphical icon within the common workspace 109a on the touch screen 108a of the multi-touch platform 102a. The first user 114a may log onto the multi-touch platform 102a by inputting a touch screen gesture via the touch screen 108a using an object 111a (e.g. the user's hand or a stylus or other input device) to touch the login token 115 in the common workspace 109a in the vicinity of point A and drag the object 111a, whilst still touching the touch screen 108a, from position A to position B in the common workspace 109a.
[00144] The multi-touch platform 102a determines, in response to detecting the touch screen gesture of the user touching the login token and dragging the object 111a along path C to position B, on release of the object 111a that the first user 114a wishes to log on. The multi-touch platform 102a may then prompt the first user for their login credentials (e.g. via an input window etc.). On providing the login credentials, the multi-touch platform 102a allocates a user interface 140a to the first user 114a such that the first user 114a may use the common workspace 109a.
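Purely as an illustration, a drag-to-log-on interaction of this kind might be handled along the following lines. The class name, hit radius and callback names are invented for the sketch, and the credential prompt and interface allocation are left as comments rather than presented as the platform's actual behaviour.

```python
import math
from typing import Optional, Tuple

class LoginTokenHandler:
    """Tracks a drag of the login token from touch-down (position A) to release (position B)."""

    def __init__(self, token_pos: Tuple[float, float], hit_radius: float = 40.0) -> None:
        self.token_pos = token_pos
        self.hit_radius = hit_radius
        self.dragging = False

    def on_touch_down(self, x: float, y: float) -> None:
        # Start a drag only if the touch lands on (or near) the login token.
        self.dragging = math.hypot(x - self.token_pos[0],
                                   y - self.token_pos[1]) <= self.hit_radius

    def on_touch_up(self, x: float, y: float) -> Optional[dict]:
        if not self.dragging:
            return None
        self.dragging = False
        # At this point the platform would prompt for login credentials and, once they
        # are authenticated, display the user interface in the vicinity of position B.
        return {"allocate_interface_at": (x, y)}
```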
[00145] The multi-touch platform 102a displays the user interface 140a at position D in the vicinity of the position B. It is to be appreciated by the skilled person that position D may correspond with position B of the touch screen gesture of the first user 114a. The multi-touch platform 102a stores an association between the user interface 140a and the identity of the first user 114a and tracks the position of the user interface 140a in the common workspace 109a. The position of the user interface 140a in the common workspace 109a allows the multi-touch platform 102a to determine the most likely location around the common workspace 109a that the first user 114a is expected to be. The locations of the user interfaces 140a-140g can be used to assist the multi-touch platform 102a in determining where to display one or more data items that may be transferred to/from the first user 114a and other users in the communication session at the multi-touch platform 102a.
[00146] Although a "dragging" type of touch screen gesture has been described to logon to the multi-touch platform 102a, this is by way of example only, it is to be appreciated by the skilled person that other touch screen gestures may be used to logon to the multi-touch platform 102a. For example, the user may simply touch the login token 115 displayed in the common workspace 109a, whereby the multi-touch platform 102a may then prompt the first user for their login credentials (e.g. via an input window etc.). The first user 114a may then provide their login credentials. On authenticating their credentials, the multi-touch platform 102a determines a position for the first user 114a in the common workspace 109a and allocates a user interface 140a to the first user 114a, which is displayed in the common workspace 109a at the position determined by the multi-touch platform 102a.
[00147] Although the above logon procedure(s) have been described, by way of example only, with reference to the first user 114a, it is to be appreciated by the skilled person that the logon procedure may be applied to further users 114b-114g wishing to logon and join the communication session at the multi-touch platform 102a.
[00148] For example, subsequent users 114b-114f may also log in to the multi-touch platform 102a and join the communication session by dragging, via a touch screen gesture, the login token 115 into subsequent positions in the common workspace 109a and providing their login credentials. For each subsequent user, the multi-touch platform 102a may also store an association between the user interface 140b-140f (e.g. the position of the user interface 140b-140f) and the corresponding user 114b-114f, which allows the multi-touch platform 102a to locate the most likely position around the common work space 109a of the subsequent users 114b-114f. This assists in the transfer of one or more data items (e.g. files and/or data) to each subsequent user.
[00149] Mobile or remote users with mobile devices may be invited to join the communication session by users 114a-114f positioned at the multi-touch platform 102a. For example, the mobile user 114g may have a mobile application installed on their mobile device 106a that is configured to collaborate with the users 114a-114f at the multi-touch platform 102a. One of the users 114a-114f may send a request to mobile user 114g via email or other communication means (e.g. via the mobile application) to join a communication session with login details (e.g. IP address of multi-touch platform and/or login credentials for the communication session etc.).
[00150] The mobile user 114g may use the mobile application to connect to the multi-touch platform 102a (e.g. by typing the IP address of the multi-touch platform 102a and/or the login credentials etc.). A handshake protocol may be performed where the mobile application sends a request, which may include the mobile IP address or subscriber identity number etc., to connect to the multi-touch platform 102a. The multi-touch platform 102a accepts the request based on the login credentials etc., and associates the mobile IP address with the mobile user 114g and a user interface 140g, which is displayed in the common workspace 109a as an icon or image of the user 114g. The user interface 140g associated with the mobile user 114g may be used by users 114a-114f at the multi-touch platform 102a to transfer one or more data items (e.g. files and/or data etc.) to that mobile user 114g or to receive one or more data items from that mobile user 114g.
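A hedged sketch of such a handshake from the mobile application's side, assuming a simple JSON-over-TCP exchange. The message fields, port and response format are assumptions for this sketch and do not reflect any specific protocol used by the platform.

```python
import json
import socket
from typing import Dict

def request_to_join(platform_ip: str, port: int,
                    credentials: Dict[str, str], device_address: str) -> dict:
    """Send a join request carrying the device address and credentials, then read the reply."""
    request = json.dumps({
        "type": "join_request",
        "device_address": device_address,     # mobile IP address or subscriber identity
        "credentials": credentials,
    }).encode("utf-8")
    with socket.create_connection((platform_ip, port), timeout=5) as sock:
        sock.sendall(request + b"\n")
        reply = sock.makefile("r", encoding="utf-8").readline()
    # Hypothetical reply, e.g. {"accepted": true, "user_interface": "140g"}
    return json.loads(reply)
```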
[00151] In another example, mobile or remote users 114g with mobile devices 106a may log in to join the communication session via the mobile application installed on their mobile device 106a that is configured to collaborate with the users 114a-114f at the multi-touch platform 102a. The mobile application may be configured to display a possible connection to the multi-touch platform 102a and/or the communication session associated with the multi-touch platform 102a (e.g. the mobile application may search the network for communication sessions and/or a multi-touch platform 102a that is available for use in a communication session). The mobile user 114g may then select the multi-touch platform 102a and/or the communication session and join the communication session using login details (e.g. login credentials for the communication session etc.).
[00152] The mobile user 114g may use the mobile application to connect to the multi-touch platform 102a (e.g. by typing the IP address of the multi-touch platform 102a and/or the login credentials etc.). As described previously, a handshake protocol may be performed where the mobile application sends a request, which may include the mobile IP address or subscriber identity number etc. and login credentials etc., to connect to the multi-touch platform 102a. The multi-touch platform 102a accepts the request based on the login credentials etc., and associates the mobile IP address with the mobile user 114g and displays a user interface 140g in the common workspace 109a, which may be an icon or image associated with the mobile user. The user interface 140g associated with the mobile user 114g may be used by users 114a-114f at the multi-touch platform 102a to transfer one or more data items (e.g. files and/or data etc.) to the mobile user 114g or to receive one or more data items from that mobile user 114g.
[00153] The server 104 may be configured to store, upload, and distribute the set of data items or further data items to the multi-touch platform 102a when the first user 114a begins a communications session on the multi-touch platform 102a. The server 104 is also configured to receive and save manipulated files from the multi-touch platform 102a and/or the other devices or platforms 102b and 106a-106c for storage. The mobile device 106a is configured to receive and manipulate files, and transfer files using the "swish" gesture to the multi-touch platform 102a or server 104.
[00154] Although the multi-touch platform 102a may communicate directly and/or wirelessly with the multi-touch platform 102b and/or mobile device(s) 106a, and wearable devices 106c etc., it is to be appreciated by the skilled person that the multi-touch platform 102a may communicate with the multi-touch platform 102b and/or mobile device(s) 106a via the server 104. For example, the server 104 may be configured to provide a communication path or connect the multi-touch platform 102a with the multi-touch platform 102b, which can be used when transferring one or more data items between users or between multi-touch platforms.
[00155] As another example, the server 104 may be configured to provide a communication path or connect the multi-touch platform 102a with one or more mobile devices 106a of the plurality of users that have joined the communication session, which can also be used when transferring one or more files between users or between multi-touch platform 102a, 102b and the mobile devices 106a. Additionally, the mobile devices 106a may wirelessly communicate with the server 104 and/or the multi-touch platforms 102a and/or 102b (e.g. via Wi-Fi or a telecommunications network connection and the like).
[00156] Once users have logged onto the multi-touch platform 102a, they may retrieve one or more data items from a set of data items stored at the server 104 or multi-touch platform 102a. The set of data items may be uploaded in response to the first user 114a logging into the multi-touch platform 102a when initiating the communication session, or uploaded when a subsequent user logs onto the multi-touch platform 102a due to a higher level of clearance that subsequent user may have in relation to the set of data items. The set of data items may be accessible using the user interface 140a-140f (e.g. via a file manager). Further data items may be uploaded by other users for sharing in the communication session.
[00157] Users may each access the same data items from their user interfaces 140a-140f. For example, to avoid data corruption when multiple users access the same data item, the multi-touch platform 102a may be configured to automatically assign a copy of the data item to each user 114a-114f and append one or more of the date, time and/or user identity to the filename of each copy of the data item. This allows users to access and modify the same data item without corrupting the original data item or being locked out of a data item while another user works on the same data item. Other examples of users working on the same one or more data items (e.g. files and/or documents etc.) are provided herein.
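One possible way to produce per-user copies whose names carry the user identity and a timestamp is sketched below; the naming pattern and directory layout are illustrative assumptions rather than the method actually used by the platform.

```python
import shutil
from datetime import datetime
from pathlib import Path

def assign_user_copy(original: Path, user_id: str, work_dir: Path) -> Path:
    """Create a per-user working copy whose name carries the user identity and a timestamp,
    so the original data item is never modified directly."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    copy_path = work_dir / f"{original.stem}.{user_id}.{stamp}{original.suffix}"
    shutil.copy2(original, copy_path)
    return copy_path

# e.g. assign_user_copy(Path("report.pdf"), "user114a", Path("/tmp"))
#      -> /tmp/report.user114a.20240101-120000.pdf
```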
[00158] Figure 1b(iii) is a schematic diagram of the multi-touch platform 102a of figure 1b(i) showing an example procedure for a user 114e to change their position around the multi-touch platform 102a. In this example, users 114a, 114e and 114d have logged into the multi-touch platform 102a and have been allocated user interfaces 140a, 140e and 140d, respectively. At some stage in the communication session users 114e and 114d may need to work closer together in the common workspace 109a (e.g. they may be working or analysing a common data item or have been informed to work together to solve a particular problem/issue etc.). In this way, a group of users 114e and 114d may work together on a task at the multi-touch platform 102a, while other groups of users 114a may work on other tasks etc. That is, the group of users 114e and 114d may form a first huddle of users at a location or one end of the multi-touch platform 102a.
[00159] In this case, user 114e changes their position at the multi-touch platform 102a by inputting a touch screen gesture using an object 111e (e.g. the user's hand or a stylus or other input device) to touch their user interface 140e in the common workspace 109a in the vicinity of point A and drag the object 111e, whilst still touching the touch screen 108a, from position A to position B along path C in the common workspace 109a. The user interface 140e may be displayed as being dragged along with the object 111e. When the user 114e releases their object 111e from the touch screen 108a and hence the common workspace 109a, the user interface 140e is re-positioned in the vicinity of position B.
[00160] In figure 1b(iii), the new position of user interface 140e is shown by dotted circle D in the vicinity of position B. The multi-touch platform 102a updates the association between the user interface 140e and the identity of the user 114e by updating the previous position of the user interface 140e with the new position (e.g. position B). This allows the multi-touch platform 102a to track the position of a user 114a-114g and, in this example, the position of user 114e may be determined to assist the multi-touch platform 102a in determining where to display one or more data items that may be transferred to/from the user 114e and other users in the communication session at the multi-touch platform 102a.
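As an illustrative sketch only, a registry of user-interface positions might be updated when a user drags their interface to a new location and later queried when deciding where to display transferred data items. The class and method names are invented for this example.

```python
from typing import Dict, Tuple

class InterfaceRegistry:
    """Maps each user identity to the current position of their user interface."""

    def __init__(self) -> None:
        self.positions: Dict[str, Tuple[float, float]] = {}

    def register(self, user_id: str, position: Tuple[float, float]) -> None:
        self.positions[user_id] = position

    def move(self, user_id: str, new_position: Tuple[float, float]) -> None:
        # Called when a user drags their interface from position A to position B.
        if user_id not in self.positions:
            raise KeyError(f"unknown user {user_id}")
        self.positions[user_id] = new_position

    def nearest_user(self, point: Tuple[float, float]) -> str:
        # Used when deciding where to display data items transferred to or from a user.
        return min(self.positions, key=lambda uid:
                   (self.positions[uid][0] - point[0]) ** 2 +
                   (self.positions[uid][1] - point[1]) ** 2)
```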
[00161] As previously described with reference to figures 1a, 1b(i)-1b(iii), the position of the user interface 140a-140g in the common workspace 109a allows the multi-touch platform 102a to determine and/or identify the users and where the users are located around the common workspace 109a and/or within the system 100 to allow one or more data items (e.g. files or data) to be transferred from one user to another user at the multi-touch platform 102a, other multi-touch platform 102b, mobile device 106a, and/or wearable device 106c. For example, one or more data items may be transferred from one user's position, which is indicated by the position of the user interface 140a-140g, at the multi-touch platform 102a to another user's position indicated by the multi-touch platform 102a using a touch screen gesture (e.g. a "swish" type touch screen gesture or a "drag" type touch screen gesture for transferring files) that is determined to be directed towards the other user or the other user's user interface 140a-140g.
[00162] Similarly, in another example, a remote user 114g may log into the multi-touch platform 102a remotely via a wearable device 106c using near-far communications. This may mean that this user is in close proximity to the multi-touch platform 102a, but may also need to be allocated a location or position at the multi-touch platform 102a to show other users 114a-114f of the multi-touch platform 102a that this user is present and has joined the communication session. In this example, the wearable device 106c provides the user credentials via near-far communications to the multi-touch platform 102a. The multi-touch platform 102a may determine and store a user identity associated with the user 114g based on the user's login credentials provided by the wearable device 106c. The multi-touch platform 102a may also associate or register the user identity with a user interface 140g allocated to the user 114g, which may, as previously described, display an indicator or icon at a location in the common workspace 109a that is allocated by the multi-touch platform 102a to that user 114g. This allows the multi-touch platform 102a to transfer one or more files from one user to another user at the multi-touch platform 102a.
[00163] In another example, users may remotely log into the multi-touch platform 102a to join the communication session via a mobile device 106a. In this example, the multi-touch platform 102a may determine and store the user identity based on user credentials provided by the mobile device 106a. The multi-touch platform 102a may also determine the address of the mobile device 106a, which the multi-touch platform 102a may use for communicating with the mobile device 106a. The multi-touch platform 102a may use this information for identifying the user 114g and for transferring one or more data items (e.g. files) to/from the user 114g of the mobile device 106a. The address of the mobile device 106a may include, but is not limited to, an Internet Protocol address, mobile subscriber number, or any other address or number identifying the mobile device 106a over the communication network and allowing another device or multi-touch platform 102a to communicate with the mobile device 106a. The multi-touch platform 102a also associates or registers the address of the mobile device 106a with the user identity allowing the multi-touch platform 102a to transfer one or more files from one user in the communications session to another user of the mobile device 106a and vice versa. The multi-touch platform 102a may allocate a user interface 140g to the user 114g and also associate or register the user identity with the user interface 140g, which is displayed at a location in the common workspace 109a designated by the multi-touch platform 102a. This allows users at the multi-touch platform 102a to transfer data items via the user interface 140g of the user 114g to the user 114g of the mobile device 106a.
[00164] For example, the mobile device 106a may include a multi-touch platform mobile application (e.g. MTP mobile app) that configures the mobile device 106a to allow the user 114g of the mobile device 106a to remotely join the communication session and remotely work on the one or more data items (e.g. files) associated with the communication session. The MTP mobile app is used to communicate with the multi-touch platform 102a and allow the user 114g or the MTP mobile app to enter or provide the user's login credentials remotely to the multi-touch platform 102a via the communications network. The mobile device 106a may provide the address of the mobile device 106a (e.g. IP address or other address) in the network that allows the multi-touch platform 102a to transfer one or more data items (e.g. files) via the communication network to the user 114g of the mobile device 106a and allow the user to work on the one or more data items (e.g. files), communicate with other users and contribute to or collaborate with users in the communication session. The multi-touch platform 102a stores a user identity associated with the user 114g based on the user's login credentials provided by the mobile device 106a as well as the address of the mobile device 106a at the multi-touch platform 102a. The multi-touch platform 102a associates or registers the user identity with a user interface 140g displayed at a location in the common workspace 109a of the multi-touch platform 102a. The user interface 140g may include an image of the user and notification panel or window indicating the status of the user 114g. This also allows users at the multi-touch platform 102a to transfer data items via the user interface 140g associated with user 114g to the mobile device 106a of user 114g.
[00165] In another scenario, another user may remotely log in and join the communication session via another multi-touch platform 102b using one or more of the above mechanisms or techniques for logging into the communication session via the multi-touch platform 102b. In this scenario, the other multi-touch platform 102b may determine and store a user identity based on the user credentials, and also associate a user interface or address of a mobile device to the user identity of the remote user. The other multi-touch platform 102b may send this user information including the user identity, user interface and/or address information to the multi-touch platform 102a for storing. The multi-touch platform 102a may also associate the address of the other multi-touch platform 102b with the user information and user identity.
This allows the multi-touch platform 102a to either communicate directly with the user or indirectly via the other multi-touch platform 102b with the user. The user information can be used by the multi-touch platform 102b for transferring one or more files to the user that manually or remotely logs into the communication session via the other multi-touch platform 102b.
[00166] For example, the multi-touch platform 102a may display a user interface in the common workspace 109a that displays multi-touch platform 102b and/or the users connected to the communication session via the multi-touch platform 102b. The user interface displayed in the common workspace 109a may display, by way of example, an icon representing the multi-touch platform 102b and, when requested, display a list of one or more users connected to the communication session via the multi-touch platform 102b. This may allow users 114a-114f at multi-touch platform 102a to transfer data items via the user interface to a particular user connected to the communication session via multi-touch platform 102b or to transfer data items via the user interface to the multi-touch platform 102b.
[00167] When transferring data items from a first user to a second user at the multi-touch platform 102a, if the second user is positioned at a location around the periphery of the common workspace 109a of the multi-touch platform 102a (e.g. a multi-touch table), then the one or more data items may be displayed on the display device 108a as moving or skimming across the common workspace 109a and the touch screen of the multi-touch platform 102a towards the position of the second user in the common workspace 109a. If the second user is using a mobile device 106a external to the multi-touch platform 102a, then the one or more data items (e.g. files) may be displayed moving off the common workspace 109a of the multi-touch platform 102a, and when transferred to the mobile device 106a, may be displayed moving across the screen of the mobile device 106a. If the second user is located at another multi-touch platform 102b, then the one or more data items (e.g. files) may be displayed moving or skimming across and off the common workspace 109a on the touch screen of the multi-touch platform 102a, and subsequently, when the data items are transferred to the other multi-touch platform 102b, being displayed moving onto, and moving or skimming across the common workspace 109b on the touch screen of the other multi-touch platform 102b towards the location or position of the second user in the common workspace 109b.
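The skimming behaviour described above could, for example, be approximated by interpolating the data item's position towards the recipient's interface, or towards the nearest workspace edge when the recipient is on a mobile device or another platform. The step count and edge-selection rule below are assumptions made for this sketch.

```python
from typing import Iterator, List, Tuple

Point = Tuple[float, float]

def animation_path(start: Point, end: Point, steps: int = 30) -> Iterator[Point]:
    """Yield intermediate positions for a data item skimming across the workspace."""
    for i in range(1, steps + 1):
        t = i / steps
        yield (start[0] + (end[0] - start[0]) * t,
               start[1] + (end[1] - start[1]) * t)

def transfer_animation(item_pos: Point, target_pos: Point, local: bool,
                       workspace_size: Tuple[float, float]) -> List[Point]:
    """Animate towards a local recipient's interface, or off the nearest horizontal
    edge when the recipient is on a mobile device or another platform (an assumption)."""
    if local:
        return list(animation_path(item_pos, target_pos))
    width, _height = workspace_size
    exit_point = (0.0 if item_pos[0] < width / 2 else width, item_pos[1])
    return list(animation_path(item_pos, exit_point))
```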
[00168] Figure 1b(iv) is a schematic diagram illustrating an example multi-touch platform 102a in which the common workspace 109a displayed on display device 108a may be divided into a plurality of zones comprising user interfaces 140a-140f, which also include corresponding user work spaces 112a-112f.
[00169] In this example, there is a user work space 112a-112f for each user at the multi-touch platform 102a. The common workspace 109a on the platform 102a allows all users to access associated data items (e.g. files and/or data) located in the common workspace 109a. The user interfaces 140a-140f and user workspaces 112a-112f may be an area of the touch screen display device 108a overlaid over the common workspace 109a and/or within the common workspace 109a that can be allocated to the user. Alternatively, the common workspace 109a may be configured to be of a shape in which the user interfaces 140a-140f and/or user work spaces 112a-112f are displayed outside and on the periphery of the common workspace 109a (e.g. the common workspace 109a has a "jigsaw" shape, where each user interface 140a-140f and/or user work space 112a-112f are positioned in the "notches" of the common workspace 109a).
[00170] The user work space 112a-112f of each user interface 140a-140f may simply be a notification interface or dashboard for monitoring status of the user, communications with other users, and/or may include or comprise further functionality that allows the user to "privately" work on, control, monitor and/or access one or more data items (e.g. files or data etc.) and/or communicate with other users during the communication session (e.g. via audio/video such as video conference call or other means). It is to be appreciated by the skilled person that these functions may also be performed by the users 114a-114f in the common workspace 109a without the need for user workspaces 112a-112f.
[00171] In any event, each user interface 140a-140f may be used for accessing and working on one or more data items during the communication session within the user workspace 112a-112f and/or the common workspace 109a. For example, the first user 114a may insert data items they wish to be shared and manipulated by the other users 114b-114g into the common workspace 109a. Using the user workspace 112a-112f, these data items may be accessible via the user interface 140a-140f using a file manager, third party application software, file editor or viewer (e.g. pdf editor, video/audio player etc.). Additionally or alternatively, when data items are within a user workspace 112a-112f, the data items may not be accessible or may be locked for use by other users of the multi-touch platform 102a, as sketched below.
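A minimal sketch of the optional locking behaviour mentioned above, assuming each data item is identified by an id and held by at most one user at a time; the class and method names are illustrative only.

```python
from typing import Dict, Optional

class DataItemLocks:
    """Locks a data item while it sits inside a user's private workspace."""

    def __init__(self) -> None:
        self.locks: Dict[str, str] = {}      # item_id -> user_id currently holding the item

    def move_into_workspace(self, item_id: str, user_id: str) -> bool:
        holder: Optional[str] = self.locks.get(item_id)
        if holder is not None and holder != user_id:
            return False                     # another user already holds the item
        self.locks[item_id] = user_id
        return True

    def return_to_common_workspace(self, item_id: str, user_id: str) -> None:
        if self.locks.get(item_id) == user_id:
            del self.locks[item_id]
```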
[00172] Users 114a-114g may use a "grab" gesture to retrieve a version or copy of a data item from the common workspace 109a, for manipulation within their assigned user workspace 112a-112f or device(s) 106b or user workspaces of another multi-touch platform 102b. Annotations may also be made to data items (e.g. files or documents) in the common workspace 109a by one or more of the users. The users may collaborate together and generate a report file or finalised file in the common workspace 109a.
[00173] Figure 1b(v) is another schematic diagram showing another multi-touch platform 102a with a touch screen interface 108a and a common workspace 109a around which are placed user interfaces 140a-140f. In this example, the common workspace 109a is arranged to be centred on the touch screen 108a with the user interfaces 140a-140f and work spaces 112a- 112f surrounding the periphery of the common workspace 109a such that they are displayed on the touch screen interface 108a outside of and around the periphery of the common workspace 109a. Each user interface 140a-140f includes a user work space 112a-112f as described with reference to figure 1b(iv).
[00174] Figure 1b(vi) is a schematic diagram showing a further example multi-touch platform 102a for system 100 in which the multi-touch platform 102a designates locations to users at the multi-touch platform 102a using indicators 110a-110f. In this example, the indicators 110a-110f notify users where they are to be positioned around the multi-touch platform 102a. Each indicator 110a-110f may be a light or light emitting diode indicating to a user, when the user logs into the multi-touch platform 102a, where they should stand or are being positioned around the common workspace 109a by the multi-touch platform 102a. In this case, the multi-touch platform 102a may further display a user interface 140a-140f when each user logs into the multi-touch platform 102a.
[00175] Figure 1c is a schematic diagram showing an example user interface 140 for the example multi-touch platform 102a of figures 1b(i)-1b(iii). As described with reference to figures 1b(i)-1b(iii), the user interface 140 may be displayed in the common workspace 109a of the multi-touch platform 102a at a position selected by the user and/or a position designated by the multi-touch platform 102a. The user interface 140 is allocated to each user positioned at the multi-touch platform 102a for accessing applications and the set of data items. The applications may be used simultaneously by the users in the common workspace 109a when working on one or more data items of the set of data items.
[00176] The user interface 140 for each user positioned around the multi-touch platform 102a is displayed in the common workspace 109a. The user interface may be allocated to the user and displayed in the common workspace 109a at the designated or selected location at the time the user logs onto the multi-touch platform 102a. The user interface may include a menu interface for allowing the user to access a set of applications and/or third party software and/or at least one data item from the set of data items for working on said at least one data item simultaneously using an application in the common workspace 109a. The user interface is a minimalist interface that is displayed in a position in the common workspace 109a indicating where the user should be positioned at the multi-touch platform 102a, and it efficiently provides access to applications and at least one data item whilst remaining unobtrusive while users are working on the at least one data item.
[00177] The user interface 140 includes a menu interface with one or more menu items that may be accessible to the user using touch screen gestures allowing the user, once logged into the multi-touch platform 102a, to work on one or more files during the communication session. In this example, the user interface 140 includes a menu interface with a set of five menu items (or nodes) 142, 144, 146, 148 and 150 for working on one or more files via, by way of example only but not limited to, a file manager, third party application software, file editor or viewer (e.g. pdf editor, image viewer/editor, video/audio player etc.). A menu item may be selected to reveal a further set of sub-menu items (e.g. sister nodes) associated with the menu item and application(s) associated with the menu item.
[00178] A "1-finger touch gesture may be used to select and activate one of the menu items 142-150, where the "1-finger" touch gesture includes touching the menu item 142-150 displayed by the touch screen 108a with an object of the user (e.g. stylus or finger, etc.) and releasing the object from the touch screen 108a. Although a set of five menu items 142-140 are described, it is to be appreciated by the skilled person that a set of more or less menu items may be displayed in any configuration or order for use in the user work space interface 140.
[00179] In this example, the set of five menu items 142-150 is displayed in a circular fashion.
The first menu item 142 is for activating an image viewer/editor. A user may use a "1-finger" touch gesture to select the first menu item 142, represented as a camera icon, which configures the multi-touch platform 102a to activate an image viewer/editor for working on data items such as image files. Images may be retrieved from local storage on the multi-touch platform 102a or via a server storage, storage device, or mobile device accessible by the user. The user may use a "1-finger" touch gesture to select the second menu item 144 represented by a video player icon (e.g. YouTube (RTM)), which may configure the multi-touch platform 102a to play a displayed data item such as a video file or allow the user to open and play a data item such as a video file in a video player embedded in a browser window 143.
[00180] The third menu item 146 is for activating third party software that may be installed on the multi-touch platform 102a. A user may select the third menu item 146 to reveal a set of sub-menu items (or sister nodes) associated with one or more third party software or application(s) 147, which may be selected to activate the respective third party software or application. For example, a user may use a "1-finger" touch gesture to select the third menu item 146 to reveal a sub-menu item(s) of third party software 147 available for selection. The multi-touch platform 102a may be configured as a thin client with third party software 147 pre-installed on the multi-touch platform 102a or on a server, cloud servers and/or as a cloud service and/or storage. The user may use a further "1-finger" touch gesture to select the third party software 147 or application associated with a sub-menu item.
[00181] The fourth menu item 148 is for activating a file manager 149 on the multi-touch platform to display one or more data items (e.g. files or digital data) accessible to the user of the corresponding user work space 112a. The one or more data items may be locally stored on the multi-touch platform 102a, stored via a network in a server or in cloud storage, or storage on portable storage or a mobile device associated with the user. In addition to using touch screen gestures as described with reference to figures 1a-1b(vi), the user may upload one or more further data items via the file manager 149, which may be made accessible to other users during the communication session. The fifth menu item 150 is for activating, in this example, a portable document format (PDF) reader/annotation software for reading/annotating one or more pdf files. The fifth menu item 150 may include one or more sub-menu items associated with opening and working on different document or data types. A user may select the fifth menu item 150 to reveal a set of sub-menu items (or sister nodes) associated with working on one or more different data items, documents or data types, which may be selected to activate the respective program for opening and working on one or more particular data items, documents or data types.
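By way of example only, a menu interface with items and sub-menu items of the kind described above could be represented as a simple tree structure. The labels, actions and third party entries below are placeholders, not the actual menu of any embodiment.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Union

@dataclass
class MenuItem:
    label: str
    action: Optional[str] = None                     # e.g. "open_image_viewer" (illustrative)
    children: List["MenuItem"] = field(default_factory=list)

# An illustrative five-item menu along the lines described above.
user_menu = [
    MenuItem("Image viewer", "open_image_viewer"),
    MenuItem("Video player", "open_video_player"),
    MenuItem("Third party software", children=[
        MenuItem("Spreadsheet", "launch_spreadsheet"),
        MenuItem("CAD viewer", "launch_cad_viewer"),
    ]),
    MenuItem("File manager", "open_file_manager"),
    MenuItem("Document reader", children=[
        MenuItem("PDF reader/annotator", "open_pdf_reader"),
    ]),
]

def select(item: MenuItem) -> Union[List[MenuItem], str, None]:
    """A '1-finger' touch on a menu item either reveals its sub-menu items or returns its action."""
    return item.children if item.children else item.action
```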
[00182] In other examples, the user interface may be configured as a dashboard style interface or one or more notification icons or images for notifying the user of new data items, messages, emails, conference calls, for instant messaging etc. In these examples, the user interface may be a "stripped" down interface that still allows users to access applications and one or more data items but requires the users to simultaneously use the common workspace 109a for working on one or more data items.
[00183] As described above with reference to figure 1b(iv), the user interface 140 may also further include a user workspace (not shown). For example, the user workspace may be a notification interface or dashboard for displaying messages, emails, and status indicators such as the monitoring status of the user, communications with other users, incoming communications or other notifications, and video conference calling. Alternatively or additionally, the user workspace may include or comprise further functionality that allows the user to "privately" work on, control, monitor and/or access one or more applications and one or more data items (e.g. files or data etc.) and/or communicate with other users during the communication session (e.g. via audio/video such as video conference call or other means). However, it is to be appreciated by the skilled person that these functions may also be performed by the users in the common workspace without the need for a user workspace.
[00184] Figures 1d(i) and (ii) are schematic diagrams showing an example multi-touch platform 102a operating during an example communication session between two users collaborating with each other, e.g. a first user 114a and a second user 114b. Referring to Figure 1d(i), the first user 114a is positioned at a location near the periphery of the common workspace 109a that is indicated by the position of user interface 140a, and the second user 114b is positioned at a location near the periphery of the common workspace 109a that is indicated by the position of user interface 140b.
[00185] When the first user 114a starts the communication session, the multi-touch platform 102a may allocate a designated location for the first user 114a and indicates this to first user 114a by displaying the user interface 140a in the common workspace 109a in the vicinity of the designated location. Alternatively or additionally, the user 114a may select a location, which the multi-touch platform 102a indicates to the user by displaying the user interface 140a in the common workspace 109a in the vicinity of the selected location. The server 104 is configured to upload a first set of data items (e.g. files) 116a-116c associated with the first user 114a to the multi-touch platform 102a. The second user 114b can also log on to the multi-touch platform 102a, where, if the user 114b has a mobile device 106b, then user 114b can register the mobile device 106b with the multi-touch platform 102a. The multi-touch platform 102a also may allocate a designated location for the second user 114b and indicates this to the second user 114b by displaying the user interface 140b in the common workspace 109a in the vicinity of the designated location. Alternatively or additionally, the user 114b may select a location, which the multi-touch platform 102a indicates to the user by displaying the user interface 140b in the common workspace 109a in the vicinity of the selected location.
[00186] The users 114a-114b can collaborate and work on (e.g. read/manipulate/ annotate/amend) the associated data items (e.g. files) 116a-116c uploaded by the first user 114a in the common workspace 109a. In addition, the second user 114b may also upload further data items (not shown) to the multi-touch platform 102a and/or server 104 for use during the communication session. During this process, the users 114a-114b may also generate or create via the common workspace 109a, by way of example, a data item 116d such as a report file, or modify one of the uploaded data items 116a-116c to generate the data item 116d (e.g. a report file). The multiple users 114a and 114b simultaneously use the common workspace 109a to work on one or more of the uploaded data items. The multiple users 114a and 114b may simultaneously use the common workspace 109a to work on the same data item or different data items of the one or more data items.
[00187] For example, the multi-touch platform 102a may be configured to detect whether multiple users of the plurality of users, e.g. the first user 114a and the second user 114b, are to work on one of the data items 116a-116d simultaneously, or substantially simultaneously. That is, as each of the first and second users 114a and 114b can work simultaneously on the common workspace 109a, each of the users 114a and 114b may wish to have access to the same data item (e.g. file 116a) at the same time, but may wish to work on the data item separately and/or at different times. Although allowing users to concurrently work on the same data item may speed up collaboration, as users do not have to wait for other users to finish with the data item, there may be a risk of corrupting the original data item.
[00188] Given this, the multi-touch platform 102a may be configured to provide each of the multiple users 114a and 114b with a copy of the data item 116a. This may enhance the speed of collaboration by allowing users to work on the data item 116a in the common workspace 109a concurrently, while at the same time the users 114a and 114b may discuss the changes both may like to make to the same or different parts of the data item 116a. This may not only result in an earlier agreement on the changes to make to data item 116a to generate the data item 116d (e.g. report file or finalised file), but may also speed up modifying each data item and thus speed up generation of the data item 116d (e.g. report file or finalised file). This is because users 114a and 114b are able to work simultaneously in the common workspace 109a and concurrently modify the data item by working on their respective copies of it.
[00189] The users 114a and 114b may then work on their corresponding copies of the data item 116a and, when finished, merge their changes to generate data item 116d or some other updated finalised data item. Should at least one of the users 114a and 114b use a mobile device 106a to work on a data item, then that user may transfer the data item back to the common workspace 109a of the multi-touch platform 102a before the generated data item 116d is finalised.
[00190] Alternatively or additionally, the multi-touch platform 102a may be configured to receive the one or more copies of the data item worked on by the multiple users 114a and 114b, and to determine the changes between each copy of the data item and the original data item 116a. On determining the changes, the multi-touch platform 102a may generate a temporary data item by merging the original data item 116a with the changes made to the one or more copies of the data item. The multi-touch platform 102a may then be configured to verify with users 114a and 114b whether the changes are acceptable prior to finalising the temporary data item as a final data item 116d (e.g. a report file or other finalised file) for saving by server 104 and/or sending to another party or user etc.
[00191] In another example, the multi-touch platform 102a may be configured to detect whether multiple users 114a and 114b of the plurality of users, e.g. first user 114a and second user 114b, are to work on a data item (e.g. file 116a) of the plurality of files 116a-116d simultaneously, or substantially simultaneously. That is, each user 114a and 114b may wish to have access to the same data item (e.g. file 116a) at the same time, but may wish to work on the data item separately, at the same time, or at different start times such that the periods during which they work on the data item overlap. Although allowing users to concurrently work on the same data item may speed up collaboration, as users do not have to wait for other users to finish with the data item, there may be a risk of corrupting the original data item.
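By way of a non-limiting illustration only, the following Python sketch shows one way the changes in each user's copy could be determined against the original data item and merged into a single temporary item for the users to verify, as described in paragraph [00190] above. The function names, the use of line-based text diffs and the "first recorded edit wins" conflict rule are assumptions made for the sketch and are not features of the described platform.

    import difflib
    from typing import Dict, List, Tuple

    def merge_copies(original: str, copies: List[str]) -> Tuple[str, List[List[str]]]:
        """Merge the changes each user made to their copy back into the original.

        Each copy is diffed against the same original; the first copy that
        modified a given span of the original wins if edits overlap.  The
        per-copy diffs are also returned so the platform can ask the users
        to verify the changes before the temporary item is finalised."""
        orig_lines = original.splitlines()
        # Maps the start index of a modified span of the original to the end
        # index of that span and the replacement lines taken from a copy.
        edits: Dict[int, Tuple[int, List[str]]] = {}
        diffs: List[List[str]] = []

        for copy_text in copies:
            copy_lines = copy_text.splitlines()
            diffs.append(list(difflib.unified_diff(
                orig_lines, copy_lines, fromfile="original", tofile="copy",
                lineterm="")))
            matcher = difflib.SequenceMatcher(None, orig_lines, copy_lines)
            for tag, i1, i2, j1, j2 in matcher.get_opcodes():
                if tag != "equal" and i1 not in edits:
                    edits[i1] = (i2, copy_lines[j1:j2])

        merged: List[str] = []
        i = 0
        while i <= len(orig_lines):
            if i in edits:
                end, new_lines = edits.pop(i)
                merged.extend(new_lines)
                if end > i:
                    i = end   # skip the span of the original that was replaced
                    continue
            if i < len(orig_lines):
                merged.append(orig_lines[i])
            i += 1
        return "\n".join(merged), diffs

For example, merge_copies(original_116a, [copy_114a, copy_114b]) would return the text of a candidate data item 116d together with the per-user change lists that could be shown to users 114a and 114b for verification.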
[00192] Given this risk, the multi-touch platform 102a may only allow each of the multiple users 114a and 114b to work on the data item serially. This may enhance collaboration because the users 114a and 114b are prompted to discuss the changes both would make, and thus reach earlier agreement on the finalised data item (e.g. report or finalised file), while at the same time ensuring the original data item 116a is not corrupted. The multi-touch platform 102a may then be configured to verify with users 114a and 114b whether the changes are acceptable to the multiple users 114a and 114b prior to finalising a temporary data item as a data item 116d (e.g. report file or other finalised file).
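A minimal sketch, assuming a simple check-out/check-in model, of how such serial access to a data item might be enforced; the class and method names are illustrative only and are not defined by the platform described herein.

    import threading
    from typing import Dict, Optional

    class SerialAccessController:
        """Grants at most one user at a time write access to a given data item,
        so that users such as 114a and 114b work on the item serially."""

        def __init__(self) -> None:
            self._lock = threading.Lock()
            self._holders: Dict[str, str] = {}   # data item id -> user id

        def check_out(self, item_id: str, user_id: str) -> bool:
            """Return True if the user now holds the item for editing,
            or False if another user is still working on it."""
            with self._lock:
                holder = self._holders.get(item_id)
                if holder is None or holder == user_id:
                    self._holders[item_id] = user_id
                    return True
                return False

        def check_in(self, item_id: str, user_id: str) -> None:
            """Release the item so that the next user may work on it."""
            with self._lock:
                if self._holders.get(item_id) == user_id:
                    del self._holders[item_id]

For example, check_out("116a", "114b") would fail while user 114a still holds data item 116a, and succeed once user 114a has called check_in.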
[00193] Although the multi-touch platform 102a may locally store one or more data items (e.g. documents, files and/or digital data) for each of the multiple users 114a and 114b to work on in the common workspace 109a, it is to be appreciated by the skilled person that the data items may be stored in one or more databases, servers or other computer systems and storage units that may communicate with the multi-touch platform 102a. For example, the one or more data items may be stored in a cloud-based storage system accessible by the multi-touch platform 102a, in which the digital data representing the data items worked on in the common workspace 109a by the one or more users 114a-114f at the multi-touch platform 102a is stored in logical storage pools, which may include one or more centralised and/or distributed servers and computing systems for storing the digital data.
[00194] Referring to figure 1d(ii), the display device 108a (e.g. touch screen or touch screen interface) of the multi-touch platform 102a (e.g. multi-sharing table) includes a graphical user interface (GUI) that displays the common workspace 109a and also allows the users 114a-114b around the multi-touch platform 102a to simultaneously work in the common workspace 109a to manipulate the orientation of data items (e.g. information and/or files) that they are reading with a simple hand gesture or touch screen gesture (e.g. a "twist" or "grab" type gesture for rotating files to an orientation suitable for the user) that does not require pinning the object or specialist programming, and that does not disturb other users.
[00195] As described with reference to figures 1b(i)-(iii), the multi-touch platform 102a may operate with multiple user interfaces 140a-140f, one for each user. The multi-touch platform 102a may configure the user interfaces 140a-140f to allow users to work along any surface or orientation of the display device 108a on a horizontal multi-touch-platform 102a (e.g. anywhere within the common workspace 109a displayed on the touch screen display device 108a). At least one of the multiple users of the plurality of users, e.g. user 114b, may have a mobile device 106a such that data items may be transferred from the multi-touch platform 102a to the mobile device 106a for further working on by the user 114b. At least one of the multiple users may be located at another multi-touch platform 102b and may wish to work on one or more data items at multi-touch platform 102a. The transfer of at least one data item to the mobile device 106a or to another multi-touch platform 102b may be initiated using a gesture input or touch screen gesture input to the display device 108a of the multi-touch platform 102a (e.g. a "swish" gesture) that indicates a transfer of at least one data item (e.g. one or more file(s)) to the associated device 106b or table 102b is requested.
[00196] During the communication session, each of the data items 116a-116d can be grabbed, transferred or copied and manipulated by user 114a or a mobile device 106b of user 114b, or another multi-touch platform (not shown) with one or more other users (not shown) in which those users may also work on one or more of the data items 116a-116d and collaborate with the users on the multi-touch platform 102a. In this example, the first user 114a performs a gesture for "grabbing" the data item 116a for orienting the data item in line with the position of user 114a. Additionally or alternatively, users may use the "grab" gesture to retrieve a version or copy of a data item 116a-116d from the common workspace 109a of the multi-touch platform 102a for working on and/or manipulation within a user workspace(s) of a mobile device(s) 106a, or common workspace 109b of another multi-touch platform 102b. Annotations may also be made to data items in the common workspace 109a by one or more of the users when collaborating together.
[00197] As described with reference to figures 1b(iv)-(v), in another example, users may use the "grab" gesture to retrieve a version or copy of a data item 116a-116d from the common workspace 109a, for working on and/or manipulation within their own user workspace (e.g. user workspaces 112a-112f of figures 1b(iv)-(v)) or device(s) 106a or user workspaces of another multi-touch platform 102b. Annotations may also be made to data items in the common workspace 109a by one or more of the users when collaborating together.
[00198] In the example shown in figure 1d(ii), the first user 114a performs a touch screen gesture for "grabbing" the data item 116a in the common workspace 109a to orient the data item 116a in line with the position of the first user 114a. On detecting the touch screen gesture representative of "grabbing" the data item 116a, multi-touch platform 102a is configured to rotate and orient the data item 116a to allow user 114a to read and/or work on and/or manipulate data item 116a in the common workspace 109a of the multi-touch platform 102a. The second user 114b may also perform a touch screen gesture for transferring the data item 116b to the mobile device 106b of user 114b. In this case, the second user 114b performs a "swish" touch screen gesture for indicating a transfer of the data item 116b to the mobile device 106b of user 114b. The multi-touch platform 102a, on detecting that the "swish" touch screen gesture of user 114b is directed towards user 114b (e.g. the multi-touch platform detects that the "swish" gesture of user 114b is directed towards the vicinity of user interface 140b of the second user 114b), transfers the data item 116b to the mobile device 106b registered by the second user 114b. On receiving the data item 116b, user 114b may read and work on and/or manipulate the data item 116b. User 114b may then, with another "swish" touch screen gesture input to the touch screen of the mobile device 106b, transfer the data item 116b, which may have been modified or worked on by user 114b, to the multi-touch platform 102a as data item 116c.
[00199] The "swish" touch screen gesture may be defined by an object of the second user 114b (e.g. a stylus or a finger of the second user 114b) touching the touch screen 108a of the multi-touch platform 102a or the mobile device 106b where the file is displayed and then moving or dragging the object of the second user 114b across the touch screen 108a, whilst the object still touches the touch screen 108a, in a swift "swish" action or a rapid movement of the object in the direction associated with where the second user 114b would like the data item 116b to be transferred, with the gesture ending on removal of the object from the touch screen 108a. The gesture is performed as a continuous action. The multi-touch platform 102a then determines the direction of the gesture and identifies the likely user that the gesture is directed toward before transferring the file.
[00200] As an example, the multi-touch platform 102a may detect that the data item 116b is being worked on in the common workspace 109a in the vicinity of user interface 140b. Thus, the multi-touch platform 102a may determine that user 114b is working on the data item 116b. The multi-touch platform 102a may then analyse a "swish" gesture of user 114b associated with the data item 116b and extrapolate a vector or direction of the gesture to determine whether this intersects with the periphery or an edge of the common workspace 109a and/or the touch screen display 108a. The multi-touch platform 102a may then determine which of the user interfaces 140a or 140b is in the vicinity of this intersection. The user interface 140a or 140b (and/or, if any, user workspace 112a-112f or indicator 110a-110f) that is closest to the intersection may be used to determine whether the data item 116b is transferred to another user 114a at the multi-touch platform 102a or towards the user 114b that generated the "swish" gesture. In the case of figure 1d(ii), the intersection is in the vicinity of or closest to user interface 140b, in which case the multi-touch platform 102a determines, since mobile device 106b has been registered to user 114b, that user 114b is requesting to have the data item 116b transferred to the mobile device 106b. In this way, the data item 116b is transferred from the multi-touch platform 102a to the mobile device 106b of user 114b.
[00201] As another example, the multi-touch platform 102a may be configured to detect a touch screen gesture comprising the "swish" touch screen gesture by determining when an object of the second user 114b touches the display device 108 or touch screen of the multi-touch platform 102a and moves the object in the direction associated with where the second user 114b would like the data item 116b to be transferred towards. In the above example, the second user 114b moved their object in the direction of themselves (e.g. in a direction closest to or associated with the user interface 140b), which is interpreted by the multi-touch platform 102a as transferring the data item 116b to the mobile device 106b associated with the second user 114b.
[00202] The second user 114b may instead perform a "swish" touch gesture to transfer the data item 116b towards the first user 114a. In this case, the multi-touch platform 102a determines that the direction of the "swish" touch gesture is in the direction associated with the first user 114a, i.e. towards the first user 114a (e.g. in a direction closest to or associated with the user interface 140a), and so, on releasing the "swish" touch gesture, the data item 116b may be transferred to the first user 114a. On arriving at or near the location of the first user 114a or near the position of the user interface 140a of the first user 114a, the file 116b can be re-oriented for viewing and working on by the first user 114a.
[00203] The direction associated with the first user 114a may be based on a vector determined to be pointing from the second user 114b towards the first user 114a or towards the user interface 140a, or in the vicinity of the user interface 140a where the first user 114a is likely to be positioned based on the position of the user interface 140a in the common workspace 109a. For example, the direction associated with the first user 114a may be based on a vector determined to be pointing from where the second user 114b began the touch gesture towards the user interface 140a of the first user 114a, or, if any, a user workspace or an indicator associated with the first user 114a. The position of the user interface 140a may identify where the first user 114a is located at the multi-touch platform 102a, and/or where a mobile device of the first user 114a is located or indicated to be located to the second user 114b. The user interface 140a may include a user workspace of the first user 114a, or an indicator indicating a designated position of the first user 114a. The user interface 140a associated with the first user 114a may also be represented as an icon on a touch screen of a mobile device 106b associated with the second user 114b. Thus, the second user 114b can perform a "swish" touch screen gesture on the mobile device 106b in the direction of the representation of the user interface 140a associated with the first user 114a in order to transfer a file to the first user 114a at the multi-touch platform 102a.
[00204] As described above and with reference to figure 1d(ii), the second user 114b may be one of the users with a mobile device 106b external to the multi-touch platform 102a, where the data item 116b is transferred from the common workspace 109a on the multi-touch platform 102a to the mobile device 106b of the second user 114b.
[00205] Multi-touch platform 102a may be configured such that data items (e.g. data and/or files) may also be presented by the users to the multi-touch platform 102a. Once the data item is presented ("swished") to the multi-touch-platform 102a, the data item can be "grabbed" by one or more users, orientated for reading and manipulated accordingly. In addition, the first user 114a may present one or more associated data items to each user 114a-114b (i.e. like a paper folder being passed around a board/meeting table), who may be able to manipulate the associated data. A user 114b can also use gestures on the multi-touch platform 102a to transfer a data item 116b to their mobile device 106b (e.g. a "swish" gesture may be used to transfer the file to their mobile device 106b) for working on in their private space ("pull-back"). That user 114b may then "swish" any worked on data items 116c (e.g. documents or files) back to the multi-touch-platform 102a. Whole team reports can be written via the multi-touch-platform 102a and submitted to parties via a communication interface to a network connection of a network.
[00206] Figures 1d(iii) and 1d(iv) are schematic diagrams showing example touch screen gestures for an example multi-touch platform 102 as described herein with reference to figures 1a-1c and/or figures 1e-10. The multi-touch platform 102 includes a plurality of users including users 114a-114f at multi-touch platform 102 collaborating on a plurality of data items (e.g. files) in a communication session on the multi-touch platform 102. The multi-touch platform 102 is configured to detect, within the common workspace 109 on a touch screen 108 associated with the multi-touch platform 102, a touch screen gesture for transferring one or more data items 116 (e.g. files) from a first user 114e of the plurality of users to a second user 114b of the plurality of users. The multi-touch platform 102 determines a direction of the detected touch screen gesture, identifies that the direction of the touch screen gesture is in a direction associated with the second user 114b, and transfers the one or more data items 116 to the second user 114b during the communication session. In these examples, the multi-touch platform 102 has a common workspace 109. Once the second user 114b is identified based on the touch screen gesture and is different to the first user 114e, the multi-touch platform 102 transfers the data item 116 to within the vicinity of the position at which the user interface 140b of the second user 114b is located on the common workspace 109.
[00207] Figure 1d(iii) shows an example of a "swish" touch screen gesture for transferring one or more data items from a first user 114e to a second user 114b. In this example, the multi-touch platform 102 is configured to detect the touch screen gesture for transferring the one or more data items by detecting the first user 114e touching the touch screen with an object 111e of the first user 114e at a point A in the vicinity of said one or more data items 116 and, prior to the first user 114e removing the object 111e from the touch screen 108 at point B, detecting the object 111e moving along a path C in a direction associated with the second user 114b. The multi-touch platform 102 may also determine whether the touch screen gesture for transferring the one or more data items is a valid gesture for transferring data items by determining whether the speed of the object 111e moving from A to B is greater than a user defined or predefined threshold.
[00208] The direction associated with the second user 114b may be determined by extrapolating a vector D determined from the touch screen gesture, where the direction of the vector D points in the direction of the user interface 140b or within the vicinity of the user interface 140b of the second user 114b. For example, the vector D may be determined from the initial touch point A and the final touch point B. The vector D is further extrapolated along a line intersecting points A and B in the direction of the vector D until it intersects the edge or periphery of the common workspace 109 and/or touch screen 108 of the multi-touch platform 102. The user interface 140b that corresponds to this intersection (e.g. the user interface 140b may be the closest user interface within the vicinity or within a pre-determined threshold distance, radius or diameter of the intersection) may be used to identify the direction associated with the second user 114b and, ostensibly, the second user 114b towards whom the one or more data items are to be transferred.
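A non-limiting Python sketch of this extrapolation, assuming a rectangular common workspace with its origin at the top-left corner; the function names, the coordinate convention and the threshold parameter are assumptions of the sketch rather than features defined by the platform.

    import math
    from typing import Dict, Optional, Tuple

    Point = Tuple[float, float]

    def boundary_intersection(a: Point, b: Point,
                              width: float, height: float) -> Optional[Point]:
        """Extrapolate the line through touch points A and B, in the direction
        from A to B, until it leaves a rectangular common workspace of the
        given width and height (origin assumed at the top-left corner)."""
        dx, dy = b[0] - a[0], b[1] - a[1]
        if dx == 0 and dy == 0:
            return None
        hits = []
        for edge_value, delta, axis in ((0.0, dx, "x"), (width, dx, "x"),
                                        (0.0, dy, "y"), (height, dy, "y")):
            if delta == 0:
                continue
            t = (edge_value - (b[0] if axis == "x" else b[1])) / delta
            if t <= 0:
                continue   # this edge lies behind the direction of the gesture
            x, y = b[0] + t * dx, b[1] + t * dy
            if -1e-6 <= x <= width + 1e-6 and -1e-6 <= y <= height + 1e-6:
                hits.append((t, (x, y)))
        return min(hits)[1] if hits else None

    def nearest_user_interface(intersection: Point,
                               interfaces: Dict[str, Point],
                               max_distance: float) -> Optional[str]:
        """Return the identifier of the user interface closest to the boundary
        intersection, provided it lies within the threshold distance."""
        best = None
        for interface_id, pos in interfaces.items():
            d = math.hypot(pos[0] - intersection[0], pos[1] - intersection[1])
            if d <= max_distance and (best is None or d < best[0]):
                best = (d, interface_id)
        return best[1] if best else None

The intersection returned by boundary_intersection can then be passed to nearest_user_interface together with the registered positions of the user interfaces 140a-140f and a threshold distance to identify the target user interface.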
[00209] Once the user 114b and the user interface 140b are identified, the multi-touch platform transfers the one or more data items from the first user 114e to the second user 114b. For example, the multi-touch platform 102 may display the one or more data items 116 moving from the first user 114e towards the user interface 140b of the second user 114b along a path E. As another option, the multi-touch platform 102 may also allow a copy of the one or more data items 116 to be transferred from the first user 114e to the user interface 140b of the second user 114b. This may allow both users to continue to work on the one or more data items 116 substantially simultaneously.
[00210] Alternatively, as in figures 1b(iv) and 1b(v), the user interfaces 140b and 140e may include or comprise user workspaces 112b and 112e, respectively. The multi-touch platform 102 may determine a user workspace 112b or 112e that intersects with the extrapolated vector such that the one or more data items 116 may be transferred to the corresponding user workspace 112b or 112e. For example, analysing the edges of the user workspaces 112b or 112e that intersect with the extrapolated vector may assist in determining which workspace to transfer the data items towards. If the direction of the vector is away from the user 114e, i.e. towards the edge of user workspace 112e farthest from user 114e, then it may be assumed that the direction is towards another user; thus the user workspace 112b, rather than that of the user 114e, that intersects with the extrapolated vector may be determined to be where the data items should be transferred towards. The one or more data items 116 may be transferred by removing the one or more data items 116 from the user workspace 112e of the first user 114e and having them appear in the user workspace 112b of the second user 114b. As another option, the multi-touch platform 102 may also allow a copy of the one or more data items 116 to be transferred from the user workspace 112e of the first user 114e to the user workspace 112b of the second user 114b. This may allow both users to continue to work on the one or more data items 116 substantially simultaneously.
[00211] Figure 1d(iv) shows an example of a "drag" type touch screen gesture for transferring one or more data items 116 from the first user 114e to the second user 114b. In this example, the multi-touch platform 102 displays on the common workspace 109 of the touch screen 108 an indicator 107 representing a second user 114b (e.g. Ux). The multi-touch platform 102a allocates the indicator 107 to the second user 114b and associates the position of the indicator 107 with the second user 114b. The indicator 107 representing the second user 114b may be a user interface and may be represented as a graphical icon. If the second user 114b is a remote user, the user interface may include a status notification associated with the second user 114b (e.g. connected, away, busy, do not disturb etc.) as described with respect to user 114g and user interface 140g of figure 1b(ii). The second user 114b may be a user who remotely joins the communication session using a mobile device 106a or another multi-touch platform. Alternatively or additionally, the second user 114b may also be a user present at the multi-touch platform 102.
[00212] In any event, the multi-touch platform 102 is configured to detect the "drag" type touch screen gesture for transferring the one or more data items by detecting the first user 114e touching the touch screen with an object 111e of the first user 114e at a point A in the vicinity of said one or more data items 116 and, prior to the first user 114e removing the object 111e from the common workspace 109 and/or the touch screen 108 at point B in the vicinity of the indicator 107, detecting the object 111e moving along a path C in a direction associated with the indicator 107 of the second user 114b. At point B, the object 111e of the first user 114e may intersect the indicator 107 associated with the second user 114b. The multi-touch platform 102 may also determine whether the touch screen gesture for transferring the one or more files 116 is a valid "drag"-type gesture by determining whether the speed of the object 111e moving from A to B is greater than or less than a user defined or predefined threshold.
[00213] The direction associated with the second user 114b is based on releasing the object 111e from the touch screen 108 when the object 111e of the first user 114e is in the vicinity of the indicator 107 associated with the second user 114b. The object 111e may intersect the indicator 107 or be located within the indicator 107 when the object 111e is released from the touch screen 108. The multi-touch platform 102a may then determine that the one or more data items 116 are to be transferred to the second user 114b. The multi-touch platform 102a may look up the association between the indicator 107 and the second user 114b to determine where the data items 116 should be transferred. If the second user 114b is a remote user connecting via a mobile device, then the multi-touch platform 102a will also have stored the address of the mobile device of the second user 114b and will be able to transfer the data items over the communication network to the mobile device.
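The following Python sketch shows, purely as an illustration, how the release point of such a "drag" gesture could be resolved to the indicator it lands on and then, via the stored association, to the address of the second user's registered device; the data structures and names are assumptions of the sketch.

    from dataclasses import dataclass
    from typing import Dict, Optional, Tuple

    Point = Tuple[float, float]

    @dataclass
    class Indicator:
        centre: Point    # position of the indicator in the common workspace
        radius: float    # hit radius of the indicator, in pixels
        user_id: str     # the user the indicator was allocated to

    def resolve_drag_target(release_point: Point,
                            indicators: Dict[str, Indicator],
                            device_addresses: Dict[str, str]) -> Optional[str]:
        """Return the network address the dragged data items should be sent
        to, or None if the "drag" gesture was not released on any indicator.

        `device_addresses` holds the stored association between a user and
        the address of the mobile device that user registered."""
        for indicator in indicators.values():
            dx = release_point[0] - indicator.centre[0]
            dy = release_point[1] - indicator.centre[1]
            if dx * dx + dy * dy <= indicator.radius ** 2:
                # Released within this indicator: look up the associated user
                # and then the address of that user's registered device.
                return device_addresses.get(indicator.user_id)
        return None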
[00214] Figure 1d(v) is a schematic diagram showing an example of a "swish" touch screen gesture for transferring one or more data items from a first user 114e to a mobile device 106e of the first user 114e. It is assumed that the first user 114e has registered the mobile device 106e with the multi-touch platform 102 and that the multi-touch platform 102 has stored an association of the address of the mobile device 106e with the first user 114e. Typically, this may be performed when the first user 114e logs on to the multi-touch platform 102 manually using the common workspace and afterwards logs or registers their mobile device 106e with the multi-touch platform 102 (e.g. via a multi-touch platform mobile application, which may register the device 106e with the platform 102).
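A minimal, purely illustrative Python sketch of how the platform might store the association between a logged-in user and the address of their registered mobile device; the class and method names are not part of the described platform.

    from typing import Dict, Optional

    class RegistrationStore:
        """Stores the association between a logged-in user and the address
        of the mobile device that user registered (e.g. via the platform's
        mobile application)."""

        def __init__(self) -> None:
            self._addresses: Dict[str, str] = {}   # user id -> device address

        def register_device(self, user_id: str, device_address: str) -> None:
            """Called once the user has logged on and registers their device."""
            self._addresses[user_id] = device_address

        def device_for(self, user_id: str) -> Optional[str]:
            """Where data items should be sent when a gesture resolves to this user."""
            return self._addresses.get(user_id)

        def unregister(self, user_id: str) -> None:
            """Remove the association, e.g. when the user leaves the session."""
            self._addresses.pop(user_id, None)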
[00215] In this example, the multi-touch platform 102 is configured to detect the touch screen gesture for transferring the one or more data items 116 by detecting the first user 114e touching the touch screen with an object 111e of the first user 114e at a point A in the vicinity of said one or more data items 116 and, prior to the first user 114e removing the object 111e from the common workspace 109 and/or touch screen 108 at point B, detecting the object 111e moving along a path C in a direction associated with the first user 114e. The multi-touch platform 102 may also determine whether the touch screen gesture for transferring the one or more data items 116 is a valid gesture for transferring data items by determining whether the speed of the object 111e moving from A to B is greater than a user defined or predefined threshold.
[00216] The direction associated with the first user 114e may be determined by extrapolating a vector D determined from the touch screen gesture, where the direction of the vector D points in the direction of the user interface 140e or within the vicinity of the user interface 140e of the first user 114e. For example, the vector D may be determined from the initial touch point A and the final touch point B. The vector D is further extrapolated along a line intersecting points A and B in the direction of the vector D until it intersects the edge or periphery of the common workspace 109 and/or touch screen 108 of the multi-touch platform 102. The user interface 140e that corresponds closest to this intersection (e.g. the user interface 140e may be the closest user interface within the vicinity or within a pre-determined threshold distance, radius or diameter of the intersection) may be used to identify the direction associated with the first user 114e and ostensibly that the data items 116 should be transferred to a mobile device 106e of the first user 114e.
[00217] Figure 1d(vi) is a schematic diagram showing an example of another "swish" touch screen gesture for transferring one or more data items 116 from the mobile device 106e of the first user 114e to the multi-touch platform 102. It is assumed that the first user 114e has registered the mobile device 106e with the multi-touch platform 102 and that the multi-touch platform 102 has stored an association of the address of the mobile device 106e with the first user 114e. Typically, this may be performed when the first user 114e logs on to the multi-touch platform 102 manually using the common workspace and afterwards logs or registers their mobile device 106e with the multi-touch platform 102 (e.g. via a multi-touch platform mobile application, which may register the device 106e with the platform 102).
[00218] In this example, the mobile device 106e is configured via a multi-touch platform mobile application to detect the touch screen gesture for transferring the one or more data items 116 by detecting the first user 114e touching the touch screen 108e in the user workspace 109e of the mobile device 106e with an object 111e of the first user 114e at a point A in the vicinity of said one or more data items 116 and, prior to the first user 114e removing the object 111e from the user workspace 109e and/or touch screen 108e at point B, detecting the object 111e moving along a path C in a direction associated with the multi-touch platform 102. The mobile device 106e may also determine whether the touch screen gesture for transferring the one or more data items 116 is a valid gesture for transferring data items by determining whether the speed of the object 111e moving from A to B is greater than a user defined or predefined threshold. The direction associated with the multi-touch platform 102 may be determined by extrapolating a vector D determined from the touch screen gesture, where the direction of the vector D points in the direction of the multi-touch platform 102.
[00219] For example, the vector D may be determined from the initial touch point A and the final touch point B. The multi-touch platform 102 may be displayed as an icon in the user workspace of the mobile device 106e, and other icons may be displayed in the user workspace representing other platforms and/or users of the multi-touch platform. The vector D may then be further extrapolated along a line intersecting points A and B in the direction of the vector D until it intersects the icon in the user workspace or the edge or periphery of the user workspace 109e and/or touch screen 108e of the mobile device 106e. The icon that corresponds closest to this intersection (e.g. the icon representing the multi-touch platform 102 may be the closest within the vicinity of the intersection or within a pre-determined threshold distance, radius or diameter of the intersection) may be used to identify the direction associated with the multi-touch platform 102 and, ostensibly, that the data items 116 should be transferred to the multi-touch platform 102.
[00220] In another example, the multi-touch platform 102 may be represented on the mobile device 106e as one of the edges of the user workspace 109e or the touch screen 108e. The vector D may be determined from the initial touch point A and the final touch point B. The vector D may then be further extrapolated along a line intersecting points A and B in the direction of the vector D until it intersects an edge of the user workspace 109e and/or the touch screen 108e of the mobile device 106e. If the edge corresponds to the edge representing the multi-touch platform 102, then the data items 116 are transferred to the multi-touch platform 102. It is to be noted that other edges may correspond to other multi-touch platforms. Although an edge of the user workspace 109e and/or touch screen 108e is described, by way of example, as representing a multi-touch platform 102, it is to be appreciated by the skilled person that one or more edges of the user workspace 109e and/or touch screen 108e may be divided into multiple segments, one of which may represent the multi-touch platform 102 and the other segments representing other multi-touch platforms and/or users.
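Purely as an illustration of this edge/segment mapping, the following Python sketch maps the point at which an extrapolated vector leaves the mobile device's user workspace to a transfer target; the function name, coordinate convention and segment table are assumptions of the sketch only.

    from typing import Dict, Optional, Tuple

    Point = Tuple[float, float]

    def edge_segment_target(intersection: Point, width: float, height: float,
                            segment_targets: Dict[Tuple[str, int], str],
                            segments_per_edge: int = 1) -> Optional[str]:
        """Map the point at which an extrapolated vector D leaves the mobile
        device's user workspace to a transfer target.

        Each edge ("top", "bottom", "left", "right") is divided into
        `segments_per_edge` equal segments; `segment_targets` maps an
        (edge, segment index) pair to a target identifier, e.g. a multi-touch
        platform or another user."""
        x, y = intersection
        eps = 1e-6
        if abs(y) <= eps:
            edge, pos, length = "top", x, width
        elif abs(y - height) <= eps:
            edge, pos, length = "bottom", x, width
        elif abs(x) <= eps:
            edge, pos, length = "left", y, height
        elif abs(x - width) <= eps:
            edge, pos, length = "right", y, height
        else:
            return None            # the point is not on the workspace boundary
        index = max(0, min(int(pos / length * segments_per_edge),
                           segments_per_edge - 1))
        return segment_targets.get((edge, index))

For example, with segment_targets = {("top", 0): "MTT1"} the whole top edge of the user workspace 109e stands for the multi-touch platform 102, while a finer segmentation could represent additional platforms or users.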
[00221] Figure 1e is a schematic diagram showing a multi-user information sharing system 100 including multi-touch platforms 102a and 102b with a plurality of users 114a-114f, 115a-115f and 117 during a communication session. For simplicity, the server 104 is not shown. Multi-touch platform 102a includes users 114a-114f simultaneously using common workspace 109a and collaborating or working on one or more data items (not shown), and multi-touch platform 102b includes users 115a-115f simultaneously using common workspace 109b and also collaborating or working on the one or more data items (e.g. files). User 117 is an external or remote user with a mobile device 106a with a user workspace that is used for collaborating with the other users 114a-114f and 115a-115f and working on the one or more data items. During the collaboration, the users 114a-114f, 115a-115f, and 117 may transfer files between each other to allow other users to work on the data items using "swish" gestures, or the users may "grab" one or more data items to work on the one or more data items on the common workspaces 109a or 109b and/or, if any, in designated user workspaces (not shown). Data representative of the data items may be transferred between multi-touch platforms 102a and 102b and/or between multi-touch platforms 102a and 102b and mobile device 106a of user 117.
[00222] The users 114a-114f, 115a-115f and 117 may collaborate together to analyse data items, generate actionable data, generate further data items (e.g. a report or finalised file), transfer the data items or actionable data (e.g. report) from one group of users 114a-114f at multi-touch platform 102a to another group of users 115a-115f at multi-touch platform 102b, and/or to external user 117 with mobile device 106a, for verifying the data items, actionable data, changes made and/or approval of the data items (e.g. report or finalised documents). It is to be appreciated that there may be several iterations in which the data items (e.g. files, reports and/or finalised files) are transferred between users 114a-114f, 115a-115f and 117 before agreement is made.
[00223] For simplicity, the example multi-touch platforms 102a and 102b have been illustrated, by way of example only, to have a non-rectangular "wrap-around" structure that allows more users to work around the multi-touch-platform 102a in the form of a table. The multi-touch platform 102a has been illustrated and described, by way of example only, as a hardware system having a non-rectangular multi-touch-table (MTT1) with a non-rectangular table-top with a display device 108a and common workspace 109a. In other examples, the multi-touch table may be embedded with a lighting system (e.g. i1-i6) represented by indicators 110a-110f. Although the multi-touch platform 102a is illustrated as being a non-rectangular structure, it is to be appreciated by the person skilled in the art that the multi-touch platform 102a may take on any closed form shape such as, by way of example, a circular, ellipsoidal, triangular, square, rectangular, hexagonal, octagonal shape, or any suitable closed form shape allowing multiple users to work around the multi-touch platform 102a.
[00224] Figure 1f(i) is a schematic diagram showing an example multi-touch platform 102a in the multi-sharing information system 100 with a plurality of users 114a-114f positioned around the platform 102a at designated locations or user selected locations that may be indicated by user interfaces (not shown). For simplicity, the platform 102a is accommodated in a non-rectangular ("wrap-around") table-top which enables users to locate themselves around the table with little or no instruction. Although the current embodiment is designed to be a 6-sided table-top for 6 users, it is to be appreciated by the person skilled in the art that the platform 102a may be designed for an n-sided table-top for m users, where n and m are greater than 0.
[00225] For example, the multi-touch platform 102a may have a table-top that can be designed for the number of users required to interface with the platform 102a. For example, an 8-sided table-top may be used for 8 users, a 3-sided table-top may be used for 3 users, or a single-sided circle may be used for n>1 users, depending on the size of the circle. For 1 to 4 users, the shape can be custom-made to suit user requirements.
[00226] Due to the shape of the table of the multi-touch platform 102a, the multi-touch platform 102a may assign user positions at designated locations using user interfaces displayed on the common workspace 109a when each user logs onto the multi-touch platform 102a. Alternatively or additionally, when each user logs onto the multi-touch platform 102a, the users may select their locations around the periphery of the common workspace 109a and the multi-touch platform 102a displays the user interfaces on the common workspace 109a corresponding to the selected locations of each user.
[00227] Figure 1f(i) further illustrates that the multi-sharing information system 100 is served by various subsystems located inside and outside of the multi-touch platform 102a. The multi-touch platform 102a may include a memory with a multi-touch platform application/interface that may comprise computer instructions, which when executed on a processor of the multi-touch platform 102a, cause the processor to perform various functions via IO controllers and/or communication interface(s) that include, by way of example only, controlling and implementing a user position / menu system 120, a user identification system 126 for identifying users 114a-114f via, by way of example only, wearable device(s) 106a, mobile device(s) 106b or via manually logging into the multi-touch platform 102a, a multi-touch large display system 128 for controlling the display device 108a and/or common workspace 109a of the multi-touch platform 102a, and a server system 130 for controlling the interactions between the multi-touch platform 102a and the server 104.
[00228] Figure 1f(ii) is a schematic diagram showing another example multi-touch platform 102a as described with reference to figure 1b(vi) for the multi-sharing information system 100. In this example, the multi-touch platform 102a has a plurality of users 114a-114f positioned around the platform 102a in designated locations indicated by indicators 110a-110f. The multi-touch platform 102a may assign user positions at designated locations via indicators 110a-110f when each user logs onto the multi-touch platform 102a. The indicators 110a-110f may be in the form of embedded lighting or an indicator displayed in the common workspace 109a by display device 108a (not shown) of the multi-touch platform 102a.
[00229] Figure 1f(ii) further illustrates that the multi-sharing information system 100 is served by various subsystems located inside and outside of the multi-touch platform 102a. The multi-touch platform 102a may include a memory with a multi-touch platform application/interface that may comprise computer instructions, which when executed on a processor of the multi-touch platform 102a, cause the processor to perform various functions via IO controllers and/or communication interface(s) that include, by way of example only, controlling and implementing a user position lighting system 121, a platform status lighting system 122, an overall lighting system 124, a user identification system 126 for identifying users 114a-114f via, by way of example only, wearable device(s) 106a, mobile device(s) 106b or via manually logging into the multi-touch platform 102a, a multi-touch large display system 128 for controlling the display device 108a and common workspace 109a of the multi-touch platform 102a, and a server system 130 for controlling the interactions between the multi-touch platform 102a and the server 104.
[00230] Figure 1g illustrates another example of the multi-touch platform 102a for a communication session with up to six users 114a-114f (e.g. U1-U6) and their external devices (e.g. stylus, mobile devices 106b with touch screens, wearable devices 106a, etc.) that are located adjacent to the table-top of the multi-touch platform 102a. In addition, the multi-touch platform 102a can also communicate remotely with other multi-touch platforms (e.g. MTTs), and other external users (e.g. Ux) and their mobile devices 106b.
[00231] Users 114a-114f may have access to wearable and/or mobile devices 106a, 106b. Using near-field communications and/or other wireless communication techniques, the multi-touch platform 102a, via the user identification system 126, may be configured to communicate via the communication interface of the multi-touch platform 102a with a user's 114b wearable device 106a and/or mobile device 106b to detect their presence and log the user 114b in to the multi-touch platform 102a. The user position lighting system 121 of the multi-touch platform 102a may further configure the multi-touch platform 102a to assign a position around the table of the multi-touch platform 102a, which is indicated via a location light indicator 110b.
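A minimal Python sketch, under the assumption that the wireless detection itself is handled by the communication interface, of the log-in and position assignment flow just described; the class, method names and place-setting identifiers are illustrative only.

    from typing import Dict, List, Optional

    class UserIdentificationSystem:
        """When a known wearable or mobile device is detected near the table,
        the owning user is logged in and given the next free place setting,
        whose indicator can then be lit."""

        def __init__(self, place_settings: List[str]) -> None:
            # e.g. ["i1", "i2", "i3", "i4", "i5", "i6"] for indicators 110a-110f
            self._free = list(place_settings)
            self._assigned: Dict[str, str] = {}        # user id -> place setting
            self._known_devices: Dict[str, str] = {}   # device id -> user id

        def enrol_device(self, device_id: str, user_id: str) -> None:
            """Record which user owns a given wearable or mobile device."""
            self._known_devices[device_id] = user_id

        def on_device_detected(self, device_id: str) -> Optional[str]:
            """Log the owning user in and return the place setting to indicate,
            or None if the device is unknown, the user is already logged in,
            or no place settings remain."""
            user_id = self._known_devices.get(device_id)
            if user_id is None or user_id in self._assigned or not self._free:
                return None
            setting = self._free.pop(0)
            self._assigned[user_id] = setting
            return setting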
[00232] The platform status lighting system 122 may also configure the indicators 110a-110f, e.g. as embedded lighting or an indicator on the display device 108a displayed via common workspace 109a, to indicate the status mode of the multi-touch platform 102a and/or the multi-sharing information system 100. For example, the platform status lighting system 122 may display status modes that may include, but are not limited to: "off", in which the indicators 110a-110f are not lit (no lights); "on" and not ready for use, in which the indicators 110a-110f are lit with "BLUE" lights when at least one user (e.g. user 114a) is positioned at a user location, but the multi-touch platform 102a may not be ready for use; "on" and ready for use, in which one or more indicators 110a-110f are lit with "GREEN" light, which indicates the multi-touch platform 102a is ready for use; and "on" with an error, in which one or more of the indicator lights 110a-110f are lit with "RED" light.
[00233] Although the indicators 110a-110f are described with reference to specific colours (e.g. "BLUE", "GREEN", "RED"), this is by way of example only, and it is to be appreciated by the skilled person that the indicators 110a-110f and status modes may use any suitable coloured light. Although the platform status lighting system 122 is described as indicating the status of the platform via indicators 110a-110f, this is by way of example only; it is to be appreciated by the skilled person that the platform status lighting system 122 may configure further indicators or embedded lighting systems (not shown) on the multi-touch platform 102a. For example, the platform status lighting system 122 may configure further indicators or embedded lighting systems underneath the multi-touch platform 102a.
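By way of a non-limiting illustration, the status modes and example colours described above could be represented as a simple mapping such as the Python sketch below; the enumeration and function names are assumptions of the sketch only.

    from enum import Enum
    from typing import Optional

    class PlatformStatus(Enum):
        OFF = "off"
        ON_NOT_READY = "on, not ready for use"
        ON_READY = "on, ready for use"
        ERROR = "on, with an error"

    # Example mapping only; as noted above, any suitable colours may be used.
    STATUS_COLOURS = {
        PlatformStatus.OFF: None,            # indicators not lit
        PlatformStatus.ON_NOT_READY: "BLUE",
        PlatformStatus.ON_READY: "GREEN",
        PlatformStatus.ERROR: "RED",
    }

    def indicator_colour(status: PlatformStatus) -> Optional[str]:
        """Colour the indicators 110a-110f should show, or None for unlit."""
        return STATUS_COLOURS[status]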
[00234] Figures 1g(i)-(iv) illustrate examples of multi-touch platform 102a with a display device 108a and common workspace 109a. In this example, the display device 108a is a multi-touch screen (e.g. MTT1) that displays a common workspace 109a that is large enough to allow at least six users 114a-114f to crowd around it at designated locations represented by indicators 110a-110f. For example, the multi-touch screen 108a may be at least 46" to allow the at least 6 users 114a-114f to crowd around it at selected place settings (i1-i6) of the indicators 110a-110f and interact with the common workspace 109a displayed by multi-touch screen 108a. Although the multi-touch screen 108a is described, by way of example only, to be at least 46" to allow at least 6 users 114a-114f to crowd around it, it is to be appreciated by the skilled person that the multi-touch screen 108a is not limited to a size of 46", but that the multi-touch screen 108a and also the common workspace 109a may be of any size that is suitable for a specified or predefined number of users.
[00235] The indicators 110a-110f may be embedded lights (i1-i6) located around the edge or periphery of the table. The indicators 110a-110f instruct the users 114a-114f how to place themselves around the table. For example, a user 114a may come to the multi-touch platform 102a (e.g. the multi-touch table) and automatically log into it via their wearable device, if they have one. The user's personal preferences and required documents for that communication session will be automatically uploaded to the multi-touch platform 102a. An indicator 110a, which may be a single light located at the table edge, will activate to instruct where the user 114a should be placed. Subsequent users 114b-114f may log into the multi-touch platform 102a (or table) via their wearable devices, if they have any, and the appropriate indicator 110b-110f light is activated to inform each user 114b-114f of their place setting around the multi-touch platform 102a. Instead of using wearable devices to log into the multi-touch platform 102a, the mobile device of a user 114b may be used to log into the multi-touch platform 102a or the multi-sharing information system if required. Alternatively or additionally, users 114a-114f may manually log into the multi-touch platform 102a. For example, on touching a graphical icon or token in the common workspace 109a, and/or, if there are any user workspaces, touching an inactive workspace on the multi-touch screen 108a, a login screen may appear with an on-screen keyboard for manually logging into the multi-touch platform 102a.
[00236] As described with reference to figure 1g, more indicators or lights may be embedded underneath the multi-touch table 102a to inform the users of the multi-touch platform's 102a operational status. For example, no lights on may indicate the multi-touch platform 102a is "off"; lights on may indicate the multi-touch platform 102a is operational and standing by for use; lights that switch to another colour may indicate that the multi-touch platform 102a is being used; lights may also switch to another different colour to indicate that the communication session is completed; and further coloured lights may indicate an error or that maintenance of the multi-touch platform 102a may be required.
[00237] Figure 2a is a schematic diagram of an example multi-touch platform 202, which may be implemented as any form of a computing or electronic device for use in the multi-sharing information system 100. Computing-based device 202 comprises one or more processors 204, which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device 202 in order to receive one or more data items (e.g. files) from a server (not shown) and/or mobile device (not shown) using communication interface 206 via network 205, and to receive user input commands from display device(s) 208 and/or user input devices 210. The one or more data items (e.g. files) from the server and/or mobile device may be cached in memory 212 for processing and/or manipulation by one or more users as described herein.
[00238] In some examples, for example where a system on a chip architecture is used, the processors 204 may include one or more fixed function blocks (also referred to as accelerators) which implement at least a part of the methods and/or apparatus as described herein in hardware (rather than software or firmware). The memory 212 may include platform software and/or computer executable instructions comprising an operating system 212a; this or any other suitable platform software may be provided at device 202 to enable application software 212b and 212d to be executed on the device 202. Depending on the functionality and capabilities of the device 202 and application of the device 202, software and/or computer executable instructions may include the functionality and/or methods as described with reference to the multi-touch platform as described in any one of figures 1a-1g and/or 2b(i)-10, which may be embodied in a multi-touch platform application and/or interface 212b and one or more apps or applications 212d (e.g. pdf document reader/annotation, file/document editors/viewers, file managers, call apps, audio/video player applications, third party software and applications) for, by way of example only but not limited to, reading and/or working on and/or manipulating files and/or collaborating with users in a communication session.
[00239] The multi-touch platform application and/or interface 212b may be client software (e.g. a common client as described below) for operating the multi-touch platform 202, displaying a common workspace on the display device 208, and also for interacting with a server apparatus and mobile devices etc. The data store 212c may be used for storing received files and/or commands or functionality from a server and/or mobile devices via a network 205 using communications interface 206. The one or more apps or applications 212d may include any type of application or third party software or application that may execute on the multi-touch platform 202 for use by users of the multi-touch platform 202 via the common workspace during a communication session or be uploaded to the common workspace (or if any, a user work space) on the multi-touch platform 202 etc. The users may simultaneously use the common workspace, the one or more apps or applications, and third party software or applications for working on and/or transferring one or more data items. The one or more applications 212d may be downloaded and/or accessed via a common application store or from a menu item displayed on the common workspace displayed on display device 208.
[00240] For example, multi-touch platform 202 may be used to implement the multi-touch platform app and/or interface 212b, which includes the functionality of the example multi-touch platforms as described herein with reference to Figures 1a-1g and 2b(i)-10 and may include software and/or computer executable instructions that may include multi-touch platform app/interface functionality or mechanisms. Computer storage media may include, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Although the computer storage media (memory 212) is shown within the device 202, it will be appreciated that the storage may be distributed or located remotely and accessed via a network 205 or other communication link (e.g. using communication interface 206).
[00241] The multi-touch platform 202 includes an input/output controller 214 arranged to take input from users and/or output display information from/to a display device 208, which may be separate from or integral to the multi-touch platform 202. The display information may provide a graphical user interface including the common workspace that can be simultaneously used by multiple users and, if any, one or more user workspaces. The display device 208 also acts as the user input device 210 as it is a touch sensitive display device. The input/output controller 214 may optionally also be arranged to receive and process input from one or more devices, such as a user input device 210 (e.g. a mouse or a keyboard, an audio/visual system that may include one or more cameras or image capture devices, one or more speakers and/or microphones). For example, the user input device 210 may include an audio/visual system that includes a camera or image capture device, at least one speaker and/or microphone for each of a predetermined number of users located around the multi-touch platform, or for each designated location around the multi-touch platform. An audio/visual application may be used on the display device 208 to access the corresponding user input devices 210 for performing an audio/video call (e.g. a Skype (RTM) call) or a video conference call between one or more users that have joined a communication session at the multi-touch platform 202.
[00242] In another example, a call function may be implemented on the multi-touch platform 202 using touch screen gestures or other user input on the display device 208. The call function may be a video call function (e.g. Skype (RTM) or FaceTime (RTM)) or simply an audio call function. For example, a user (e.g. user 114a) may click or perform a touch screen gesture (e.g. a "1"-finger touch and/or an extended touch) in the common workspace on a connected user icon representing another user in the communication session that is displayed on the display device 208. The user may select a call user icon or graphical icon, in which case the multi-touch platform 202 sets up a call with the other user. For example, the multi-touch platform 202 may configure the communication interface 206 to use standard voice over Internet Protocol to call the other user, in which audio may be presented over user input device 210 to the user and, if any, video may be presented to the user via the common workspace on the display device 208 or alternatively or additionally within a user workspace of the user (if any). The multi-touch platform 202 may call the called user (e.g. connected user) via a local server or a server in the network. Once connected, the user and the called user may communicate with each other via the communications interface 206 over the network.
[00243] The user input may be used to operate the multi-touch platform 202 for performing management, control and manipulation of files etc. and for collaboration amongst multiple users. The input/output controller 214 may also output data to devices other than the display device, e.g. a locally connected printing device.
[00244] The multi-touch platform 202 may be designed and configured to provide a true collaborative multi-touch-based environment where all users are active and share individually acquired information via a common workspace in a common, centralised forum. Information can be simultaneously presented to all users on the common workspace on the display device 208 via the multi-touch-platform 202 and discussed and worked on collectively, thus significantly improving team performance and the end work product. The multi-touch platform 202 allows for active participation via users being able to use gestures to transfer data, e.g. swipe data ("swish" files), to and from the common workspace of the multi-touch-platform 202. For example, users may transfer data such as one or more of the associated data items (e.g. files) to another multi-touch platform (e.g. multi-touch platform 102b), or to other mobile device(s) for working on and/or manipulation by user(s) of the other mobile device(s) or multi-touch platforms.
[00245] Multi-touch platform 202 may be configured such that data may also be presented by the users to the common workspace of the multi-touch platform 202. Once the data is presented ("swished") to the common workspace of the multi-touch-platform 202, the data can be "grabbed" by one or more users, orientated for reading and manipulated accordingly. In addition, a first user may present via the common workspace one or more associated files to each user (i.e. like a paper folder being passed around a board/meeting table), who may be able to manipulate the associated data. A user can also use gestures on the multi-touch platform 202 to transfer a data item (e.g. a file) to their mobile device (e.g. a "swish" gesture may be used to transfer the file to their mobile device) for working on in their private space ("pull-back") or user workspace on the mobile device. That user may then "swish" any worked on documents back to the common workspace of the multi-touch-platform 202. Whole team reports can be written via the common workspace of the multi-touch-platform 202 and submitted to parties via communication interface 206 to a network connection of network 205.
[00246] An advantage of the present invention is that it provides a technologically enhanced, touch-based, boardroom and/or meeting room table environment. Again, all meeting attendees can connect their mobile devices to the multi-touch-platform 202 (e.g. structured in the form of a table with a touch screen); "swish" and "grab" files; share worked on files/new information by "swishing" back to the multi-touch-platform 202.
[00247] Figure 2b(i) is a flow diagram of an example method 220 performed by multi-touch platform 202 of figure 2a or by any of the multi-touch platforms as described with reference to figures 1a-2a and 2b(ii)-10. The multi-touch platform 202 enables a plurality of users to collaborate on a plurality or set of data items via a common workspace during a communication session with the multi-touch platform 202. The multi-touch platform 202 also includes processor 204 and a communication interface 206 for communicating with at least one of the plurality of users with a mobile device of a user or other multi-touch platform external to the multi-touch platform 202. The users at the multi-touch platform 202 may simultaneously use the common workspace to work on data items during the communication session. The method 220 is based on the following: [00248] At step 221, receiving, by the display device 208, which is a touch screen, one or more touch screen gestures in the common workspace from the one or more users working on a set of data items. The one or more touch screen gestures are associated with at least one of the data items. The one or more touch screen gestures performed by the users in the common workspace may define operations to be performed on the corresponding data items.
[00249] At step 222, processing each of the one or more data items according to each associated received touch screen gesture. The processing may include or comprise performing the operations defined by the touch screen gestures on the data items.
[00250] At step 223, displaying the data items in the common workspace based on the corresponding touch screen gesture, where the one or more users use the common workspace to perform further work or operations on the set of data items. Depending on the operations performed by the touch screen gestures on the data items, the changes to those data items that should be displayed in the common workspace are displayed. For example, a touch screen gesture may correspond with a "grab" type gesture for re-orienting a data item displayed on the common workspace, thus the processing of the operation performed on the data item is a rotation of the data item in the common workspace. The changes caused by the rotation are displayed on the common workspace, and the data item is displayed re-oriented according to the "grab" type gesture.
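A minimal sketch of the receive/process/display loop of steps 221 to 223 is given below (Python); the gesture names, the dictionary-based workspace and the OPERATIONS table are assumptions introduced only for illustration and are not part of the specification.

    # Map assumed gesture types to the operation applied to a data item (step 222).
    OPERATIONS = {
        "grab": lambda item, g: item.update(rotation=g.get("angle", 0.0)),   # re-orient
        "pinch": lambda item, g: item.update(scale=g.get("scale", 1.0)),     # resize
    }

    def handle_gesture(workspace: dict, gesture: dict) -> dict:
        """Steps 221-223: receive a touch screen gesture, apply the associated
        operation to the data item it refers to, then return the updated item
        state so it can be redisplayed in the common workspace."""
        item = workspace[gesture["item_id"]]          # data item the gesture refers to
        OPERATIONS[gesture["type"]](item, gesture)    # process the gesture (step 222)
        return item                                   # display the changes (step 223)

    workspace = {"report.pdf": {"rotation": 0.0, "scale": 1.0}}
    handle_gesture(workspace, {"item_id": "report.pdf", "type": "grab", "angle": 90.0})
    print(workspace["report.pdf"])   # {'rotation': 90.0, 'scale': 1.0}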
[00251] Figure 2b(ii) is a flow diagram of another example method 225 performed by multi-touch platform 202 of figure 2a or by any of the multi-touch platforms as described with reference to figures 1a-2b(i) and 2b(iii)-10. The multi-touch platform 202 may be configured for communicating with at least one of the plurality of users at the multi-touch platform 202, or with a mobile device of a user of the multi-touch platform, or to a mobile device of a remote user to the multi-touch platform 202 via communication interface 206 and network 205, or with other multi-touch platforms external to the multi-touch platform 202 via communication interface 206 and network 205. The users at the multi-touch platform 202 may simultaneously use the common workspace to work on data items and/or transfer data items to other users during the communication session. The method 225 is based on the following: [00252] At step 226, receiving a touch screen gesture in the common workspace of the display device 208 that is associated with transferring at least one data item from the multi-touch platform 202 to a device associated with a user in the communication session. For example, depending on the touch screen gesture, the multi-touch platform 202 may distinguish between transferring one or more data items to a mobile device of a user of the multi-touch platform 202, a mobile device of a remote user to the multi-touch platform 202, or with other multi-touch platforms external to the multi-touch platform 202 via communication interface 206 and network 205. The device may be a mobile device or another multi-touch platform. Depending on the configuration of a multi-sharing information system including the multi-touch platform 202 and the number of different types of device, there may be several different touch screen gestures associated with transferring the at least one data item to a device associated with the user.
[00253] At step 227, in response to determining the touch screen gesture is associated with transferring one or more data items to a device, the multi-touch platform 202 transfers or transmits, via the communication interface 206 and the network 205, the one or more data items from the multi-touch platform to the corresponding device associated with a user of the plurality of users.
[00254] At step 228, receiving, by the multi-touch platform 202 from a device associated with one of the plurality of users, one or more further data items for the one or more users to work on in the common workspace of the multi-touch platform 202. The method may proceed back to step 226 to await further touch screen gestures.
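For illustration, the following sketch assumes that distinct "swish" gesture identifiers are registered for the different destination types discussed for steps 226 and 227; the gesture names, the TRANSFER_TARGETS table and the send callback are hypothetical and not part of the specification.

    # Assumed mapping from gesture identifiers to destination types (step 226).
    TRANSFER_TARGETS = {
        "swish_to_mobile": "mobile device of a user at the platform",
        "swish_to_remote": "mobile device of a remote user",
        "swish_to_platform": "another multi-touch platform",
    }

    def on_transfer_gesture(gesture_type: str, item_ids: list, send) -> None:
        """Step 227: if the gesture maps to a destination, transmit the data items
        over the communication interface / network (represented here by `send`)."""
        target = TRANSFER_TARGETS.get(gesture_type)
        if target is None:
            return                                   # not a transfer gesture
        for item_id in item_ids:
            send(item_id, target)                    # e.g. via interface 206 / network 205

    on_transfer_gesture("swish_to_remote", ["minutes.docx"],
                        lambda item, target: print(f"sending {item} to {target}"))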
[00255] Figure 2b(iii) is a flow diagram of an example method 230 performed by multi-touch platform 202 of figure 2a or by any of the multi-touch platforms as described with reference to figures 1a-2b(ii) and 2b(iv)-10. The multi-touch platform 202 enables a plurality of users to collaborate, via a common workspace displayed on display device 208, on a plurality of data items (e.g. files) in a communication session with the multi-touch platform 202. The multi-touch platform 202 also includes processor 204 and a communication interface 206 for communicating with at least one of the plurality of users with a mobile device of a user or other multi-touch platform external to the multi-touch platform 202. The method 230 is based on the following: [00256] At step 232, detecting a touch screen gesture in the common workspace of the multi-touch platform associated with one or more of the data items (e.g. files) from a first user of the plurality of users. The touch screen gesture is for transferring said one or more data items from the first user to a second user of the plurality of users. For example, the touch screen gesture may be defined to be a "swish" touch screen gesture, which is detected by determining when an object (e.g. stylus or finger) of the first user touches the common workspace displayed on the touch screen table top and rapidly moves the object in the direction associated with the second user.
[00257] At step 234, determining a direction of the detected touch screen gesture. For example, the direction associated with the second user may be based on a vector determined to be pointing from the first user towards the second user. Alternatively or additionally, the direction associated with the second user may be based on a vector determined to be pointing from the first user towards an indicator associated with the second user. For example, the indicator associated with the second user may be a user interface displayed on the common workspace of the touch screen in which the user interface is associated with the second user. Alternatively or additionally, the indicator associated with the second user may include or be an icon displayed on the common workspace of the touch screen associated with the second user.
[00258] At step 236, identifying that the direction of the touch screen gesture is associated with the second user.
[00259] At step 238, transferring the one or more data items to the second user during the communication session. For example, the second user may be one of the users with a mobile device external to the multi-touch platform, and transferring the one or more data items to the second user further includes transferring the one or more data items to the mobile device of the second user.
[00260] As another example, the first user is one of the users with a mobile device external to the multi-touch platform and may be the second user, where step 234 may further include determining the direction of the detected touch screen gesture to be a direction associated with the first user, where transferring the one or more data items further comprises transferring the one or more files to the mobile device of the first user.
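One possible way of realising steps 234 and 236, determining the direction of a detected "swish" and matching it to the registered indicator of the second user, is sketched below in Python; the normalised workspace coordinates, the indicator registry and the function name resolve_swish_target are assumptions for illustration only.

    import math

    def bearing(p_from, p_to):
        # Angle of the vector from p_from to p_to, in radians.
        return math.atan2(p_to[1] - p_from[1], p_to[0] - p_from[0])

    def resolve_swish_target(start, end, indicators: dict) -> str:
        """indicators maps a user id to the (x, y) position of that user's icon or
        user interface on the common workspace (steps 234 and 236)."""
        swish_dir = bearing(start, end)
        def angular_gap(user_id):
            gap = abs(bearing(start, indicators[user_id]) - swish_dir)
            return min(gap, 2 * math.pi - gap)        # wrap around +/- pi
        return min(indicators, key=angular_gap)       # user whose direction best matches

    indicators = {"user_114b": (0.9, 0.5), "user_114c": (0.1, 0.9)}
    print(resolve_swish_target((0.5, 0.5), (0.8, 0.55), indicators))  # -> user_114b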
[00261] The multi-touch platform 202 may include a display device 208 comprising a touch screen displaying a common workspace with user interfaces positioned around the periphery of the common workspace. The common workspace may be displayed over the entire usable or displayable surface of the touch screen of the display device 208. Alternatively or additionally, the common workspace and/or touch screen may include designated locations spaced around the periphery of the common workspace and/or touch screen for a predetermined number of the plurality of users, wherein the user interface may be positioned at or near the designated locations. The user interface may further include a user workspace such that the designated locations may each correspond to a user workspace for use by users in working on the one or more files associated with the communication session.
[00262] As described with reference to figures 1b(ii)-(vi), the multi-touch platform 202 may determine the designated location in the common workspace and/or touch screen for the user interface and/or user workspace around the multi-touch platform 202 or the address of a mobile device or other multi-touch platform 202 of each of the plurality of users in the communication session at the time each user joins the communication session. Additionally or alternatively, the user may select a location in the common workspace and/or touch screen for the user interface at the time each user joins the communication session. From this, the multi-touch platform 202 determines and/or registers a user identity for each user joining the communication session (e.g. based on their login credentials) and associates the user identity for each user with the assigned designated location or the user selected location, and optionally assigned user workspace, and/or the address of the user's mobile device and/or other multi-touch platform.
[00263] For example, each user joins the communication session by logging into the multi-touch platform 102a by providing login credentials allowing the multi-touch platform 102a to authenticate each user to ensure that user is permitted to join the communication session.
The multi-touch platform 102a may then determine and store a user identity for each user based on the login credentials of each user and associate the user identity with the location of the user and/or the address of the mobile device or other multi-touch platform 102b. This enables the multi-touch platform 102a to communicate with each user and/or allow each user to work on one or more data items (e.g. files or data) associated with the communication session.
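By way of example only, the sketch below shows one way of recording the association between an authenticated user identity and an assigned location and/or device address, assuming a simple in-memory registry and a placeholder credential check; it is not the authentication scheme of the platform itself.

    registry = {}

    def join_session(user_id: str, credentials: str, location=None, device_address=None):
        """Authenticate the user, derive a user identity, and associate it with the
        assigned/selected location and/or the address of their mobile device or
        other multi-touch platform."""
        if credentials != "valid":                     # placeholder authentication check
            raise PermissionError("user not permitted to join the communication session")
        registry[user_id] = {"location": location, "address": device_address}
        return registry[user_id]

    join_session("user_114a", "valid", location="north edge")
    join_session("user_115b", "valid", device_address="10.0.0.23")
    print(registry)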
[00264] Figure 2c is a flow diagram of an example method 240 for initiating a communication session on multi-touch platform 202 of figure 2a or by any of the multi-touch platforms as described with reference to figures 1a-10. The multi-touch platform 202 enables a plurality of users to collaborate on a plurality of files in the communication session with the multi-touch platform 202. The multi-touch platform 202 includes a display device 208 comprising a touch screen for displaying a common workspace. The multi-touch platform 202 may be configured to have a pre-determined number of designated locations for user interfaces around the periphery of the common workspace and/or touch screen for one or more of the plurality of users. Additionally or alternatively, the multi-touch platform 202 may be configured to have locations for user interfaces spaced around the periphery of the common workspace and/or touch screen to be selected by one or more of the plurality of users. The multi-touch platform 202 also includes a processor 204 and a communication interface 206 for communicating with at least one of the plurality of users with a mobile device external to the multi-touch platform 202. The method 240 is based on the following: [00265] At step 241, initiating the communication session on request by a first user of the plurality of users for starting the collaboration.
[00266] At step 242, uploading to the multi-touch platform a set of data items or a plurality of data items (e.g. files), where the set of data items is associated with the first user and/or the communications session initiated by the first user.
[00267] At step 243, authenticating one or more users of the plurality of users requesting to join the communication session.
[00268] At step 244, determining at least one of the authenticated users requests a place at the multi-touch platform, and if at least one of the authenticated users requests a place at the multi-touch platform, then at step 244a assign or register either a designated location or a user selected location around the periphery of the common workspace and/or touch screen table top for the at least one of the authenticated users, proceed to step 247. Otherwise proceed to step 245.
[00269] At step 245, determining at least one of the authenticated users requests to join the communication session via a mobile device, and if at least one of the authenticated users requests to join via the mobile device, then at step 245a assign or register the at least one authenticated user with the mobile device address of the at least one authenticated user, proceed to 247. Otherwise proceed to 246.
[00270] At step 246, determining at least one of the authenticated users requests to join the communication session via another multi-touch platform different to the multi-touch platform 202, if so, then at step 246a assign or register the address of the another multi-touch platform in relation to the at least one authenticated user, where the another multi-touch platform assigns/registers a designated location or a user selected location around the periphery of the common workspace and/or touch screen of the another multi-touch platform for the user, proceed to 247.
[00271] At step 247, start the communication session in which the users that have joined the communication session may begin to collaborate on the plurality of data items associated with the first user or the communication session. For example, users at the multi-touch platform 202 may simultaneously use the common workspace of the multi-touch platform 202 to work on and collaborate on the plurality of data items associated with the communication session or the first user.
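A compact sketch of steps 241 to 247 follows (Python); each join request is treated as either a place at the platform, a mobile device, or another multi-touch platform, and the request format and the session dictionary are illustrative assumptions rather than the specified data structures.

    def start_session(first_user: str, data_items: list, join_requests: list) -> dict:
        session = {"owner": first_user, "data_items": list(data_items),   # steps 241-242
                   "places": {}, "mobile": {}, "platforms": {}}
        for req in join_requests:
            if not req.get("authenticated"):                              # step 243
                continue
            user = req["user"]
            if req["kind"] == "place":                                    # steps 244/244a
                session["places"][user] = req.get("location", "assigned location")
            elif req["kind"] == "mobile":                                 # steps 245/245a
                session["mobile"][user] = req["address"]
            elif req["kind"] == "platform":                               # steps 246/246a
                session["platforms"][user] = req["address"]
        return session                                                    # step 247

    print(start_session("user_114a", ["brief.pdf"],
                        [{"user": "user_114b", "authenticated": True, "kind": "place"},
                         {"user": "user_115a", "authenticated": True, "kind": "mobile",
                          "address": "10.0.0.23"}]))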
[00272] Figure 2d is a flow diagram of an example method 250 for allowing multiple users to work on a data item (e.g. file) during the communication session with the multi-touch platform 202 of figure 2a or by any of the multi-touch platforms as described with reference to figures 1a-10. The multi-touch platform 202 enables a plurality of users to collaborate on a set of data items (e.g. a plurality of files) in the communication session. The multi-touch platform 202 includes a display device 208 comprising a touch screen that displays a common workspace. The multi-touch platform 202 may be configured to have a predetermined number of designated locations for positioning user interfaces around the periphery of the common workspace and/or touch screen for one or more of the plurality of users. Additionally or alternatively, the multi-touch platform 202 may be configured to allow users to select a location for positioning their user interface around the periphery of the common workspace and/or touch screen. The multi-touch platform 202 also includes processor 204 and a communication interface 206 for communicating with at least one of the plurality of users with a mobile device external to the multi-touch platform 202. The method 250 is based on the following: [00273] At step 252, detecting whether multiple users of the plurality of users are to work on a data item of the set of data items substantially simultaneously.
[00274] At 254, providing each of the multiple users with a copy of the data item.
[00275] At 256, receiving one or more of the copies of the data items worked on by the multiple users.
[00276] At step 258, determining the changes between each copy of the data item with the original data item.
[00277] At step 260, generating a temporary data item by merging the original data item with the changes made to the one or more copies of the data item.
[00278] At step 262, verifying the changes are acceptable to the multiple users prior to finalising the temporary data item.
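For illustration only, the sketch below approximates steps 258 to 262 with a naive line-level merge over plain-text data items; a practical implementation would use a proper diff/merge and an explicit verification step with the users before finalising the temporary data item.

    def merge_copies(original: str, copies: list) -> str:
        """Collect the lines each user added relative to the original and append them
        to produce the temporary (merged) data item."""
        original_lines = original.splitlines()
        merged = list(original_lines)
        for copy in copies:
            changes = [line for line in copy.splitlines() if line not in original_lines]
            merged.extend(changes)                 # steps 258-260: merge the changes in
        return "\n".join(merged)                   # temporary item awaiting verification (step 262)

    original = "agenda\nitem 1"
    copies = ["agenda\nitem 1\nnote from user A", "agenda\nitem 1\nnote from user B"]
    print(merge_copies(original, copies))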
[00279] Figure 2e is a flow diagram of an example method 270 for allowing multiple users to work on a file during the communication session with the multi-touch platform 202 of figure 2a or by any of the multi-touch platforms as described with reference to figures 1a-1f. The multi-touch platform 202 enables a plurality of users to collaborate on a set of data items (e.g. a plurality of files) in the communication session. The multi-touch platform 202 includes a display device 208 comprising a touch screen for displaying a common workspace for use by the plurality of users. The common workspace may be used simultaneously by the plurality of users. The multi-touch platform 202 may be configured to have a predetermined number of designated locations for positioning user interfaces around the periphery of the common workspace and/or touch screen for one or more of the plurality of users. Additionally or alternatively, the multi-touch platform 202 may be configured to allow users to select a location for positioning their user interface around the periphery of the common workspace and/or touch screen. The multi-touch platform 202 also includes processor 204 and a communication interface 206 for communicating with at least one of the plurality of users with a mobile device external to the multi-touch platform 202. The method 270 is based on the following: [00280] At step 272, detecting whether multiple users of the plurality of users are to work on a data item of the set of data items substantially simultaneously.
[00281] At step 274, allowing each of the multiple users to work on the data item serially.
[00282] At step 276, verifying the changes are acceptable to the multiple users prior to finalising the temporary data item.
[00283] Figure 3 is a schematic diagram of an example server 300 which may be implemented as any form of a computing or electronic device for use in the multi sharing information system 100. Computing-based device 300 comprises one or more processors 302 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device 300 in order to receive a set of data items or one or more data items (e.g. files) from multi-touch platforms 102a, 102b and/or 202 and/or mobile device using communication interface 304 via a network 305, and to receive input commands from multi-touch platform 202 and/or user input commands from user input device 306. The one or more data items (e.g. files) from the multi-touch platform 202 and/or mobile device may be cached in data store 308c of memory 308 for storage, processing and/or manipulation by one or more users as described herein.
[00284] In some examples, for example where a system on a chip architecture is used, the processors 302 may include one or more fixed function blocks (also referred to as accelerators) which implement at least a part of the methods and/or apparatus as described herein in hardware (rather than software or firmware). The memory 308 may include platform software and/or computer executable instructions comprising an operating system 308a or any other suitable platform software and one or more apps or applications 308d that may be provided at device 300 to enable application software 308b and 308d to be executed on the device 300. Depending on the functionality and capabilities of the device 300 and application of the device 300, software and/or computer executable instructions may include the functionality of the servers as described with reference to figures 1a-10 implemented as a server multi-touch platform application and/or interface 308b for distributing and storing files for use by users when collaborating in a communication session.
[00285] The server multi-touch platform application and/or interface 308b may be server-client software (e.g. an uploader client as described below) for operating the server apparatus 300 and also for interacting with multi-touch platform 202 and/or mobile devices etc., (e.g. uploading/downloading data/files etc. to multi-touch platform 202 and/or mobile devices etc.). The data store 308c may be used for storing received data items (e.g. files) and/or commands or functionality via network 305 using communications interface 304. The one or more apps or applications 308d may include any type of application or third party software or application that may be uploaded for execution on a multi-touch platform 202, and/or executed on the server 300 as a software as a service application(s) for use by users of the multi-touch platform 202, mobile or remote users using mobile devices and/or other multi-touch platforms during a communication session.
[00286] The one or more apps or applications 308d may include, by way of example only but is not limited to, pdf document reader/annotation, file/document editors/viewers, file managers, call apps, audio/video player applications, third party software and applications for, by way of example only but not limited to, reading and/or working on and/or manipulating data items (e.g. files, documents and/or digital data) and/or collaborating with users in a communication session. One or more of the applications 308d may be uploaded to a common workspace on the multi-touch platform 202 etc. The one or more applications 308d may be downloaded and/or accessed via a common application store hosted by the server 300 or via a web portal or web site hosted by the server 300 or one or more other servers or web servers etc. [00287] For example, server or computing device 300 may be used to implement server multi-touch platform app and/or interface 308b that implements and/or includes the functions or functionality of the server or similar apparatus as described herein with reference to any of Figures 1a-10 and may include software and/or computer executable instructions that may include server app/interface functionality or mechanisms. Computer storage media may include, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Although the computer storage media (memory 308) is shown within the device 300 it will be appreciated that the storage may be distributed or located remotely and accessed via a network 305 or other communication link (e.g. using communication interface 304).
[00288] The server or computing device 300 includes an input/output controller 310 arranged to take input from users and/or output display information from/to a display device 312 which may be separate from or integral to the server 300. The input/output controller 310 may be optionally also arranged to receive and process input from one or more devices, such as a user input device 314 (e.g. a mouse or a keyboard). This user input may be used to operate the server or computing device 300 for performing management, control and manipulation of data items (e.g. files etc.) for collaboration amongst multiple users. The input/output controller 310 may also output data to devices other than the display device, e.g. a locally connected printing device.
[00289] An example method or process performed by the server 300, which includes storage for storing a set of data items (e.g. a plurality of files, documents and/or digital data) for use during a communication session at a multi-touch platform 202 as described herein with reference to figures 1a-2d over network 305 may include the following operations: [00290] The server 300, during the communication session may be operable to transfer a set of data items and/or one or more data items from a first user to a second user when the multi-touch platform 202 detects, via the common workspace displayed on the touch screen of the display device 208, a touch screen gesture for transferring one or more data items from the first user to the second user. If the second user is one of the users with a mobile device external to the multi-touch platform 202, then the server 300 may transfer the one or more data items to the mobile device of the second user. If the second user is the first user with a mobile device external to the multi-touch platform, the server 300 may transfer the one or more data items to the mobile device of the first user (e.g. the first user may have "swished" one or more data items to their mobile device).
[00291] The server 300 may be operable to authenticate one or more users of the plurality of users requesting to join the communication session. On initiation of a communication session, the server 300 may be operable to receive a request to upload a set of data items (e.g. a plurality of files etc.) to the multi-touch platform 202 for initiating the communication session on request by a first user of the plurality of users. The server 300 then uploads to the multi-touch platform 202 the set of data items, where the set of data items may be associated with the first user and/or the communication session initiated by the first user.
[00292] Figure 4 is a schematic diagram of an example mobile device 400 which may be implemented as any form of a computing or electronic device for use in the multi sharing information system 100. Computing-based device 400 comprises one or more processors 402 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device 400 in order to receive one or more data items from a server and/or multi-touch platforms using communication interface 404, and to receive user input commands from display device(s) 406, which may be a touch screen for displaying a user workspace, and/or user input devices 408. The one or more data items from the server and/or multi-touch platform may be cached in memory 410 for processing and/or working on or manipulation by the user of the mobile device 400.
[00293] In some examples, for example where a system on a chip architecture is used, the processors 402 may include one or more fixed function blocks (also referred to as accelerators) which implement at least a part of the methods and/or apparatus as described herein in hardware (rather than software or firmware). The memory 410 may include platform software and/or computer executable instructions comprising an operating system 410a or any other suitable platform software that may be provided at device 400 to enable application software 410b and 410d to be executed on the device 400. Depending on the functionality and capabilities of the device 400 and application of the device 400, the software and/or computer executable instructions may include the functionality as described with reference to any of figures 1a-10 in relation to a mobile device 400 interacting with a multi-touch platform 202 or multi-user information sharing system as described herein, which may be implemented as a mobile multi-touch platform application and/or interface 410b and one or more apps 410d for reading, working on and/or manipulating files via the user workspace during the communication session. The mobile multi-touch platform application and/or interface 410b may be client software (e.g. a mobile client as described below) for operating the mobile device 400 and for interacting with the multi-touch platform 202 and/or server apparatus 300 and other mobile devices and other multi-touch platforms etc. The data store 410c may be used for storing received data items (e.g. files) and/or commands or functionality via network 405 using communications interface 404.
[00294] The one or more apps or applications 410d may include any type of application or third party software or application that may be uploaded for execution on a mobile device 400, and/or executed via a server 300 as a software as a service application(s) for use by users of the multi-touch platform 202, mobile or remote users using mobile device 400 and/or other multi-touch platforms during a communication session. The one or more apps or applications 410d may include, by way of example only but is not limited to, pdf document reader/annotation, file/document editors/viewers, file managers, call apps, audio/video player applications, third party software and applications for, by way of example only but not limited to, reading and/or working on and/or manipulating files, documents and/or data and/or collaborating with users via the user workspace in a communication session. One or more of the applications 410d may be uploaded to the user workspace on the mobile device 400 etc. The one or more applications 410d may also be downloaded and/or accessed via a common application store hosted by a multi-touch platform 202, a server 300 and/or via a web portal and/or web site hosted by the server 300 or one or more other servers or web servers etc. [00295] For example, mobile device 400 may be used to implement mobile multi-touch platform app and/or interface 410b as described herein with reference to Figures 1a-10 and may include software and/or computer executable instructions that may include mobile multi-touch platform app/interface functionality or mechanisms. Computer storage media may include, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Although the computer storage media (memory 410) is shown within the device 400 it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 404).
[00296] The mobile device 400 includes an input/output controller 412 arranged to take input from users and/or output display information from/to a display device 406 which may be separate from or integral to the mobile device 400. The display information may provide a graphical user interface including user workspace. The display device 406 may also be configured to act as the user input device 408 as it can be a touch sensitive display device.
The input/output controller 412 may be optionally also arranged to receive and process input from one or more devices, such as a user input device 408 (e.g. a mouse or a keyboard). This user input may be used to operate the mobile device 400 for performing management, control and manipulation of files etc. for collaboration amongst multiple users within the multi-sharing information system 100. The input/output controller 412 may also output data to devices other than the display device, e.g. a locally connected printing device.
[00297] As an example, the mobile device 400 may be operable to collaborate with a plurality of users on a set of data items (e.g. a plurality of files) in a communication session with a multi-touch platform 202 as described herein with reference to figures 1a-10 over network 405. The mobile device 400 may also be operable to communicate with a server 300 over network 405. The mobile device 400 may include a touch screen display device 406 for displaying a user workspace and a communication interface 404 for communicating with the multi-touch platform 202 and/or server 300. The mobile device may be operable to perform the following method(s) and/or processes: [00298] The mobile device 400 may be configured to perform detecting a touch screen gesture associated with one or more of the data items from the user, where the touch screen gesture is for transferring said one or more data items from the user of the mobile device to one of the plurality of users of the multi-touch platform 202. The mobile device 400 may be configured to determine a direction of the detected touch screen gesture and identify that the direction of the touch screen gesture is associated with one of the plurality of users of the multi-touch platform. The mobile device 400 may then transfer the one or more data items to one of the plurality of users or the multi-touch platform 202 during the communication session.
This may be via the server 300 over the communication network 405.
[00299] The mobile device 400 may detect the touch screen gesture for transferring one or more data items by detecting the first user touching the touch screen in the user workspace with an object of the first user in the vicinity of said one or more data items (e.g. files) and, prior to the first user releasing the object from the touch screen, detecting the object moving in a direction associated with one of the plurality of users or the multi-touch platform 202 as described herein with reference to figures 1a-10. [00300] The mobile device 400 may further be configured to send a request to the multi-touch platform 202 requesting the user joins the communication session.
[00301] The mobile device 400 may be operable to notify the multi-touch platform 202 that the user of the mobile device 400 is in the presence of the multi-touch platform 202. The mobile device 400 may then connect to the multi-touch platform 202 for authenticating the user of the mobile device 400 and to log the user into the multi-touch platform 202 to join the communication session. The mobile device 400 may wirelessly communicate with the multi-touch platform 202 using near field communication signals.
[00302] Alternatively or additionally, the mobile device 400 may receive a copy of a data item in response to the multi-touch platform 202 detecting whether multiple users of the plurality of users are to work on a data item in the set of data items substantially simultaneously. The mobile device 400 may then transmit the copy of the data item worked on by the user of the mobile device 400 to the multi-touch platform 202 for generating a temporary data item comprising the merged content of the original data item and the copy of the data item worked on by the multiple users. The mobile device 400 may be operable to assist the user to verify the changes are acceptable and notify the multiple users prior to the multi-touch platform 202 finalising the data item.
[00303] Alternatively or additionally, the mobile device 400 may receive a copy of a data item in response to the multi-touch platform 202 detecting whether multiple users of the plurality of users are to work on a data item in the set of data items substantially simultaneously, where each of the multiple users is allowed to work on the data item serially. The mobile device 400 may also transmit the copy of the data item to the multi-touch platform, i.e. after the user of the mobile device 400 has worked on the copy of the data item. The mobile device 400 may be operable to assist the user to verify the changes are acceptable and notify the multiple users prior to the multi-touch platform 202 finalising the data item.
[00304] Figure 5 is a schematic diagram showing another example multi user information sharing system 500 including, by way of example only, first and second multi-touch platforms 502a and 502b including touch screen display devices 508a and 508b each of which display a common workspace 509a and 509b, respectively. The multi user information sharing system further includes first and second server apparatus 504a and 504b, first and second sets of mobile device(s) 506a and 506b, and a plurality of users 514a-514c and 515a-515c. The server apparatus 504a and 504b stores, manages and transfers data items (e.g. files and/or data) associated with the plurality of users 514a-514c and 515a-515c. The first multi-touch platform 502a is configured to be connected to the servers 504a and 504b and can communicate directly with mobile devices 506a or via servers 504a and 504b (not shown). The first multi-touch platform 502a is also configured to communicate via server 504b with mobile devices 506b and/or second multi-touch platform 502b. Similarly, the second multi-touch platform 502b is configured to be connected to the server 504b and can communicate directly with mobile devices 506b (not shown) or via server 504b. The second multi-touch platform 502b is also configured to communicate via server 504a and/or 504b with the first multi-touch platform 502a and also with mobile devices 506a via the first multi-touch platform 502a.
[00305] The multi-user information sharing system 500 is a client-server system based on a "multi-client to multi-server" frame-work in which software clients are installed on the multi-touch platforms 502a and 502b and mobile devices 506a and 506b. The multi-user information sharing system 500 allows, by way of example, multiple users 514a-514c and 515a-515c to fully collaborate via common workspaces 509a and 509b and/or mobile devices 506a and/or 506b and share with each other via, by way of example, but not limited to, sharing data items (e.g. data and/or files), collaboration, working on the data items (e.g. data and/or files), and creating further data items (e.g. reports and/or finalised files).
[00306] As described with reference to Figures 2a-2e, multi-touch platform 202 includes a software client comprising software and/or computer executable instructions that includes a multi-touch platform application (e.g. a software client) and/or interface and one or more apps or applications. As described with reference to Figure 3, server apparatus 300 includes server software and/or computer executable instructions that include server multi-touch platform app and/or interface 308b for interacting with the multi-touch platform 202 and/or mobile devices 400. As described with reference to Figure 4, mobile device 400 includes client software such as software and/or computer executable instructions that includes a mobile multi-touch platform application and/or interface and one or more apps for reading, working on and/or manipulating files during the communication session and for interacting with the multi-touch platform 202 and/or server apparatus 300.
[00307] Figures 6a(i) and (ii) are schematic diagrams illustrating an example multi-user information sharing system 600 based on a "multi-client to multi-server" frame-work in which software clients 604, 606b, 606c, 608 and 610 are installed on the multi-touch platforms 502a and 502b and/or mobile devices 506a and 506b.
[00308] For example, in figure 6a(i), users can send data items 602 (e.g. data/files) to the multi-touch platform 606a (e.g. MTT1) via an uploader client 604 before or "in-real-time" at the working communication session. The uploader client 604 can be accessed via a user's personal computer and/or mobile device. The uploader client 604 (e.g. server multi-touch platform application and/or interface 308b) is a software component that may be installed on a shared and/or secure server. The uploader client 604 configures the server to allow data items (e.g. data/files) to be uploaded to/from the common client 606b of the multi-touch platform 606a from any site, server or to/from one or more mobile device(s) connected to the server or multi-touch platform 606a (e.g. connected via the Internet or any other communication network to the shared server). The uploader client 604 may implement the functionalities and/or methods described herein for the server or server apparatus as described herein with reference to any of figures 1a-10.
[00309] The common client 606b (e.g. multi-touch platform application and/or interface 212b) is a software component that may be installed on the multi-touch platform 606a. The common client 606b configures the multi-touch platform 606a to allow data items (e.g. data and/or files) to be displayed in the common workspace and worked on via the common workspace as displayed by the display device (e.g. a multi-touch screen) of the multi-touch platform 606a. The common client 606b also configures the multi-touch platform 606a to allow users to interact via the common workspace and display device with the multi-touch platform 606a via a library of gestures. For example, the common client 606b may configure the multi-touch platform 606a to allow data items (e.g. data and/or files) to be orientated to user preferences via "grab and twist" touch based gestures as described with reference to any of figures 1a-10. The common client 606b may also configure the multi-touch platform 606a to allow data items (e.g. data and/or files) to be transferred between users via "swipe" and/or "swish" touch based gestures as described with reference to any of figures 1a-10. The common client 606b may implement the functionalities and/or methods described herein for the multi-touch platforms as described herein with reference to any of figures 1a-10.
[00310] The mobile client 608 (e.g. mobile multi-touch platform application and/or interface 410b) is a software component that may be installed on a touch based mobile device. The mobile client 608 may configure the mobile device to allow data items (e.g. data and/or files) (and annotations) to be "swiped" or "swished" and transferred to and from the common client 606b of the multi-touch platform 606a. For example, the mobile client 608 may configure the mobile device to allow data items (e.g. data and/or files) to be orientated to user preferences within a user workspace displayed on the mobile device via "grab and twist" touch based gestures as described with reference to any of figures 1a-10. The mobile client 608 may also configure the mobile device to allow data items (e.g. data and/or files) to be transferred between users via "swipe" and/or "swish" touch based gestures as described with reference to any of figures 1a-10. The mobile client 608 may also be used to operate the mobile device and for interacting with the multi-touch platform 606a (e.g. common client 606b) and/or server apparatus (e.g. via uploader client 604) and other mobile devices and other multi-touch platforms etc. The mobile client 608 may implement the functionalities and/or methods described herein for the mobile device as described herein with reference to any of figures 1a-10.
[00311] Figure 6a(ii) is a schematic diagram of the "multi-client to multi-server" framework of the multi-user information sharing system further showing a locator client 610 that allows users to automatically log into the multi-touch platform 606a. The locator client 610 provides the multi-touch platform 606a with log-in and display data allowing users to log into the multi-touch platform 606a and be allocated a workspace. The locator client 610 may also provide user preferences that may be used by the multi-touch platform 606a to configure, if any, a user workspace according to the user preferences of the user. Alternatively, the locator client 610 may be used for the first user initiating a communication session at the multi-touch platform 606a to configure the common workspace according to the user preferences of the first user and/or preferences configured or associated with the communication session.
[00312] A user (e.g. a first user) at the multi-touch platform 606a may log in as the main user to the multi-touch platform 606a, in which this main user "chairs" or is responsible for leading the communication session. The locator client 610 improves system usability and allows a first user (e.g. the main user of a communication session) to automatically start up and log into the multi-touch platform 606a, and set up their user preferences or preferences in relation to the communication session and/or a set of data items (e.g. files/data) that are the subject of the communication session via the uploader client 604.
[00313] The locator client 610 is a software component that may be installed on a touch based mobile device or wearable device etc. The locator client 610 may configure the mobile device or wearable to automatically log the user of the mobile device into the multi-touch platform 606a. The locator client 610 may manage user information such as, by way of example but not limited to, the user IDs or credentials, their locations, mobile device address in the communication network and/or upload the main user preferences to the multi-touch platform 606a. The locator client 610 may also be configured to initiate the uploader client 604 to automatically upload the main set of data items (e.g. user documents, files and/or data) or to upload one or more further data items of subsequent users for the communication session. The locator client 610 may provide the multi-touch platform 606a with the user information for determining/assigning or registering and/or storing a user identity for the user based on the user credentials or user ID, and associate the user identity with the location of the user, mobile device address of the user, and/or other multi-touch platform address, enabling the multi-touch platform 606a to communicate with the user either via the touch screen or via the communication network to a mobile device or other multi-touch platform 606a.
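The following sketch illustrates, under assumed interfaces, the kind of payload a locator client might hand to the platform on automatic log-in and how it might trigger the uploader client; the function auto_login, the payload fields and FakePlatform are hypothetical names introduced only for this example.

    def auto_login(platform, user_id: str, credentials: str, device_address: str,
                   preferences: dict, data_items=None):
        """Log the user in, pass location/address/preferences, and optionally trigger
        the uploader client to push the main set of data items for the session."""
        payload = {"user": user_id, "credentials": credentials,
                   "device_address": device_address, "preferences": preferences}
        platform.register(payload)                 # hand user information to the platform
        if data_items:
            platform.upload(data_items)            # e.g. via the uploader client

    class FakePlatform:                            # stand-in for the common client 606b
        def register(self, payload): print("registered", payload["user"])
        def upload(self, items): print("uploaded", items)

    auto_login(FakePlatform(), "main_user", "token", "10.0.0.5",
               {"colour": "blue"}, data_items=["brief.pdf"])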
[00314] Once the first user or main user has logged into the multi-touch platform 606a, and the set of data items (e.g. a plurality of documents, files and/or data) have been uploaded to the multi-touch platform 606a via the uploader client 604, the common client 606b displays the data items (e.g. documents, files and/or data) via a common workspace to all users crowded around the multi-touch platform 606a using the touch-enabled common client 606b (e.g. see figures 1a-1d(vi) and 2b(i)-2e). Users can work on the data items (e.g. files) on the multi-touch platform 606a using primarily the common workspace and common client 606b as described, for example, with reference to figures 1a-1d(vi) and 2b(i)-2e in which they can, by way of example only, but not limited to, perform the operations of transferring ("swish" finger gesture) data items between each other, remote users, other multi-touch platforms and to their mobile devices etc., orienting data items ("grab" gestures), and/or annotating and collaborating to generate a data item (e.g. a finalised file and/or report etc.). [00315] Figure 6b is a flow diagram showing a method 620 for interfacing an example mobile client 608 with an example common client 606b of the multi-user information sharing system 600 of figures 6a(i) and (ii). The mobile client 608 (e.g. mobile multi-touch platform application and/or interface 410b) is a software component that may be installed on a touch based mobile device. The common client 606b (e.g. multi-touch platform application and/or interface 212b) is a software component that may be installed on the multi-touch platform 606a. In this example, the mobile client 608 initiates the communication session on the multi-touch table and interfaces with the common client 606b. The method 620 is based on the following: [00316] At step 622, a user of the mobile device initiates or joins a communication session on the multi-touch platform 606a by starting the mobile client 608 (or application associated with the mobile client 608). At step 624, the mobile client 608 executes the application (e.g. an Xcode app) that controls and operates the mobile client 608. For example, the mobile client 608 controls and operates the communications between devices, multi-touch platforms and servers, and the display of data items (e.g. files/data) and folders associated with the communication session in a user workspace and for working on the data items (files/data) and folders etc. using the user workspace.
[00317] In step 626, the mobile client 608 initiates the login procedure with the common client 606b to connect to the multi-touch platform 606a and to initiate or join a communication session. This may involve authentication of the user and/or the device via a login/password combination etc. For example, the Xcode app initiates the login procedure (e.g. using the locator client 610) with the common client 606b of the multi-touch platform 606a. In this example, the multi-touch platform 606a via the common client 606b sequentially assigns a colour to each user when they log in to the multi-touch platform 606a. For example, a colour may be sequentially assigned to users (e.g. Ux, where x is the user number) on connection to the multi-touch platform 606a via the mobile client 608.
[00318] By way of example only, when the main user (e.g. U1) connects to the multi-touch platform 606a to initiate a communication session (e.g. the main user may connect via the mobile client 608 or the locator client 610), they may be assigned the colour blue. All the U1 annotation interactions on data items in the common workspace of the multi-touch platform 606a (including submitting text for collation into a report) are in blue. Subsequent users (e.g. U2-Ux, where x is the number of users) are assigned other colours by the multi-touch platform 606a. For example, if there are n users, then there will be n different colours that are assigned to each of the users. In another example, each user has a mobile device or wearable device, such that each user and hence device is assigned a user colour.
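A minimal sketch of the sequential colour assignment described above is given below (Python); the fixed palette, and its cycling when users outnumber colours, is an illustrative assumption rather than part of the specification.

    from itertools import cycle

    PALETTE = cycle(["blue", "red", "green", "orange", "purple", "teal"])
    assigned = {}

    def assign_colour(user_id: str) -> str:
        """Give each newly connected user (U1, U2, ... Ux) the next colour; all of
        that user's annotations are then rendered in this colour."""
        if user_id not in assigned:
            assigned[user_id] = next(PALETTE)
        return assigned[user_id]

    print(assign_colour("U1"))   # e.g. blue for the main user
    print(assign_colour("U2"))   # next colour in the palette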
[00319] In step 626, in the login procedure the mobile client 608 may request login along with authentication information and also request the user (Ux) colour that will be assigned to the user (Ux) during their communication session. This may assist and/or facilitate annotation and report writing activities during the communication session for when each user works on the one or more data items (e.g. files and/or data).
[00320] In step 628, the common client 606b receives the login request to initiate or join the communication session and may also receive a request for the user colour. The login request may be considered an implicit request for the user colour. The common client 606b may include a server client that is used to authenticate the user and/or allocate a user colour to the user.
[00321] In addition, the server client of the common client 606b may authenticate the user requesting to initiate or join the communication session, and the common client, in addition to assigning a different colour to each user, then determines whether the user is requesting: a) a place at the multi-touch platform; b) to join the communication session via a mobile device; and/or c) to join the communication session via another multi-touch platform. The common client 606b also detects whether the user has a mobile device, e.g. via the mobile client 608 or locator client 610; this allows the common client 606b to determine the mobile device address (e.g. IP address or mobile subscriber number etc.) should the user, at a later stage during the communication session, wish to transfer data items (e.g. files) to their mobile device and work on the data items via the mobile client 608 on their mobile device.
[00322] If it is determined that the user is requesting item a), then the common client 606b, via the server client, may assign a designated location for positioning a user interface for the user around the periphery of the common workspace displayed on the touch screen of the multi-touch platform 606a for the user. Additionally or alternatively, the common client 606b, via the server client, may allow the user to select a location for positioning a user interface for the user around the periphery of the common workspace displayed on the touch screen of the multi-touch platform 606a. In another example, an indicator may show the user their designated location. Should the user also have a mobile device, the common client 606b assigns or registers the mobile device address with the user and the assigned user colour.
[00323] If it is determined that the user is requesting item b), then the common client 606b, via the server client, may assign or register the mobile device address with the user and the assigned user colour. In addition, the common client 606b may also provide a user interface, an indicator or icon representing the user (and/or the user colour) for display on the common workspace of the multi-touch platform 606a, which may be used for transferring files to the mobile device of the user (e.g. using the "swish" and/or "drag" touch gestures).
[00324] If it is determined that the user is requesting item c), then the common client 606b, via the server client, may assign or register the other multi-touch platform address and the user colour identifying the user. This may be sent to the other multi-touch platform for assigning a designated location to position the user interface around the periphery of the common workspace displayed on the touch screen of the other multi-touch platform for the user.
Alternatively or additionally, this may be sent to the other multi-touch platform and used in assigning a user selected location to position the user interface around the periphery of the common workspace displayed on the touch screen of the other multi-touch platform. In addition, the common client 606b may also provide on the common workspace of the multi-touch platform 606a a user interface, an indicator or icon representing the user (and/or the user colour) for display on the multi-touch platform 606a, which may be used for transferring files to the mobile device of the user (e.g. using the "swish" touch gesture). This user interface, indicator, icon and/or user colour may also be sent to the other multi-touch platform for display on the common workspace of the other multi-touch platform, which may be used for transferring data items (e.g. files) to the mobile device of the user (e.g. using the "swish" touch gesture).
[00325] In step 630, the server client of the common client 606b sends the user colour assigned by the multi-touch platform 606a to the mobile client of the user. In addition, the common client 606b may also provide the mobile client 608 with an indicator or icon representing the user (and/or the user colour) for display on the user workspace by the mobile client on the mobile device, which may be used for transferring data items (e.g. files) to the mobile device of the user (e.g. using the "swish" touch gesture). In step 624, the mobile client 608 (e.g. the Xcode app) will display the user colour assigned to the user. As well, all the annotation interactions (including submitting text for collation into a report) by the user are made in the assigned colour. This colour will be displayed on the data items (e.g. documents, files and data) that the user works on in the user workspace of the mobile device and/or in the common workspace on the multi-touch platform etc. [00326] In step 632, the common client 606b displays the user colour assigned to the user on the common workspace displayed on the touch screen of the multi-touch platform. This assigned user colour will be displayed on the data items (e.g. documents, files and data) that the user works on when they are displayed to other users or the user on the multi-touch platform etc. [00327] Figure 6c(i) is a schematic diagram of an example multi-touch platform 606a showing an additional procedure for when users initiate or join a communication session and request a place at the multi-touch platform 606a. In this example, the display device 618 of the multi-touch platform 606a is a touch screen display and displays a common workspace 619 for use by the plurality of users 614a-614f in a communication session at the multi-touch platform 606a. It is noted that the multi-touch platform 606a is configured to allow the plurality of users to simultaneously use the common workspace 619 and associated applications when working on one or more data items. In this example, the users 614a-614f may select a location for positioning their user interface around the periphery of the common workspace 619. The user interface for each user may include a menu interface and/or a user workspace. The common workspace 619 is used by all users to access and work on associated data items (e.g. files and/or data).
[00328] For users 614a-614f connecting to the multi-touch platform 606a, the common client 606b may display a central icon 615 (e.g. a central circular icon) in the common workspace 619 displayed on the touch screen of the multi-touch platform 606a. Although the central icon 615 is shown in figure 6c(i) as a circular icon, it is to be appreciated by the skilled person that any shaped icon may be used. When a user 614a-614f connects to the multi-touch platform 606a, each of the users 614a-614f in turn can "drag", using a touch screen gesture, the central icon 615 to a location that may be selected anywhere on the common workspace 619. This informs the multi-touch platform that the users are connected to the table and that their user interfaces are to be allocated for display at the selected locations. The multi-touch platform 606a, via the common client 606b, then recognises users that are connected and located at a place on the multi-touch platform 606a.
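By way of illustration only, the following TypeScript sketch shows one way in which the common client might allocate a user interface when a user's drag of the central icon 615 ends. The names (CommonWorkspace, UserInterfacePlacement) and the clamping rule are assumptions introduced here for illustration and do not form part of the described system.

```typescript
// Hypothetical sketch: when the drag of the central icon ends, allocate the
// user's interface at (or near) the drop point on the common workspace.

interface Point { x: number; y: number; }

interface UserInterfacePlacement {
  userId: string;
  colour: string;
  position: Point;   // where the user interface is displayed
}

class CommonWorkspace {
  private placements = new Map<string, UserInterfacePlacement>();

  constructor(private width: number, private height: number) {}

  // Called when the "drag" gesture on the central icon ends for a connected user.
  onCentralIconDropped(userId: string, colour: string, drop: Point): UserInterfacePlacement {
    const placement = { userId, colour, position: this.clamp(drop) };
    this.placements.set(userId, placement);   // the platform now knows the user's place
    return placement;
  }

  placementFor(userId: string): UserInterfacePlacement | undefined {
    return this.placements.get(userId);
  }

  // Keep the user interface inside the visible workspace.
  private clamp(p: Point): Point {
    return {
      x: Math.min(Math.max(p.x, 0), this.width),
      y: Math.min(Math.max(p.y, 0), this.height),
    };
  }
}

// Example: user U1 drags the central icon to the lower-left of a 1920x1080 workspace.
const workspace = new CommonWorkspace(1920, 1080);
console.log(workspace.onCentralIconDropped("U1", "#1565c0", { x: 120, y: 990 }));
```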
[00329] As described with reference to figure 6b, each user is assigned a user colour when they connect to the multi-touch platform 606a or join the communication session remotely to the multi-touch platform 606a, e.g. via a mobile device or another multi-touch platform. As well, each user can be assigned an indicator or icon that may be displayed on the common workspace 619 around the central indicator 615. The user icon indicates who is connected to the multi-touch platform 606a, and/or who has joined the communication session and may be remotely connected to the multi-touch platform 606a, e.g. via a mobile device or another multi-touch platform.
[00330] Additionally, the icons and assigned user colours may be shared with one or more mobile client(s) 608 and/or one or more common client(s) of other multi-touch platform(s) for displaying the user colours and user indicators or icons in a central indicator (e.g. central circular icon), a menu, or other located indicator or icon with information similar to that displayed by the multi-touch platform 606a. Although it is described that users of mobile devices may have a central indicator (e.g. a central circular icon) displayed, it is to be appreciated by the person skilled in the art that, due to the sizes of mobile device touch screens, the central indicator, menu or other located indicator or icon may be displayed in any position on the touchscreen of the mobile device.
[00331] For example, the first user 614a may initiate a communication session and place data items they wish to be shared and manipulated by the other users into the common workspace 619. Multiple users may simultaneously use the common workspace to work on data items. Users may use a "grab" gesture to retrieve the original, a version or copy of a data item from the common workspace 619 and work on the data item in the common workspace 619. Alternatively or additionally, the users may use their own device(s) and/or common workspaces of another multi-touch platform. Annotations may also be made to data items in the common workspace 619 by one or more of the users. Annotations may also be made to data items via user workspaces on mobile devices, which are displayed in the assigned colours for each user. The users 614a-614f may collaborate together and generate data items (e.g. a report file or finalised file) in the common workspace 619. The files may also be available to all users via a file manager on the user interface of each user.
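By way of illustration only, the TypeScript sketch below shows how annotations might carry the colour assigned to their author so that any client, mobile or platform, renders them consistently and can collate them into a report. The names (AnnotationStore, Annotation) are assumptions introduced here and are not part of the specification.

```typescript
// Hypothetical sketch: annotations are stamped with the colour assigned to
// their author when the author joined the communication session.

interface Annotation {
  authorId: string;
  colour: string;        // colour assigned at login
  targetItemId: string;
  text: string;
}

class AnnotationStore {
  private annotations: Annotation[] = [];

  // Record an annotation, tagging it with the author's session colour.
  add(authorId: string, colour: string, targetItemId: string, text: string): Annotation {
    const a: Annotation = { authorId, colour, targetItemId, text };
    this.annotations.push(a);
    return a;
  }

  // Collate all annotations on one data item, e.g. for building a report file.
  forItem(targetItemId: string): Annotation[] {
    return this.annotations.filter(a => a.targetItemId === targetItemId);
  }
}

// Example: user 614b (assigned "#2e7d32") annotates document "doc-17";
// both the mobile client and the common client would draw it in that colour.
const store = new AnnotationStore();
store.add("614b", "#2e7d32", "doc-17", "Please revise section 3");
console.log(store.forItem("doc-17"));
```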
[00332] Figure 6c(ii) is a schematic diagram of an example multi-touch platform 606a showing an additional procedure for when users initiate or join a communication session and request a place at the multi-touch platform 606a. In this example, the display device 618 of the multi-touch platform 606a is a touch screen display and may be divided into a plurality of work zones or workspaces 612a-612f and 613 and/or 619. In this example, there is a user workspace 612a-612f for each designated location (e.g. these may be indicated by indicators, lights, or icons displayed at or near the designated locations). The multi-touch platform 606a may also have a common user work zone or common workspace 613 and/or 619 (shown in dashed lines) on the platform 606a allowing all users to access associated data items (e.g. files and/or data).
[00333] For users 614a-614f connecting to the multi-touch platform 606a, the common client 606b may display a central icon 615 (e.g. a central circular icon) on the touch screen of the multi-touch platform 606a. Although the central icon 615 is shown in figure 6c(ii) as a circular icon, it is to be appreciated by the skilled person that any shaped icon may be used. When a user 614a-614f connects to the multi-touch platform 606a, each of the users 614a-614f in turn can "drag", using a touch screen gesture, the central icon 615 to their designated workspaces 612a-612f, informing the multi-touch platform that the users are connected to the table and in their respective designated locations. The multi-touch platform 606a, via the common client 606b, then recognises users that are connected and located at a place on the multi-touch platform 606a.
[00334] As described with reference to figure 6b, each user is assigned a user colour when they connect to the multi-touch platform 606a or join the communication session remotely to the multi-touch platform 606a, e.g. via a mobile device or another multi-touch platform. As well, each user can be assigned a user indicator or icon (e.g. Ux icon) that may be displayed in the common workspace 613 and/or 619 around the central indicator 615 (e.g. a central circular icon). The user icon indicates who is connected to the multi-touch platform 606a, and/or who has joined the communication session and may be remotely connected to the multi-touch platform 606a, e.g. via a mobile device or another multi-touch platform.
[00335] Additionally, the icons and assigned user colours may be shared with one or more mobile client(s) 608 and/or one or more common client(s) of other multi-touch platform(s) for displaying the user colours and user indicators or icons in a central indicator (e.g. central circular icon), a menu, or other located indicator or icon with information similar to that displayed in the common workspace 613 and/or 619 on the multi-touch platform 606a. Although it is described that users of mobile devices may have a central indicator (e.g. a central circular icon) displayed, it is to be appreciated by the person skilled in the art that, due to the sizes of mobile device touch screens, the central indicator, menu or other located indicator or icon may be displayed in any position on the touchscreen of the mobile device.
[00336] For example, the first user 614a may initiate a communication session and place a set of data items (e.g. files) they wish to be shared and manipulated by the other users into the common workspace 613 and/or 619. Users may use a "grab" gesture to retrieve a version or copy of a file from the common workspace 613 and/or 619, for manipulation within their own user workspace 612a-612f or device(s) or user workspaces of another multi-touch platform. Annotations may also be made to data items in the common workspace 613 and/or 619 by one or more of the users. Annotations may also be made to data items in the user workspaces 612a-612f, which are displayed in the assigned colours for each user. The users 614a-614f may collaborate together and generate further data items (e.g. a report file or finalised file) in the common workspace 613 and/or 619. The data items may also be available to all users via a file manager on each of the user workspaces 612a-612f of the multi-touch platform 606a.
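By way of illustration only, one way to realise the "grab" behaviour described above, in which a user retrieves a copy of a data item from the common workspace for manipulation in their own workspace, is sketched below in TypeScript. The copy-on-grab policy and the names (SharedItems, DataItem) are assumptions introduced here, not details taken from the specification.

```typescript
// Hypothetical sketch: a "grab" on a shared data item hands the grabbing user
// a working copy, leaving the original in the common workspace.

interface DataItem {
  id: string;
  name: string;
  content: string;
}

class SharedItems {
  private items = new Map<string, DataItem>();
  private copyCounter = 0;

  // Place an original data item into the common workspace.
  publish(item: DataItem): void {
    this.items.set(item.id, item);
  }

  // Return a copy for the grabbing user's own workspace or device.
  grab(itemId: string, userId: string): DataItem | undefined {
    const original = this.items.get(itemId);
    if (!original) return undefined;
    this.copyCounter += 1;
    return { ...original, id: `${original.id}-copy-${this.copyCounter}-${userId}` };
  }
}

// Example: user 614b grabs a copy of a report placed by user 614a.
const shared = new SharedItems();
shared.publish({ id: "report-1", name: "Q3 report", content: "..." });
console.log(shared.grab("report-1", "614b"));
```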
[00337] In another example, a second user 614b may also perform a touch screen gesture for transferring a data item to the mobile device of user 614b. This may be performed using the "swish" touch screen gesture for indicating the transfer of the data item to the mobile device of the second user 614b. The "swish" touch screen gesture may be defined by an object of the second user 614b (e.g. a stylus or a finger of the second user 614b) touching the touch screen of the multi-touch platform 606a or a mobile device on or near where the data item is displayed, then moving the object across the touch screen in a fast or rapid (e.g. a swift "swish" action) movement in the direction associated with where the second user 614b would like the data item to be transferred, the "swish" gesture ending with the second user 614b swiftly releasing the object from the touch screen of the multi-touch platform in a continuous motion. For example, the direction may be associated with an icon representing another user, mobile device or another multi-touch platform, or may be an indicator representing the designated location of another user at the multi-touch platform 606a etc.

[00338] Each user 614a-614f, on initiating and/or joining the communication session, is assigned a user colour and/or user indicator or icon (e.g. Ux icon), which is displayed on the common workspace 613 and/or 619 of the multi-touch platform 606a around the central indicator 615. The user icon indicates who is connected to the multi-touch platform 606a and/or who has joined the communication session either physically at the multi-touch platform 606a, or remotely connected to the communication session via a mobile device or other multi-touch platform. The second user 614b may transfer a data item to their mobile device by using a "swish" gesture directed to their own user icon or indicator on the central indicator 615, i.e. by "swishing" the data item towards their user icon. The multi-touch platform 606a, via the common client 606b, on detecting that the "swish" touch screen gesture is directed towards the icon of the second user 614b, transfers the data item to the mobile device registered by the second user 614b when the second user 614b connected to the multi-touch platform 606a and/or joined the communication session on the multi-touch platform 606a. The second user 614b may then read, work on and/or manipulate the data item using the mobile client 608 on the mobile device.
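The paragraphs above characterise the "swish" gesture by a touch-down on or near a data item, a fast directed movement, and a swift release. By way of illustration only, a minimal classifier along those lines is sketched below in TypeScript; the speed threshold and angular tolerance are arbitrary illustrative values, not figures taken from the specification.

```typescript
// Hypothetical sketch: classify a touch trace as a "swish" and decide whether
// its direction points towards a target icon (e.g. another user's indicator).

interface TouchSample { x: number; y: number; t: number; }  // t in milliseconds

const MIN_SPEED = 1.5;          // px per ms, assumed threshold for a "fast" movement
const MAX_ANGLE_DEG = 25;       // assumed tolerance around the target direction

function isSwishTowards(trace: TouchSample[], target: { x: number; y: number }): boolean {
  if (trace.length < 2) return false;
  const start = trace[0];
  const end = trace[trace.length - 1];

  const dx = end.x - start.x;
  const dy = end.y - start.y;
  const dt = end.t - start.t;
  if (dt <= 0) return false;

  // 1) Fast, continuous movement.
  const speed = Math.hypot(dx, dy) / dt;
  if (speed < MIN_SPEED) return false;

  // 2) Direction roughly towards the target icon.
  const gestureAngle = Math.atan2(dy, dx);
  const targetAngle = Math.atan2(target.y - start.y, target.x - start.x);
  let diff = Math.abs(gestureAngle - targetAngle) * (180 / Math.PI);
  if (diff > 180) diff = 360 - diff;
  return diff <= MAX_ANGLE_DEG;
}

// Example: a quick flick from the document towards user 614b's icon.
const trace: TouchSample[] = [
  { x: 400, y: 300, t: 0 },
  { x: 520, y: 260, t: 40 },
  { x: 660, y: 215, t: 80 },
];
console.log(isSwishTowards(trace, { x: 900, y: 140 }));  // true for this trace
```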
[00339] The second user 614b, when finished working on the data item using the mobile client 608, may use another "swish" touch screen gesture input to the touch screen of the mobile device 106b to transfer the data item back to the multi-touch platform 606a, in which the "swish" touch screen gesture is directed to a multi-touch platform icon, or to a user icon of a user at the multi-touch platform, on an indicator displayed on the mobile device touch screen that corresponds to the central indicator 615 of the multi-touch platform 606a. In another example, the second user 614b may transfer a data item to another user 614d in the communication session by using the "swish" touch gesture to "swish" the file in the direction of an icon or indicator on the central indicator corresponding to the other user 614d. The multi-touch platform 606a may then transfer the data item to the workspace 612d of the user 614d. If the user 614d is not at the multi-touch platform 606a, but has remotely joined the communication session, the multi-touch platform 606a may transfer the data item to the mobile device of the user 614d or to another multi-touch platform where the user 614d may be located.
[00340] Figure 6d is a flow diagram of an example common client 606b of the multi-touch platform 606a showing a method 640 for users initiating or joining a communication session at the multi-touch platform 606a. In this example, the display device of the multi-touch platform 606a is a touch screen display and may be divided into a plurality of work zones or user workspaces (e.g. shaded areas) and a common work zone or common workspace with a central indicator or object displayed in the common workspace on the touch screen of the multi-touch platform 606a. Alternatively, the touch screen display may simply comprise a common work zone or common workspace with a central indicator or object displayed in the common workspace. For users 614a-614f connecting to the multi-touch platform 606a, the common client 606b may perform the following method:

[00341] In step 642, each user may use a "1-finger" touch screen gesture to "drag" the central indicator or object towards the workspace of the user. Alternatively or additionally, when there are no user workspaces or zones, each user may "drag" the central indicator or object to a location within the common workspace. In step 644, the common client 606b detects that a user has "dragged" the central indicator or object into the workspace of the user. This then initiates a login procedure, e.g. using the locator client 610 or manually within the workspace into which the user "dragged" the central indicator or object. The user logs into the system using login credentials that may include, by way of example only, a login and password combination or any other method for authenticating the user and allowing the user to join or initiate the communication session. In step 646, the common client 606b makes a request for the user to connect to the multi-touch platform 606a based on the login credentials.
[00342] The request may also include a request for a user colour that should be assigned to the user to differentiate the user from other users in the communication session. In step 648, the server client authenticates the request for the user to connect to the multi-touch platform 606a based on the login credentials. As well, the server client allocates and assigns a user colour (e.g. Ux colour) to the user. In step 650, the touch screen displays the assigned user colour, e.g. via an icon in the central indicator or object. As well, the work the user performs on any file etc. is displayed in the user colour. In step 652, when one or more users have joined the communication session, the common client 606b then proceeds to allow the users to interact and/or work on the files associated with the communication session as described herein.
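By way of illustration only, a compact sketch of steps 646 to 650, in which the server client authenticates a connection request and allocates a user colour, is given below in TypeScript. The palette, credential check and message shapes are illustrative assumptions rather than the specification's own interface.

```typescript
// Hypothetical sketch of steps 646-650: authenticate a connection request and
// allocate the next free colour from a small palette.

interface ConnectRequest { username: string; password: string; }
interface ConnectResponse { ok: boolean; userColour?: string; reason?: string; }

const PALETTE = ["#e53935", "#1e88e5", "#43a047", "#fdd835", "#8e24aa", "#fb8c00"];

class ServerClient {
  private assigned = new Map<string, string>();   // username -> colour

  constructor(private credentials: Map<string, string>) {}

  connect(req: ConnectRequest): ConnectResponse {
    // Step 648: authenticate the login credentials (illustrative check only).
    if (this.credentials.get(req.username) !== req.password) {
      return { ok: false, reason: "authentication failed" };
    }
    // Allocate the next colour not already in use in this communication session.
    const used = new Set(this.assigned.values());
    const colour = PALETTE.find(c => !used.has(c));
    if (!colour) return { ok: false, reason: "no free place at the platform" };
    this.assigned.set(req.username, colour);
    // Step 650: the caller displays this colour in the central indicator or object.
    return { ok: true, userColour: colour };
  }
}

// Example: two users join the session and receive distinct colours.
const server = new ServerClient(new Map([["alice", "pw1"], ["bob", "pw2"]]));
console.log(server.connect({ username: "alice", password: "pw1" }));
console.log(server.connect({ username: "bob", password: "pw2" }));
```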
[00343] Figure 7 is a schematic diagram showing another example "multi-client multi-server" multi-user information sharing system 700 and the interactions between the components of the system. As previously described, the multi-user information sharing system 700 may include one or more multi-touch platforms connected to one or more servers that are configured to provide access to a plurality of users and one or more of their mobile devices for initiating and/or joining a communication session to work on a plurality of data items (e.g. files) together. Each server includes an uploader client 704, each multi-touch platform includes a common client 706, and each mobile device includes a mobile client 708 and/or locator client 710.
[00344] The common client 706 (e.g. multi-touch platform application and/or interface 212b) is a software component installed on the multi-touch platform. For example, the common client 706 may be stored in a computer readable medium of the multi-touch platform as software, code or computer instructions, which when executed on a processor associated with the multi-touch platform, causes the processor to implement one or more functions of the multi-user information sharing system 700.
[00345] The common client 706 configures the multi-touch platform to allow a user to initiate a communication session, other users to join the communication session, and data items (e.g. data and/or files) to be uploaded, displayed and worked on by users of the communication session using a common workspace displayed by the display device of the multi-touch platform. For example, users may work on files via the common workspace displayed by the display device of the multi-touch platform for those users placed around the multi-touch platform, via user workspaces displayed on display devices (e.g. touch screens) of mobile devices for users remote to the multi-touch platform, or via common workspaces displayed by display devices of other multi-touch platforms for users placed around those platforms. The common client 706 may implement the functionalities and/or methods described herein for the multi-touch platforms as described with reference to any of figures 1a-10.
[00346] In the example system 700 of figure 7, the common client 706 includes a server client 706a for interacting with corresponding clients 704, 708 and 710 of servers and user devices and operating the multi-touch platform. The server client 706a is coupled to, by way of example only but not limited to, a gesture library 706b, style sheet library 706c, document display function 706d, report generation function 706e, third party application integration system 706f, near-far communication system 706g and/or optionally a lighting system 706h.
[00347] The gesture library 706b includes data representative of one or more touch screen gestures (e.g. "drag", "grab", "1-finger", "swish", "swipe", etc.) for use by the server client 706a for recognising and processing user touch gestures on a common workspace and/or user workspace, where applicable, displayed on the touch screen of the multi-touch platform and performing the appropriate action/function. The style sheet library 706c includes data representative of one or more style sheets for formatting and displaying the layout of the common workspace, the workspaces of the users (if any), and/or data items (e.g. documents, data and/or files) displayed on the touch screen display of the multi-touch platform. Document display function 706d is configured for displaying data items (e.g. documents, data and/or files) accordingly. Report generation function 706e is configured for finalising work and/or changes performed by users on data items (e.g. reports, finalised files, data or files etc.); the finalised work may be placed in the common workspace of the multi-touch platform to allow all users to verify and finalise the report or finalised document.
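By way of illustration only, the gesture library and its use by the server client could be organised roughly as in the TypeScript sketch below. The registry shape and handler signature are assumptions introduced here for illustration, not the specification's own API.

```typescript
// Hypothetical sketch: a gesture library mapping gesture names to handlers that
// the server client invokes once a touch trace has been recognised.

type GestureName = "drag" | "grab" | "1-finger" | "swish" | "swipe";

interface GestureEvent {
  user: string;
  itemId?: string;                       // data item the gesture acted on, if any
  direction?: { x: number; y: number };  // direction of the gesture, if relevant
}

type GestureHandler = (e: GestureEvent) => void;

class GestureLibrary {
  private handlers = new Map<GestureName, GestureHandler>();

  register(name: GestureName, handler: GestureHandler): void {
    this.handlers.set(name, handler);
  }

  dispatch(name: GestureName, e: GestureEvent): void {
    const handler = this.handlers.get(name);
    if (handler) handler(e);
    else console.warn(`no handler registered for gesture "${name}"`);
  }
}

// Example: the server client wires two gestures to platform actions.
const gestures = new GestureLibrary();
gestures.register("grab", e => console.log(`${e.user} grabbed ${e.itemId}`));
gestures.register("swish", e => console.log(`${e.user} swished ${e.itemId} towards`, e.direction));
gestures.dispatch("swish", { user: "614b", itemId: "doc-17", direction: { x: 1, y: 0 } });
```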
[00348] The third party application integration system 706f is configured to integrate third party applications into the common workspaces and/or user workspaces (if any) and display functions 706d of the multi-touch platform, allowing the users to operate the third party applications for working on and/or displaying data items (e.g. data, documents, files etc.). Third party applications may include, by way of example only but are not limited to, video players, audio players, graphics and/or drawing programs, word processors, document viewers and annotators, or any other application that users may require during a communication session on the multi-touch platform.
[00349] Users may initiate or join a communication session using the near-far communication reader function 706g, which interacts with locator client(s) 710 of mobile devices or wearable devices for login to the multi-touch platform. This may be based on the location of the user relative to the platform, whereby the user may be automatically logged in and authenticated by the server client 706a of the multi-touch platform. As an option, the lighting function 706h is configured for indicating the status of the multi-touch platform and/or the user placement or place settings of users around the multi-touch platform.
[00350] The uploader client 704 (e.g. server multi-touch platform application and/or interface 308b) is a software component that may be installed on a shared and/or secure server. The server may further be implemented as a cloud based server (e.g. a centralised and/or distributed server system comprising one or more servers) and/or as a client centralised server system/network. As an example, the uploader client 704 may be stored in a computer readable medium of the server as software, code or computer instructions, which when executed on a processor or processor(s) associated with the server, causes the processor or processor(s) to implement one or more functions of the multi-user information sharing system 700. For example, the uploader client 704 configures the server to allow data items (e.g. data/files) to be uploaded to/from the common client 706 of a multi-touch platform or server, or to/from one or more mobile device(s) connected to the server or multi-touch platform (e.g. connected via the Internet or any other communication network to the shared server). The uploader client 704 may implement the functionalities and/or methods of a server or server apparatus as described herein and/or as described with reference to any of figures 1a-10.
[00351] In this example, the uploader client 704 includes a server system function 704a (e.g. HTML5 app) for controlling interaction, connection to, and/or transfer of data items (e.g. files/data) to/from the common client 706 of the multi-touch platform via the server client 706a.
The server system function 704a is coupled to the connection function 704b, file transfer function 704c, and display data function 704d. Connection function 704b interacts with the server client 706a for authenticating and logging in users to a communication session of the multi-touch platform and for controlling the transfer and upload of data items (e.g. files/data) to/from the multi-touch platform and other mobile devices and multi-touch platforms etc.

[00352] The file transfer function 704c is used for selecting and sending one or more data items (e.g. files, data, and/or documents) to the multi-touch platform once a communication session has been initiated. For example, the user initiating the communication session may have a set of data items (e.g. a plurality of files) uploaded to the multi-touch platform for working on by users during the communication session. In another example, other users joining the communication session may also upload one or more data items (e.g. files and/or data) to the multi-touch platform for use and/or working on during the communication session. The display data function 704d may be used to transfer display information from the multi-touch platform to other mobile devices and/or other multi-touch platforms to control the display of data items (e.g. data, files, documents and the like), icons and indicators created, used, updated and maintained during the communication session.
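By way of illustration only, the division of the uploader client into connection, file transfer and display data functions might be expressed as in the TypeScript sketch below. The interfaces and the startSession helper are illustrative assumptions, not the specification's own API.

```typescript
// Hypothetical sketch: the three co-operating functions of the uploader client.

interface DataItem { id: string; name: string; bytes: Uint8Array; }

interface ConnectionFunction {
  authenticate(username: string, password: string): Promise<boolean>;
}

interface FileTransferFunction {
  // Upload a set of data items to the multi-touch platform for a session.
  upload(sessionId: string, items: DataItem[]): Promise<void>;
}

interface DisplayDataFunction {
  // Forward display information (icons, indicators, layout) to other clients.
  broadcast(sessionId: string, displayState: object): Promise<void>;
}

class UploaderClient {
  constructor(
    private connection: ConnectionFunction,
    private fileTransfer: FileTransferFunction,
    private displayData: DisplayDataFunction,
  ) {}

  // A user initiating a session uploads their prepared set of files.
  async startSession(sessionId: string, user: string, password: string, items: DataItem[]) {
    if (!(await this.connection.authenticate(user, password))) {
      throw new Error("authentication failed");
    }
    await this.fileTransfer.upload(sessionId, items);
    await this.displayData.broadcast(sessionId, { uploaded: items.map(i => i.name) });
  }
}

// Example with trivial in-memory implementations.
const client = new UploaderClient(
  { authenticate: async () => true },
  { upload: async (s, items) => console.log(`uploaded ${items.length} item(s) to ${s}`) },
  { broadcast: async (s, state) => console.log(`display state for ${s}:`, state) },
);
client
  .startSession("session-1", "alice", "pw", [{ id: "f1", name: "agenda.pdf", bytes: new Uint8Array() }])
  .then(() => console.log("session started"));
```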
[00353] The mobile client 708 (e.g. mobile multi-touch platform application and/or interface 410b) is a software component that may be installed on a touch based mobile device. For example, the mobile client 708 may be stored in a computer readable medium of the mobile device as software, code or computer instructions, which when executed on a processor or processor(s) associated with the mobile device, causes the processor or processor(s) to implement one or more functions of mobile devices in the multi-user information sharing system 700. For example, the mobile client 708 may configure the mobile device to allow data items (e.g. data and/or files) (and annotations) to be transferred, using "swipe" or "swish" type touch gestures, to/from the common client 706 of the multi-touch platform during a communication session.
[00354] The mobile client 708 may also configure the mobile device to allow data items (e.g. data and/or files) to be worked on by the user of the mobile client 708, and to display data items (e.g. data and/or files) presented by other users during a communication session. The mobile client 708 may also be used to operate the mobile device and for interacting with the multi-touch platform (e.g. via common client 706) and/or server apparatus (e.g. via uploader client 704) and other mobile devices and other multi-touch platforms etc. The mobile client 708 may implement the functionalities and/or methods of mobile devices as described herein and/or as described with reference to any of figures 1a-10.
[00355] In this example, the mobile client 708 includes a client application (e.g. an Xcode app) 708a for controlling interaction, connection to, and/or transfer of data items (e.g. files/data) to/from the common client 706 of the multi-touch platform and the mobile device. The client application 708a may also control the operation of the mobile client 708 and mobile device during a communication session. The client application 708a is coupled to the connection function 708b, file transfer function 708c, display data function 708d, and a memory unit or storage unit 708e for data items (e.g. data, documents, and/or files etc.). Connection function 708b interacts with the server client 706a for authenticating and logging in users to a communication session of the multi-touch platform and for controlling the transfer and upload of data items (e.g. files/data) to/from the multi-touch platform and other mobile devices and multi-touch platforms etc.

[00356] The file transfer function 708c is used for selecting and sending one or more data items (e.g. files, data, and/or documents) to the multi-touch platform once a communication session has been initiated. For example, the user of the mobile device may select and transfer a data item (e.g. one or more files, data and/or documents) to another user or multi-touch platform. This may be performed by, for example, selecting one or more data item(s) with a "1-finger touch" or a selection function, in combination with a "swish" type touch gesture in the direction of the other user or multi-touch platform indicated on the touch screen of the mobile device, to transfer the selected data item(s) etc. to that other user or multi-touch platform. Similarly, a user of a multi-touch platform or other mobile device may select a data item (e.g. one or more files, data or documents etc.) for transfer to the mobile device. This may be performed, for example, by another user selecting one or more data item(s) with a "1-finger touch" or a selection function, in combination with a "swish" type touch gesture in the direction of the user of the mobile device indicated to the other user on the corresponding touch screen of the multi-touch platform or other mobile device, to transfer the selected data item(s) etc. to the user of the mobile device. The display data function 708d may be used for receiving display information from the multi-touch platform, server, or other mobile devices to control the display of data items (e.g. data, files, documents), icons and indicators created, used, updated and maintained during the communication session.
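By way of illustration only, resolving which on-screen indicator a "swish" from a client points at, before handing the selected item to the file transfer function, could look roughly like the TypeScript sketch below. The indicator layout, the angular tolerance and the function names are assumptions introduced here for illustration.

```typescript
// Hypothetical sketch: resolve the target of a "swish" from the selected item's
// position and the gesture direction (e.g. another user or a platform icon).

interface Indicator { targetId: string; x: number; y: number; }  // icon positions on screen

function resolveSwishTarget(
  from: { x: number; y: number },
  direction: { x: number; y: number },
  indicators: Indicator[],
): Indicator | undefined {
  // Pick the indicator whose bearing from the start point best matches the swish direction.
  const swishAngle = Math.atan2(direction.y, direction.x);
  let best: { indicator: Indicator; diff: number } | undefined;
  for (const ind of indicators) {
    const angle = Math.atan2(ind.y - from.y, ind.x - from.x);
    let diff = Math.abs(angle - swishAngle);
    if (diff > Math.PI) diff = 2 * Math.PI - diff;
    if (!best || diff < best.diff) best = { indicator: ind, diff };
  }
  // Only accept a reasonably aligned indicator (assumed 30 degree tolerance).
  return best && best.diff < Math.PI / 6 ? best.indicator : undefined;
}

// Example: a swish up and to the right lands on the multi-touch platform icon.
const indicators: Indicator[] = [
  { targetId: "platform-606a", x: 300, y: 40 },
  { targetId: "user-614d", x: 40, y: 40 },
];
const target = resolveSwishTarget({ x: 160, y: 480 }, { x: 0.3, y: -0.9 }, indicators);
console.log(target?.targetId);  // "platform-606a" for this geometry
```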
[00357] The locator client 710 is a software component that may be installed on a mobile device or wearable device etc. For example, the locator client 710 may be stored in a computer readable medium of the mobile device or wearable device as software, code or computer instructions, which when executed on a processor or processor(s) associated with the mobile device or wearable, causes the processor or processor(s) to implement one or more functions for logging the user into the multi-user information sharing system 700 for initiating or joining a communication session.
[00358] The locator client 710 may configure the mobile device or wearable to automatically log the user of the mobile device into a multi-touch platform. The locator client 710 may manage the user IDs, their locations, and upload the main user preferences to the multi-touch platform. The locator client 710 may also be configured to initiate the uploader client 704, e.g. via the server client 706a of the common client 706, to automatically upload the main user's documents and data items (e.g. files and/or data) when a user initiates or joins a communication session. The locator client 710 may implement the functionalities and/or methods of mobile devices and/or wearable devices logging users into the multi-touch platform as described herein and/or as described with reference to any of figures 1a-10.
[00359] In this example, the locator client 710 may include a client application 710a (e.g. a wearable application for a wearable device) for detecting when the user is near to a multi-touch platform and logging the user into the multi-touch platform. The client application 710a is coupled to a near-far communication system (e.g. an NFC sender) for interacting with the near-far communication system 706g of the common client 706 of the multi-touch platform to log in and connect the user to the multi-touch platform for initiating or joining a communication session.
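By way of illustration only, proximity-based login and logout as managed by a locator client is sketched below in TypeScript. The signal-strength thresholds and callback names are assumptions introduced here, not values or interfaces taken from the specification.

```typescript
// Hypothetical sketch: a locator client logs the wearer in when the platform's
// near-far beacon is strong enough, and out again when it fades.

const RSSI_LOGIN_THRESHOLD = -60;   // assumed signal strength for "near the platform"
const RSSI_LOGOUT_THRESHOLD = -75;  // assumed level treated as "out of range"

class LocatorClient {
  private loggedIn = false;

  constructor(
    private userId: string,
    private login: (userId: string) => void,
    private logout: (userId: string) => void,
  ) {}

  // Called whenever a new beacon reading from the multi-touch platform arrives.
  onBeacon(rssi: number): void {
    if (!this.loggedIn && rssi >= RSSI_LOGIN_THRESHOLD) {
      this.loggedIn = true;
      this.login(this.userId);          // e.g. triggers upload of user preferences
    } else if (this.loggedIn && rssi <= RSSI_LOGOUT_THRESHOLD) {
      this.loggedIn = false;
      this.logout(this.userId);         // dropping out of range logs the user out
    }
  }
}

// Example: the wearer approaches the platform and later walks away.
const locator = new LocatorClient(
  "alice",
  id => console.log(`${id} logged in`),
  id => console.log(`${id} logged out`),
);
[-90, -70, -55, -58, -80].forEach(rssi => locator.onBeacon(rssi));
```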
[00360] Figures 8a(i) and (ii) are schematic diagrams showing a side elevation and a plan elevation of an adjustable multi-touch platform 800. Figure 8a(i) shows the side elevation of the adjustable multi-touch platform 800 when the adjustable multi-touch platform 800 is viewed in the direction of arrow A indicated in the plan elevation of figure 8a(ii). Referring to both Figures 8a(i) and (ii), the adjustable multi-touch platform 800 includes a multi-touch platform 102a with a multi-touch platform table top 103 including a display device 108 such as a touch screen display that displays a common workspace 109. The multi-touch platform table top 103 is mounted to a base 802 that includes an adjustable mechanism (e.g. a flip and height adjustable mechanism) for translating the multi-touch platform table top 103 and hence the display device 108 from a first position (e.g. P1) to a second position (e.g. P2) and/or for "rotating" the multi-touch platform table top 103 and hence the display device 108 around pivot axis 804 from the second position (e.g. P2) to a third or fourth position (e.g. P3 or P4).
The adjustable mechanism may be motorised to assist in the adjustment of the multi-touch platform 102a.
[00361] For example, the multi-touch platform 102a further includes an adjustable mechanism for rotating the display device 108, such as a touch screen associated with the multi-touch platform 102a, around a pivot axis 804 in the plane of the multi-touch platform table top 103, and hence the plane of the touch screen of the display device 108, from a first position to a second position. The platform 102a may detect a user input from a user to move the display device 108 associated with the multi-touch platform 102a from the first position to the second position, and the adjustable mechanism is activated to rotate the display device 108 from the first position to the second position around the axis 804 by rotating the multi-touch platform table top 103.
[00362] Using the adjustable mechanism, the multi-touch platform (e.g. the table top or table surface 103 of the multi-touch platform) is height adjustable (e.g. from P1 to P2) to allow sitting and standing user positions 814b and 814a, respectively. Additionally or alternatively, the multi-touch platform table surface 103 can be tilted ("flipped" or rotated) along pivot axis 804 (e.g. up to 90 degrees) in the plane of the multi-touch platform table top 103 from the first or second position to a third position or a fourth position (e.g. from P1/P2 to P3 or P4). This allows users 814a-814c to work and display information in various modes, such as, by way of example but not limited to, a table top mode (e.g. positions P1 and/or P2), a console mode (e.g. position P3), and/or a wall-mounted display mode (e.g. P4). As an option, all or part of the multi-touch platform table top 103 can be tilted ("flipped" or rotated) as required by the user(s). For example, seated users 814b may want to work on a horizontal component of the multi-touch platform table top 103 while simultaneously presenting their work to standing users 814a or 814c.
[00363] Figures 8b(i) and (ii) are perspective diagrams showing another example adjustable multi-touch platform 820 with a multi-touch platform 102a mounted on base 802. Referring to both figures 8b(i) and (ii), in this example, the pivot axis 804 in the plane of the multi-touch platform table top 103 (e.g. 103a and 103b) and hence the display device 108 (e.g. 108a and 108b) divides both the multi-touch platform table top 103 and the display device 108 into a first portion 103a and 108a and a second portion 103b and 108b. The adjustable mechanism for rotating the multi-touch platform table top 103 rotates the first portion 103a of the multi-touch platform table top 103 from a first position (e.g. a horizontal position as shown in figure 8b(i)) to a second position (or third or fourth position) (e.g. a tilted or vertical position as shown in figure 8b(ii)) around the pivot axis 804. The second portion of the display device 108b remains in the first position (e.g. a horizontal position as shown in figure 8b(ii)). This means that only the first portion of the display device 108a is rotated from the first position to the second position (or third or fourth position) (e.g. tilted position or vertical position) around the pivot axis 804. This enables a user to view and/or work on the first portion of the display device 108a in the second position (or third or fourth position) (e.g. tilted position or vertical position) while at the same time viewing and/or working on the second portion of the display device 108b in the first position (e.g. a horizontal position).
[00364] Figure 8c is a schematic diagram showing a control mechanism 804 for the adjustable mechanism 802 of the adjustable multi-touch platform 800 or 820 of figures 8a(i) and (ii) and/or 8b(i) and (ii). The adjustable mechanism 802 (e.g. a motorised flip and height mechanism (FHM)) includes a control mechanism 804 that includes one or more switches 806 (e.g. an external finger switch, a push button panel, or even a switch panel displayed on display device 108a), an input/output device 808, a flip mechanism 810, and a height mechanism 812. The input/output device 808 is connected to the switch(es) 806, the flip mechanism 810, and the height mechanism 812.
[00365] In operation, the control mechanism 804 is operated by a user selecting/pressing/switching one or more of the switch(es) (e.g. a side mounted push button panel), which is detected and decoded by the input/output device 808. Depending on the switch(es) selected/pressed/switched, the I/O device 808 instructs the flip mechanism 810 and/or the height mechanism 812 to adjust the tilt (e.g. "flip" or rotation) of the multi-touch platform 800 or 820 and/or the translation (e.g. height) of the multi-touch platform 800 or 820 to that desired by the user.
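By way of illustration only, the switch-driven control path described above might be modelled as in the TypeScript sketch below. The switch identifiers, angle and height limits, and the step increments are illustrative assumptions and are not taken from the specification.

```typescript
// Hypothetical sketch: the input/output device decodes switch presses and drives
// the flip and height mechanisms accordingly.

type SwitchId = "flip-up" | "flip-down" | "raise" | "lower";

class FlipMechanism {
  angleDegrees = 0;                       // 0 = table top mode
  rotateBy(delta: number): void {
    // Assumed range: 0 (horizontal) to 90 degrees (wall-mounted display mode).
    this.angleDegrees = Math.min(90, Math.max(0, this.angleDegrees + delta));
  }
}

class HeightMechanism {
  heightMm = 750;                         // assumed sitting height
  moveBy(delta: number): void {
    // Assumed travel between sitting (750 mm) and standing (1100 mm) positions.
    this.heightMm = Math.min(1100, Math.max(750, this.heightMm + delta));
  }
}

class InputOutputDevice {
  constructor(private flip: FlipMechanism, private height: HeightMechanism) {}

  // Decode a switch press and instruct the appropriate mechanism.
  onSwitch(id: SwitchId): void {
    switch (id) {
      case "flip-up":   this.flip.rotateBy(+15); break;
      case "flip-down": this.flip.rotateBy(-15); break;
      case "raise":     this.height.moveBy(+50); break;
      case "lower":     this.height.moveBy(-50); break;
    }
  }
}

// Example: the user raises the table and tilts it towards a console position.
const io = new InputOutputDevice(new FlipMechanism(), new HeightMechanism());
["raise", "raise", "flip-up", "flip-up"].forEach(s => io.onSwitch(s as SwitchId));
```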
[00366] Figures 9 and 10 are schematic diagrams of further example multi-user information sharing systems 900 and 1000. Referring to Figure 9, the multi-user information sharing system 900 includes a multi-touch platform 902 and 904 in communication with a mobile device 906 with a touch screen display. The multi-touch platform 902 includes a multi-touch platform display device 908 for displaying a common workspace. The multi-touch platform 902 may be split into two separate components or devices, a multi-touch platform 902 with display device 908 and a multi-touch platform component 904 that implements the functionality or processing power of the multi-touch platforms as described herein with reference to any of figures 1a-8b. The system 900 is reduced in size compared with other multi-touch platforms, such as multi-touch platforms 102a-102b of figures 1a-1g. As there is less on-screen "real estate" on the display device 908, users may initiate or join a communication session by communicating via the multi-touch platform component 904 with their mobile devices 906 and use their mobile devices 906 during a communication session.
The multi-touch platform 902 of figure 9 is a so-called "mini version" that is smaller than a full size multi-touch platform, e.g. no bigger than a mobile table. The advantages of such a sized multi-touch platform 902 are portability, mobility and light weight, and it may be battery powered.
[00367] Referring to Figure 10, the multi-user information sharing system 1000 includes a multi-touch platform 1002 and 1004 in communication with a mobile device 1006 with a touch screen display. The system 1000 is reduced in size compared with other multi-touch platforms, such as multi-touch platforms 102a-102b of figures 1a-1g. The multi-touch platform 1002 of figure 10 may be a "micro version" that is less than a quarter of the size of the "mini version" multi-touch platform 902 of figure 9, e.g. no bigger than a quarter of a mobile table, where, due to the size of the multi-touch platform 1002, the common touch screen is replaced by user indication lights 1008 (e.g. i1-i6).
[00368] Given there is no common touch screen, the multi-touch platform 1002 may be split into two separate components or devices, a multi-touch platform 1002 with only the indicator lights 1008 indicating where users may sit, and a multi-touch platform component 1004 that implements the functionality or processing of the multi-touch platforms as described herein with reference to any of figures 1a-8b. Instead, users communicate and initiate and/or join communication sessions with the multi-touch platform 1002 using mobile devices 1006. The advantages of such a smaller sized multi-touch platform 1002 are portability, mobility and light weight, and it may be battery powered.
[00369] Using near-field communications, the main user that initiates a communication session can be allowed to log into the system 900 or 1000 and upload their user preferences and data/files for sharing with other users that may join the communication session. The main user may also log off either manually via a common interface or by placing their wearable out of range of the system 900 or 1000 (e.g. dropping out of range automatically logs the user out of the system 900 or 1000). Further users may do the same when joining or leaving a communication session.
[00370] Although the multi-user information sharing systems, multi-touch platforms, servers and mobile devices have been described, by way of example, with reference to figures 1a-10, it is to be appreciated by the skilled person that further modifications and variations of these systems, apparatus, and devices may be contemplated. For example, the multi-touch platform(s) as described with reference to figures 1a-10 may be divided into separate components or systems depending on the type of communication sessions users may wish to have.
[00371] For example, a mobile multi-user information sharing system may include a screen-less multi-touch platform, that is a multi-touch platform without a display device or touch screen, which operates with touch-based mobile devices, where each user initiates and joins a communication session, and works on the uploaded set of data items (e.g. a plurality of files/data) via their mobile devices. In addition, the screen-less multi-touch platform may further include the server apparatus or other computing apparatus or device (e.g. a desktop server like device) that allows data to be directly uploaded and/or "swiped" or "swished" to other user devices for the communication session. The multi-touch platform may implement the multi-touch platform common client and also the server uploader client functions as described with reference to figures 2a-7.
[00372] For example, a mobile multi-user information sharing system may include a screen-less multi-touch platform, that is a multi-touch platform without a display device or touch screen, which operates with touch-based mobile devices, where each user initiates and joins a communication session, and works on the uploaded data items (e.g. a plurality of files/data) via their mobile devices. The screen-less multi-touch platform may be implemented by a server apparatus or other computing apparatus or device (e.g. a desktop server like device), and allows data to be directly swiped to other user devices.
[00373] In another example, the functionality of a multi-touch platform as described with reference to figures 1a-10 may be incorporated into a touch based mobile device, whereby one or more users initiate and join a communication session hosted by the touch based mobile device, in which these users are placed around the touch based mobile device. Other users can join the communication session via their mobile devices. That is, one of the user's mobile devices executes a multi-touch platform common client as described with reference to Figures 2-7, while the other user mobile devices execute a mobile client for joining the communication session controlled by the multi-touch platform common client.
[00374] It is apparent that the multi-touch platforms as described herein with reference to any of figures 1a-10 provide further advantages over existing multi-touch devices used for communicating in boardrooms and meeting rooms, and further inspire, motivate and improve team performance. For example, the multi-user information sharing systems as described with reference to any of figures 1a-10 provide the advantage of enhanced connectivity of multi-touch-platforms (e.g. multi-touch-tables) to mobile and wearable devices via local area network and/or near-field communication protocols. The connectivity allows the platform to: a) identify the number of users; b) identify the location of users physically around the platform; c) recognise main user preferences; and d) allow users to transfer data to and from the platform.
[00375] Other advantages provided by the multi-user information sharing system, multi-touch platform, and/or mobile devices as described herein include the "swish" or "swipe" touch screen gesture (e.g. "swish" or "swipe" finger gesture), which provides a fast and efficient method of transferring data or files between users and/or the multi-touch platform and mobile devices during a communication session. This results in a more natural method of handing one or more files between users.
[00376] Users are able to transfer data via a swish or swipe finger gesture (e.g. they can "swipe" or "swish" data items (e.g. files)) to and from a common multi-touch-platform from other multiple touch interfaces (e.g. mobile devices, wearables, touch-tables). As well, once the data is presented ("swished" or "swiped") to the multi-touch-platform, other gestures can be used by a user to "grab" and orient the data item (e.g. file) for reading (i.e. like a paper folder being passed around a board/meeting table). Users can also "swish" or "swipe" the data item to their mobile device for working on in their private space (e.g. so-called "pull-back" of files to the mobile device), and then "swish" or "swipe" any worked on documents back to the multi-touch-platform.
[00377] Other touch screen gestures can be defined for providing efficient manipulation of data items (e.g. files) in the user workspaces. For example, a "Grab & twist" touch screen gesture (finger gesture) may be used within the workspaces or common workspaces (e.g. graphical user interface(s)) of the multi-touch platform or mobile devices, in which data is moved into the workspace and/or presented in an orientation that is readable to one or more users.
[00378] Some further advantages provided by the present invention include viewing plane adjustment, where the multi-touch-platform can be flipped or rotated through varying degrees of rotation (e.g. from horizontal through to vertical positions), and height adjustment, where the multi-touch platform can be used with users in sitting, standing, or both standing and sitting positions. In addition, having a variable viewing plane provides the advantage of enhancing the way of presenting information in a collaborative fashion to both seated and standing audiences and/or users using the same device.
[00379] Further advantages provided by the present invention include allowing users to interact with the platform via their mobile and/or wearable devices, allowing users an interface where they can freely orientate information for their own personal use without disturbing others, and allowing users to view information either horizontally, tilted, or vertically.
[00380] Other variations or modifications may further include implementing the multi-touch platform using a non-rectangular "wrap-around" structure that is customisable based on multi-touch platform dimensions, shape and orientation, and a flip mechanism for converting a multi-touch-platform from a horizontal to a vertical position. The mechanism as described herein also allows for height adjustability, where the platform can be used with users sitting or standing.
[00381] The present invention provides a multi-user information sharing system that includes a multi-touch interface and/or touch screen mobile devices (e.g. multi-touch platforms and/or mobile devices) that display a common workspace for creating a collaborative working environment (e.g. communication session) that can also be connected to multiple users with touch-based mobile interfaces. The multi-touch-interface displays a common workspace that is a common platform for sharing, examining, and editing digital information (e.g. data items).
It also serves as a platform to send data items (e.g. files/documents and/or other data) to connected mobile interfaces. It is a common platform for displaying data sent from connected mobile devices. The system also allows for working data items (e.g. documents or files) to be sent to a common screen (e.g. the common workspace) and/or, where applicable, individual screens (e.g. user workspaces of a multi-touch platform or touch screens of mobile devices).
[00382] The term 'computer' is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realise that such processing capabilities are incorporated into many different devices and therefore the term 'computer' includes PCs, servers, network nodes and other network elements, multi-touch platforms, mobile telephones, personal digital assistants and many other devices.
[00383] Those skilled in the art will realise that storage devices utilised to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program.
Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realise that by utilising conventional techniques known to those skilled in the art that all, or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
[00384] Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person. It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages.
[00385] Any reference to 'an' item refers to one or more of those items. The term 'comprising' is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
[00386] The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
[00387] Although embodiments of the present invention have been illustrated in conjunction with the accompanying drawings and described in the foregoing detailed description, it should be appreciated that the invention is not limited to the embodiments disclosed. Therefore, the present invention should be understood to be capable of numerous rearrangements, modifications, alternatives and substitutions without departing from the spirit of the invention as set forth and recited by the following claims.
[00388] It will be understood that the above description of a preferred embodiment is given by way of example only and that various modifications may be made by those skilled in the art. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this invention.

Claims (53)

1. A method for a plurality of users to collaborate on a set of data items in a communication session of a multi-touch platform, the multi-touch platform comprising a touch screen configured to display a common workspace for use by multiple users at the multi-touch platform, the method, performed by the platform, comprising: displaying, for each of the users at the multi-touch platform, a user interface in the common workspace, the user interface indicating where each user is positioned at the platform and comprising selectable options for each user to access and work on at least one of the data items; receiving, by the touch screen, one or more touch screen gestures in the common workspace from users positioned at the multi-touch platform, wherein the one or more touch screen gestures are associated with performing one or more operations on at least one of the data items; processing each of the one or more data items according to the operations of each associated received touch screen gesture; and displaying the data items in the common workspace based on the corresponding touch screen gesture, wherein the one or more users use the common workspace to perform further work on the set of data items.
  2. 2. A method as claimed in claim 1, further comprising, in response to a corresponding touch screen gesture associated with one or more data items, transferring one or more data items from the multi-touch platform to a device associated with a user of the plurality of users.
  3. 3. A method as claimed in claims 1 or 2, further comprising receiving, by the multi-touch platform from a device associated with one of the plurality of users, one or more further data items for the one or more users to work on in the common workspace.
  4. 4. A method as claimed in any of claims 1 to 3, wherein the plurality of users comprises one or more local users positioned at the multi-touch platform and one or more remote users remotely accessing the communication session via one or more devices.
  5. A method as claimed in any of claims 2 to 4, wherein the device associated with the user is a mobile device of the user, wherein the mobile device is configured to access the communication session and comprises a mobile touch screen configured for displaying a user workspace associated with the communication session, the user workspace configured for the user to perform one or more further operations on one or more of the data items.
  6. 6 A method as claimed in any of claims 2 to 4, wherein the device associated with the user is a further multi-touch platform associated with the user, the further multi-touch platform including a touch screen configured to display a common workspace and a user interface associated with that user in the common workspace of the further multi-touch platform.
  7. 7 A method as claimed in any of claims 1 to 6, wherein at least one of the selectable options of the user interface is for executing an application for accessing at least one of the data items; detecting one or more users selecting said at least one selectable option and executing the application associated with said at least one selectable option in the common workspace.
  8. 8 A method as claimed in any of claims 1 to 7, displaying, for each of the users that are remote to the multi-touch platform, a user interface in the common workspace, the user interface for the remote user indicating the status of the remote user.
  9. 9. A method as claimed in any one of claims 1 to 8, further comprising: detecting, on the touch screen associated with the multi-touch platform, one of the touch screen gestures to be a touch screen gesture for transferring one or more of the data items from a first user of the multiple users to a second user of the multiple users; determining a direction of the detected touch screen gesture; identifying the direction of the touch screen gesture is in a direction associated with the second user; and transferring the one or more data items to the second user during the communication session.
  10. A method as claimed in claim 9, wherein detecting the touch screen gesture for transferring the one or more data items further comprises detecting the first user touching the touch screen with an object of the first user in the vicinity of said one or more data items and, prior to the first user removing the object from the touch screen, detecting the object moving in a direction associated with the second user.
  11. 11. A method as claimed in claim 9 or 10, wherein determining the direction associated with the second user comprises determining, from the touch screen gesture, a vector comprising a direction pointing towards the second user, wherein the second user is located at the multi-touch platform.
  12. 12. A method as claimed in any of claims 9 to 11, wherein determining the direction associated with the second user further comprises determining the direction of the touch screen gesture is towards an indicator associated with the second user.
  13. 13. A method as claimed in any of claims 9 to 12 when dependent on claim 8, wherein the direction associated with the second user is based on releasing the object from the touch screen when the object is in the vicinity of an indicator associated with the second user.
  14. 14. A method as claimed in claim 9 or 13, wherein the object intersects the indicator associated with the second user.
  15. 15. A method as claimed in any of claims 11 to 14, wherein the indicator associated with the second user is a user interface on the touch screen representing the second user.
  16. 16 A method as claimed in any of claims 9 to 15, the multi-touch platform comprising a communication interface for communicating with a device associated with the second user, wherein transferring the one or more data items to the second user further comprises transferring the one or more data items via the communication interface to the device associated with the second user.
  17. 17. A method as claimed in claim 16, wherein the first user is also the second user with a device external to the multi-touch platform, the method comprising: determining the direction of the detected touch screen gesture comprises determining the direction of the detected touch screen gesture is a direction associated with the first user; and transferring the one or more data items further comprises transferring the one or more data items to the device of the first user.
  18. 18. A method as claimed in any of claims 1 to 17, further comprising receiving a request from a user to join the communication session, authenticating the user prior to joining the communication session, and, on authenticating the user, allocating and displaying a user interface associated with the user on the common workspace.
  19. 19. A method as claimed in claim 18, wherein the touch screen is further configured to display a common graphical object, wherein authenticating further comprises: detecting a user using a touch screen gesture to move the common graphical object to a position on the common workspace; authenticating the user via said common workspace; allocating a user interface to the authenticated user for the communication session; and displaying the user interface in the common workspace.
  20. 20. A method as claimed in claims 18 or 19, wherein the multi-touch platform comprises a plurality of designated locations spaced around the periphery of the multi-touch platform, the method further comprising: determining, from the request, that the authenticated user requests to be placed at a designated location, allocating one of the designated locations to the authenticated user, and displaying the user interface at a position in the common workspace in the vicinity of the designated location.
  21. 21. A method as claimed in claims 18 to 20, the method further comprising: determining, from the request, that the authenticated user selects a location by moving the common graphical object to the position in the common workspace, allocating a user interface to the authenticated user, and displaying the user interface at the vicinity of the selected position in the common workspace.
22. A method as claimed in any of claims 18 to 21, further comprising: determining that the user requests to join the communication session via a mobile device external to the multi-touch platform; associating the mobile device address with the authenticated user and the user interface associated with the authenticated user; and displaying the user interface for the user of the mobile device in the common workspace, wherein the user interface for the user of the mobile device indicates the status of the remote user.
23. A method as claimed in any of claims 16 to 19, further comprising: determining that the user requests to join the communication session via another multi-touch platform; associating the address of the other multi-touch platform with the authenticated user and the user interface associated with the user; and displaying the user interface in the common workspace, wherein the user interface for the user indicates the status of the users at the other multi-touch platform.
24. A method as claimed in any preceding claim, further comprising: initiating the communication session on request by a first user of the plurality of users for starting the collaboration using the multi-touch platform; and providing users of the communication session access to the set of data items via the multi-touch platform, wherein the set of data items are associated with the first user.
25. A method as claimed in any preceding claim, wherein the multi-touch platform is configured to wirelessly communicate with one or more devices.
26. A method as claimed in any preceding claim, wherein at least one of the plurality of users has a mobile device, the method further comprising: detecting that the mobile device of said at least one user is in the presence of the multi-touch platform; and connecting to the mobile device of said at least one user for authenticating said at least one user and logging the at least one user into the multi-touch platform for joining the communication session.
27. A method as claimed in claim 26, wherein the mobile device is a wearable device and the multi-touch platform wirelessly communicates with the mobile device using near field communication signals.
28. A method as claimed in any preceding claim, wherein the multi-touch platform further comprises a mechanism for rotating the touch screen associated with the multi-touch platform around an axis in the plane of the touch screen from a first position to a second position, the method further comprising: detecting an input from a user to move the touch screen associated with the multi-touch platform from the first position to the second position; and rotating the touch screen from the first position to the second position around the axis.
29. A method as claimed in claim 28, wherein the axis in the plane of the touch screen divides the touch screen into a first and a second portion, wherein the mechanism for rotating the touch screen associated with the multi-touch platform rotates the first portion of the touch screen from a first position to a second position around the axis.
30. A method as claimed in any preceding claim, further comprising: detecting whether multiple users of the plurality of users are to work on a data item of the set of data items substantially simultaneously; providing each of the multiple users with a copy of the data item; receiving one or more of the copies of the data item worked on by the multiple users; determining the changes between each copy of the data item and the original data item; generating a temporary data item by merging the original data item with the changes made to the one or more copies of the data item; and verifying the changes are acceptable to the multiple users prior to finalising the temporary data item.
31. A method as claimed in any preceding claim, further comprising: detecting whether multiple users of the plurality of users are to work on a data item of the set of data items substantially simultaneously; allowing each of the multiple users to work on the data item serially; and verifying the changes are acceptable to the multiple users prior to finalising the data item.
32. A multi-touch platform comprising a processor, a touch screen, a communications interface and a memory, wherein the processor is connected to the touch screen, the communications interface and the memory, and wherein the memory comprises computer instructions stored thereon which, when executed by the processor, cause the processor to perform the method according to any one of claims 1 to 31.
33. A multi-touch platform according to claim 32, wherein the touch screen is embedded in a table top.
34. A multi-touch platform according to claim 33, further comprising a plurality of designated locations spaced around the periphery of the table top for placing one or more users, wherein each of the plurality of designated locations is identified by an indicator.
35. A multi-touch platform according to claim 33 or 34, wherein the table top, in plan view, comprises a rectangular shape with chamfered corners.
36. A computer readable medium comprising computer instructions stored thereon which, when executed by a processor, cause the processor to perform the method according to any one of claims 1 to 31.
37. A method for operating a server apparatus in communication with a multi-touch platform according to any one of claims 32 to 35, wherein the server apparatus comprises storage for storing the set of data items.
38. A method as claimed in claim 37, further comprising transferring the one or more data items from a first user to a second user during the communication session.
39. A method as claimed in claim 38, wherein the second user is one of the users with a mobile device external to the multi-touch platform, and transferring the one or more data items to the second user further comprises transferring the one or more data items to the mobile device of the second user.
40. A method as claimed in claim 38 or 39, wherein the first user is one of the users with a mobile device external to the multi-touch platform, and wherein transferring the one or more data items further comprises transferring the one or more data items to the mobile device of the first user.
41. A method as claimed in any of claims 37 to 40, further comprising authenticating one or more users of the plurality of users requesting to join the communication session.
42. A method as claimed in any of claims 37 to 41, further comprising: receiving a request to upload the set of data items to the multi-touch platform for initiating the communication session on request by a first user of the plurality of users; and uploading to the multi-touch platform the set of data items, wherein the set of data items are associated with the first user.
43. A server comprising a processor, a communications interface and a memory, wherein the processor is connected to the communications interface and the memory, and wherein the memory comprises computer instructions stored thereon which, when executed by the processor, cause the processor to perform the method according to any one of claims 37 to 42.
44. A computer readable medium comprising computer instructions stored thereon which, when executed by a processor, cause the processor to perform the method according to any one of claims 37 to 42.
45. A method for a user with a mobile device to collaborate with a plurality of users on a set of data items in a communication session with a multi-touch platform according to any one of claims 32 to 35, the mobile device comprising a touch screen configured to display a user workspace and a communication interface for communicating with the multi-touch platform, the method, performed by the mobile device, comprising: detecting a touch screen gesture on the user workspace associated with one or more of the data items from the user, wherein the touch screen gesture is for transferring said one or more data items from the user to one of the plurality of users or the multi-touch platform; determining a direction of the detected touch screen gesture; identifying that the direction of the touch screen gesture is associated with one of the plurality of users or the multi-touch platform; and transferring the one or more data items from the mobile device to one of the plurality of users or the multi-touch platform during the communication session.
46. A method as claimed in claim 45, wherein detecting the touch screen gesture for transferring one or more data items further comprises detecting the user touching the touch screen with an object of the user in the vicinity of said one or more data items and, prior to the user releasing the object from the touch screen, detecting the object moving in a direction associated with one of the plurality of users or the multi-touch platform.
47. A method as claimed in claim 45 or 46, further comprising sending a request to the multi-touch platform requesting that the user join the communication session.
48. A method as claimed in any of claims 45 to 47, the method further comprising: notifying the multi-touch platform that the user of the mobile device is in the presence of the multi-touch platform; and connecting to the multi-touch platform for authenticating the user of the mobile device and logging the user into the multi-touch platform to join the communication session.
49. A method as claimed in any of claims 45 to 48, wherein the mobile device wirelessly communicates with the multi-touch platform using near field communication signals.
50. A method as claimed in any of claims 45 to 49, further comprising: receiving a copy of a data item in response to the multi-touch platform detecting whether multiple users of the plurality of users are to work on a data item of the plurality of data items substantially simultaneously; and transmitting the copy of the data item worked on by the user of the mobile device to the multi-touch platform for generating a data item comprising the merged content of the original data item and the copy of the data item worked on by the multiple users.
51. A method as claimed in any of claims 45 to 50, further comprising: receiving a copy of a data item in response to the multi-touch platform detecting whether multiple users of the plurality of users are to work on a data item of the plurality of data items substantially simultaneously, wherein each of the multiple users is allowed to work on the data item serially; transmitting the copy of the data item to the multi-touch platform after the user has worked on the copy of the data item; and verifying the changes are acceptable to the user and notifying the multiple users prior to the multi-touch platform finalising the data item.
52. A mobile device comprising a processor, a communications interface, a memory and a touch screen, wherein the processor is connected to the communications interface, the memory and the touch screen, and wherein the memory comprises computer instructions stored thereon which, when executed by the processor, cause the processor to perform the method according to any one of claims 45 to 51.
53. A computer readable medium comprising computer instructions stored thereon which, when executed by a processor, cause the processor to perform the method according to any one of claims 45 to 51.
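
Claims 9 to 15 above define the hand-over purely in terms of a gesture direction and a user indicator. As a non-authoritative illustration, the TypeScript sketch below shows one way such a direction test could be implemented on the common workspace; the DragGesture and UserIndicator types, the angular tolerance and the coordinate values are assumptions made for the example, not details taken from the specification.

interface Point { x: number; y: number; }

interface UserIndicator {
  userId: string;
  position: Point;        // where the user's interface sits on the common workspace
}

interface DragGesture {
  start: Point;           // where the object/data item was touched
  end: Point;             // where the touch was released
}

// Angle (radians) of the vector from p to q.
function angleOf(p: Point, q: Point): number {
  return Math.atan2(q.y - p.y, q.x - p.x);
}

// Smallest absolute difference between two angles.
function angularDistance(a: number, b: number): number {
  const d = Math.abs(a - b) % (2 * Math.PI);
  return d > Math.PI ? 2 * Math.PI - d : d;
}

// Returns the indicator whose direction from the gesture origin is closest to
// the gesture direction, or null if nothing lies within the tolerance cone.
function resolveTargetUser(
  gesture: DragGesture,
  indicators: UserIndicator[],
  toleranceRad = Math.PI / 8,
): UserIndicator | null {
  const gestureAngle = angleOf(gesture.start, gesture.end);
  let best: UserIndicator | null = null;
  let bestDelta = Infinity;
  for (const ind of indicators) {
    const delta = angularDistance(gestureAngle, angleOf(gesture.start, ind.position));
    if (delta < bestDelta && delta <= toleranceRad) {
      best = ind;
      bestDelta = delta;
    }
  }
  return best;
}

// Example: a flick from the centre towards the right edge targets user "bob".
const target = resolveTargetUser(
  { start: { x: 500, y: 400 }, end: { x: 620, y: 405 } },
  [
    { userId: "alice", position: { x: 500, y: 50 } },
    { userId: "bob", position: { x: 950, y: 400 } },
  ],
);
console.log(target?.userId); // "bob"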
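
Claims 18 to 21 describe joining the session by moving a common graphical object, authenticating, and allocating a user interface at either a designated location or the drop position. A minimal sketch of that flow follows; CommonWorkspace, JoinRequest and the token check are hypothetical stand-ins, and a real platform would delegate authentication to an identity service.

interface Point { x: number; y: number; }

interface JoinRequest {
  credentials: { username: string; token: string };
  dropPosition: Point;            // where the common graphical object was released
  useDesignatedLocation: boolean; // true if the user asked to sit at a marked seat
}

interface UserInterfacePanel {
  userId: string;
  position: Point;
}

class CommonWorkspace {
  private panels: UserInterfacePanel[] = [];

  constructor(private designatedLocations: Point[]) {}

  // Placeholder check; a real platform would verify against an identity service.
  private authenticate(c: JoinRequest["credentials"]): boolean {
    return c.token.length > 0;
  }

  join(request: JoinRequest): UserInterfacePanel | null {
    if (!this.authenticate(request.credentials)) return null;

    const position = request.useDesignatedLocation
      ? this.nearestFreeDesignatedLocation(request.dropPosition)
      : request.dropPosition;

    const panel = { userId: request.credentials.username, position };
    this.panels.push(panel); // "displaying" the interface in the common workspace
    return panel;
  }

  private nearestFreeDesignatedLocation(from: Point): Point {
    const taken = new Set(this.panels.map(p => `${p.position.x},${p.position.y}`));
    const free = this.designatedLocations.filter(
      loc => !taken.has(`${loc.x},${loc.y}`),
    );
    free.sort((a, b) =>
      Math.hypot(a.x - from.x, a.y - from.y) - Math.hypot(b.x - from.x, b.y - from.y),
    );
    return free[0] ?? from; // fall back to the drop position if all seats are taken
  }
}

// Example: "alice" drops the common object near the first designated seat.
const workspace = new CommonWorkspace([{ x: 100, y: 50 }, { x: 900, y: 50 }]);
workspace.join({
  credentials: { username: "alice", token: "t0ken" },
  dropPosition: { x: 120, y: 80 },
  useDesignatedLocation: true,
});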
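
Claims 26, 27, 48 and 49 cover logging a user in when a mobile or wearable device is detected in the presence of the platform. The sketch below assumes an invented ShortRangeRadio interface standing in for an NFC or Bluetooth stack; none of these names come from the application.

interface NearbyDevice { deviceId: string; ownerId: string; }

interface ShortRangeRadio {
  scan(): Promise<NearbyDevice[]>;
  pair(deviceId: string): Promise<{ authToken: string }>;
}

async function logInNearbyUsers(
  radio: ShortRangeRadio,
  joinSession: (ownerId: string, authToken: string) => void,
): Promise<void> {
  // Detect wearables or phones in the presence of the platform...
  const devices = await radio.scan();
  for (const device of devices) {
    // ...then pair to authenticate and log the owner into the session.
    const { authToken } = await radio.pair(device.deviceId);
    joinSession(device.ownerId, authToken);
  }
}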
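
Claims 30, 31, 50 and 51 describe substantially simultaneous editing as copy, compare, merge and approve. A simple last-writer-wins sketch of that cycle, with assumed DataItem shapes, is given below; a production system might instead use field-level conflict resolution or operational transforms, which the claims leave open.

interface DataItem {
  id: string;
  fields: Record<string, string>;
}

type FieldChanges = Record<string, string>;

// Fields whose value differs from the original.
function diff(original: DataItem, copy: DataItem): FieldChanges {
  const changes: FieldChanges = {};
  for (const [key, value] of Object.entries(copy.fields)) {
    if (original.fields[key] !== value) changes[key] = value;
  }
  return changes;
}

// Later copies win on conflicting fields in this simple sketch.
function mergeCopies(original: DataItem, copies: DataItem[]): DataItem {
  const merged: DataItem = { id: original.id, fields: { ...original.fields } };
  for (const copy of copies) {
    Object.assign(merged.fields, diff(original, copy));
  }
  return merged;
}

// The temporary item only becomes final once every involved user accepts it.
function finalise(
  temporary: DataItem,
  approvals: Array<(item: DataItem) => boolean>,
): DataItem | null {
  return approvals.every(approve => approve(temporary)) ? temporary : null;
}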
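
Claims 37 to 42 assign the server the role of storing the session's set of data items and recording transfers between users. The SessionStore below is a hypothetical sketch of that role; ownership reassignment stands in for whatever transfer semantics an implementation would choose.

interface StoredItem {
  itemId: string;
  ownerId: string;
  payload: Uint8Array;
}

class SessionStore {
  private items = new Map<string, StoredItem>();

  // Claim 42: the first user uploads the set of data items when the session starts.
  upload(ownerId: string, items: Array<{ itemId: string; payload: Uint8Array }>): void {
    for (const item of items) {
      this.items.set(item.itemId, { ...item, ownerId });
    }
  }

  // Claim 38: transfer one or more items from the first user to the second user.
  transfer(itemIds: string[], fromUserId: string, toUserId: string): StoredItem[] {
    const transferred: StoredItem[] = [];
    for (const id of itemIds) {
      const item = this.items.get(id);
      if (item && item.ownerId === fromUserId) {
        item.ownerId = toUserId;
        transferred.push(item);
      }
    }
    return transferred; // the platform then pushes these to the recipient's device
  }
}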
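
Claims 45 and 46 mirror the flick-to-transfer gesture on the mobile device's own user workspace. In the sketch below the transfer targets are modelled as screen regions (avatar rectangles for other users and a top strip for the platform); the region layout, the thresholds and the send callback are illustrative assumptions only.

interface Point { x: number; y: number; }

interface MobileDragGesture {
  itemIds: string[];   // data items selected on the user workspace
  start: Point;
  end: Point;
}

interface AvatarRegion { userId: string; topLeft: Point; bottomRight: Point; }

type TransferTarget = { kind: "platform" } | { kind: "user"; userId: string };

// On a phone the targets are modelled as screen regions rather than physical
// directions: avatar rectangles for other users, the top strip for the platform.
function resolveMobileTarget(
  gesture: MobileDragGesture,
  screenHeight: number,
  avatars: AvatarRegion[],
): TransferTarget | null {
  const { end } = gesture;
  for (const a of avatars) {
    if (end.x >= a.topLeft.x && end.x <= a.bottomRight.x &&
        end.y >= a.topLeft.y && end.y <= a.bottomRight.y) {
      return { kind: "user", userId: a.userId };
    }
  }
  // A flick that ends in the top tenth of the screen is sent to the platform.
  return end.y < screenHeight / 10 ? { kind: "platform" } : null;
}

// Transfer the dragged items if the gesture resolves to a target.
function onGesture(
  gesture: MobileDragGesture,
  screenHeight: number,
  avatars: AvatarRegion[],
  send: (target: TransferTarget, itemIds: string[]) => void,
): void {
  const target = resolveMobileTarget(gesture, screenHeight, avatars);
  if (target !== null) send(target, gesture.itemIds);
}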
GB1519274.3A 2015-03-06 2015-10-30 Multi-user information sharing system Withdrawn GB2536090A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SG10201501720UA SG10201501720UA (en) 2015-03-06 2015-03-06 Multi user information sharing platform

Publications (2)

Publication Number Publication Date
GB201519274D0 GB201519274D0 (en) 2015-12-16
GB2536090A true GB2536090A (en) 2016-09-07

Family

ID=55085874

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1519274.3A Withdrawn GB2536090A (en) 2015-03-06 2015-10-30 Multi-user information sharing system

Country Status (3)

Country Link
GB (1) GB2536090A (en)
SG (1) SG10201501720UA (en)
WO (1) WO2016144255A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10931733B2 (en) 2017-08-24 2021-02-23 Re Mago Ltd Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace
US11243674B2 (en) * 2018-07-10 2022-02-08 Seiko Epson Corporation Display apparatus and image processing method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE546041C2 (en) * 2020-05-11 2024-04-23 Hm Group Ab Apparatus and method for supporting touch input events

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070106942A1 (en) * 2005-11-04 2007-05-10 Fuji Xerox Co., Ltd. Information display system, information display method and storage medium storing program for displaying information
US20100083109A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method
US20100095233A1 (en) * 2006-10-13 2010-04-15 Charlotte Skourup Device, system and computer implemented method to display and process technical data for a device in an industrial control system
US20130069860A1 (en) * 2009-05-21 2013-03-21 Perceptive Pixel Inc. Organizational Tools on a Multi-touch Display Device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4332964B2 (en) * 1999-12-21 2009-09-16 ソニー株式会社 Information input / output system and information input / output method
US20050183035A1 (en) * 2003-11-20 2005-08-18 Ringel Meredith J. Conflict resolution for graphic multi-user interface
US7698660B2 (en) * 2006-11-13 2010-04-13 Microsoft Corporation Shared space for communicating information
US20080192059A1 (en) * 2007-02-09 2008-08-14 Microsoft Corporation Multi-user display
US20110163944A1 (en) * 2010-01-05 2011-07-07 Apple Inc. Intuitive, gesture-based communications with physics metaphors
US8502789B2 (en) 2010-01-11 2013-08-06 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
US20120249463A1 (en) * 2010-06-04 2012-10-04 Smart Technologies Ulc Interactive input system and method
US9479548B2 (en) 2012-05-23 2016-10-25 Haworth, Inc. Collaboration system with whiteboard access to global collaboration data
US10817842B2 (en) * 2013-08-30 2020-10-27 Drumwave Inc. Systems and methods for providing a collective post


Also Published As

Publication number Publication date
GB201519274D0 (en) 2015-12-16
WO2016144255A1 (en) 2016-09-15
SG10201501720UA (en) 2016-10-28

Similar Documents

Publication Publication Date Title
US11570219B2 (en) Method, apparatus, and computer readable medium for virtual conferencing with embedded collaboration tools
US20230336691A1 (en) Configuring participant video feeds within a virtual conferencing system
US9998508B2 (en) Multi-site screen interactions
US9398059B2 (en) Managing information and content sharing in a virtual collaboration session
CA2892614C (en) System and method for managing several mobile devices simultaneously
US9749367B1 (en) Virtualization of physical spaces for online meetings
JP5879332B2 (en) Location awareness meeting
CN109348160B (en) Electronic tool and method for conferencing
US20130346858A1 (en) Remote Control of Audio Application and Associated Sub-Windows
US10050800B2 (en) Electronic tool and methods for meetings for providing connection to a communications network
US20090037827A1 (en) Video conferencing system and method
JP5775927B2 (en) System, method, and computer program for providing a conference user interface
CN111258521B (en) Conference interface display method, device and system, storage medium and electronic equipment
US20160191576A1 (en) Method for conducting a collaborative event and system employing same
US11288031B2 (en) Information processing apparatus, information processing method, and information processing system
US10809883B2 (en) Shared inter-operational control among multiple computing devices
US10965480B2 (en) Electronic tool and methods for recording a meeting
US11083292B2 (en) Electronic desk
JP5826829B2 (en) Recording and playback at meetings
GB2536090A (en) Multi-user information sharing system
KR101640169B1 (en) Touch-table system and providing method thereof
JP2020198078A (en) Information processing apparatus, information processing system, and information processing method
WO2014039680A1 (en) Digital workspace ergonomics apparatuses, methods and systems
MacKenzie LACOME: Early evaluation and further development of a multi-user collaboration system for shared large displays

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)