GB2507997A - Collaborative interactive devices with display content dependent on relative position - Google Patents



Publication number
GB2507997A
GB2507997A GB1220647.0A GB201220647A GB2507997A GB 2507997 A GB2507997 A GB 2507997A GB 201220647 A GB201220647 A GB 201220647A GB 2507997 A GB2507997 A GB 2507997A
Authority
GB
United Kingdom
Prior art keywords
display
user
user device
access
devices
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1220647.0A
Other versions
GB201220647D0 (en)
Inventor
Andrew Edwardson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Promethean Ltd
Original Assignee
Promethean Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Promethean Ltd filed Critical Promethean Ltd
Priority to GB1220647.0A priority Critical patent/GB2507997A/en
Publication of GB201220647D0 publication Critical patent/GB201220647D0/en
Priority to EP13792360.3A priority patent/EP2920681A1/en
Priority to PCT/EP2013/073985 priority patent/WO2014076256A1/en
Priority to US14/443,232 priority patent/US20150331489A1/en
Publication of GB2507997A publication Critical patent/GB2507997A/en
Withdrawn legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/08Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B5/12Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations different stations being capable of presenting different information simultaneously
    • G09B5/125Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations different stations being capable of presenting different information simultaneously the stations being mobile
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/003Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/02Networking aspects
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/08Biomedical applications

Abstract

A method of providing a collaborative display comprises: displaying an image on a first display associated with a first computing device; providing a second computing device having a second display; mapping the coordinates of the second display to the first display; in dependence on a current location of the second display relative to the first display, controlling the display of content on the second display. The first computing device may be an interactive whiteboard. The second computing device may be a handheld device such as a tablet or mobile phone. The second display may be located within a predetermined distance of the first display and oriented in the same orientation, such that the coordinates of each coincide at least in part. The devices may operate within a wireless network (2 in Figure 1). The devices may be used in a classroom environment, such as a teaching session or lesson; for example, the second device may display internal organs when overlapping a human torso displayed on the first device.

Description

COLLABORATIVE INTERACTIVE DEVICES
BACKGROUND TO THE INVENTION:
Field of the Invention:
The present invention relates to collaborative interactive activity among a plurality of interactive devices. The invention is particularly, but not exclusively, related to collaborative interactive activity in a classroom environment.
Description of the Related Art:
Interactive devices are well-known in the art, and include interactive whiteboards.
It is known to use interactive devices for collaborative activities. Typically collaborative activities allow multiple users to collaborate at a single interactive device. It is an aim of the invention to improve the possibilities for interactive collaboration amongst multiple users.
SUMMARY OF THE INVENTION: The invention provides a method of providing a collaborative display, comprising: displaying an image on a first display associated with a first computing device; providing a second computing device having a second display; mapping the coordinates of the second display to the first display; in dependence on a current location of the second display relative to the first display, controlling the display of content on the second display.
The step of controlling the display of content on the second display may be dependent on the three-dimensional position of the second display.
The second display may be located within a predetermined distance of the first display and orientated in the same orientation as the first display, such that the coordinates of the second display coincide with at least part of the coordinates of the first display; the content displayed on the second display is related to the content at those coordinates displayed on the first display.
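The coordinate-coincidence idea above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation; all function and variable names (and the offset-based mapping) are assumptions:

```python
def map_to_first_display(x2, y2, offset_x, offset_y):
    """Map a point on the second display into the first display's
    coordinate space, given the second display's current offset
    relative to the first display (illustrative mapping)."""
    return x2 + offset_x, y2 + offset_y

def content_for_region(first_display_content, x, y):
    """Return the related content whose region on the first display
    contains the mapped point, or None if there is no overlap."""
    for (x0, y0, x1, y1), related in first_display_content:
        if x0 <= x < x1 and y0 <= y < y1:
            return related
    return None

# Example in the spirit of the abstract: a human torso occupies a region
# of the first display; when the handheld overlaps it, the handheld
# shows the internal organs.
regions = [((100, 50, 300, 400), "internal organs")]
print(content_for_region(regions, *map_to_first_display(20, 30, 150, 100)))
# → internal organs
```

In practice the offset would come from the calibration step described below, rather than being supplied directly.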
The method may comprise calibrating the second display with the first display.
The first and second computing devices may be connected by a network, the first and second displays being associated with first and second applications running on the first and second computing devices, where access to an application running on a device by another device is controlled by defining an access setting for the application running on the device.
The invention also provides a system for providing a collaborative display, the system adapted to: display an image on a first display associated with a first computing device; provide a second computing device having a second display; map the coordinates of the second display to the first display; in dependence on a current location of the second display relative to the first display, control the display of content on the second display.
Embodiments of the invention are based on the principle that: (i) a device is a hub of a network (preferably but not necessarily a personal area network); (ii) other devices can join/register with the network (preferably but not necessarily automatically); and (iii) users of registered devices can register with applications on other devices (or more specifically, an application running on a user's device can register with an application running on another device in the network). The process for applications registering with each other comprises some permission-based processing.
BRIEF DESCRIPTION OF THE FIGURES:
The invention is now described by way of example with reference to the accompanying figures, in which: Figure 1 illustrates a network comprising a plurality of networked devices; Figure 2 illustrates an exemplary implementation of devices in a network arrangement such as Figure 1, in a classroom environment; Figures 3(a) and 3(b) illustrate exemplary process flows in an embodiment of the invention for registering in a network; Figures 4(a) and 4(b) illustrate information stored in a network-connected device in an embodiment of the invention; Figures 5(a) and 5(b) illustrate process flows for accessing shared applications in an embodiment of the invention; Figures 6(a) and 6(b) illustrate implementation architectures of a user device and a hub device in an embodiment of the invention; Figure 7 illustrates a further exemplary implementation of devices in a network arrangement; Figure 8 illustrates a further exemplary implementation of information stored in a network device in the arrangement of Figure 7; Figure 9 illustrates a handheld device in accordance with a preferred embodiment of the invention; Figures 10(a) and 10(b) illustrate a calibration between user devices in accordance with an embodiment of the invention; Figures 11(a) and 11(b) illustrate a preferred implementation of the invention; Figures 12(a) and 12(b) illustrate an alternative preferred embodiment of the present invention; and Figure 13 illustrates the preferred concept of the present invention.
DESCRIPTION OF PREFERRED EMBODIMENTS:
The invention is described herein by way of reference to exemplary implementations and preferred examples. In particular the invention is described in the context of collaborative working in a classroom environment, with users comprising students and/or teachers. The invention is not limited in its applicability to a classroom environment, and one skilled in the art will appreciate the broader applicability of the invention.
Similarly use cases described herein to illustrate aspects and/or embodiments of the invention are set out in the context of educational use-cases in a classroom related environment. One skilled in the art will appreciate the broader applicability of such use cases, and alternative use cases.
The invention is described in the following in an arrangement in which a dedicated hub device is provided.
However the invention is not so limited, and in embodiments one or more devices may provide the hub functionality.
Figure 1 illustrates an exemplary network and devices connected to such network in accordance with a preferred embodiment of the invention.
A wireless network is denoted by reference numeral 2. The reference numeral 2 denotes the range of the wireless network, such that wireless devices within the area 2 can transmit/receive signals to/from the wireless network, and devices outside the area 2 cannot transmit/receive signals to/from the wireless network. The invention is not limited to use in conjunction with a wireless network, and may be implemented with the provision of a wired network. However a wireless network is envisaged as the most useful implementation. In general the wireless network may be a communications network.
In a particularly preferred implementation the wireless network is a personal area network, PAN. A PAN has particular advantages in embodiments of the invention due to its reliability and speed of operation. In implementations where speed of communication between networked devices is critical, the use of a PAN is advantageous. However any type of wireless network may be used for implementation of the invention, according to the requirements of the implementation. The speed of the network will be implementation dependent.
Illustrated in Figure 1 within range of the wireless network 2 is a hub device 4, for this embodiment having an antenna 8 for wireless communication. In accordance with embodiments of the invention, as will be understood from the following description, the hub device acts as a hub for communications between other devices. In the described exemplary arrangement the hub device 4 generates and controls the wireless network 2. Further illustrated in Figure 1 within range of the wireless network 2 is a plurality of user devices, for this embodiment denoted by reference numerals 6a, 6b, 6c, each having a respective antenna 10a, 10b, 10c. The user devices 6a, 6b, 6c are also respectively labelled 'user device #1', 'user device #2', and 'user device #3' in the Figures. Each user device is preferably associated with at least one user, not shown in Figure 1. With reference to Figure 2, there is illustrated an exemplary implementation in which embodiments of the invention are described. Figure 2 illustrates, for the exemplary implementation, examples of the user devices of Figure 1.
Figure 2 illustrates a classroom including an interactive electronic whiteboard 20 which represents 'user device #1', a handheld interactive device 22 representing 'user device #2', an interactive table 24 representing 'user device #3', and a hub device 26 with associated antenna 28, corresponding to hub device 4, illustrated as positioned on a table 30. Interactive electronic whiteboard 20 may be implemented in any number of ways, such devices being well-known in the art. In Figure 2 the interactive electronic whiteboard 20 is illustrated as a device having a vertically-orientated interactive display surface 40 on which images are displayed and user inputs may be detected. The interactive display surface 40 may be a touch sensitive surface and/or have an underlying electromagnetic grid for detecting an electromagnetic device on the surface. The interactive display surface 40 may be an emissive display surface and/or a surface onto which images are projected. In the arrangement of Figure 2 there is shown a projector device 42 positioned relative to the interactive display surface 40 by a boom arm 44, for projecting images onto the display. A computer system, illustrated by reference numeral 45 in Figure 2, may be connected to the interactive electronic whiteboard 20 for control purposes.
In Figure 2 a single user 32 is illustrated as providing inputs at the surface 40. It is known in the art that multiple users may provide inputs at a surface of an interactive electronic whiteboard, and the single user 32 of Figure 2 is exemplary. A single user may also provide multiple inputs, for example by the use of two hands to provide touch inputs.
Embodiments of the invention are not limited to any particular type of interactive whiteboard or vertical interactive surface.
Handheld interactive device 22 may be implemented in any number of ways, such devices being well-known in the art. In Figure 2 the handheld device 22 is illustrated as a tablet device, and has an interactive display surface 42 which is an emissive display and on which images are displayed and at which touch inputs can be detected. The handheld interactive device 22 may also be a mobile telephony device. In Figure 2 a single user 34 is illustrated as providing inputs at the surface 42. It is more typical for such a device that a single user will provide inputs. Such a device may be considered a personal device. A single user may also provide multiple inputs, for example by the use of two fingers to provide touch inputs.
Interactive table 24 may be implemented in any number of ways, such devices being well-known in the art. In Figure 2 the interactive table is illustrated as having a horizontally disposed interactive display surface 46 which is an emissive display and on which images are displayed and at which touch inputs can be detected.
In Figure 2 two users 36 and 38 are illustrated as providing inputs at the surface 46. It is known in the art that one or more users may provide inputs at a surface of an interactive table, and the two users 36 and 38 of Figure 2 are exemplary. A single user may also provide multiple inputs, for example by the use of two hands or two fingers to provide touch inputs.
The exemplary hub device 26, for the purpose of the described examples, is a computing device having wireless access point functionality, to provide the wireless network hosting and control to allow the various user devices of Figure 2 to communicate in accordance with embodiments of the invention, as described further below. However the functionality provided by the hub device 26 may be provided by one of the user devices, and a dedicated hub device is not a requirement. For the purposes of explanation, however, a dedicated hub device is described. The wireless network 2 provides a communication network for connecting a plurality of computing devices comprising, in the described embodiment, the user devices illustrated in Figure 2 (and Figure 1). The communications network is established under the control of one of the computing devices, such as a dedicated hub device as illustrated in Figure 2 (and Figure 1) where it is provided.
At least one further user device is connected to the network 2 by registering with the hub device 26. The at least one further user device may be any one of the illustrated user devices. Thus in order to join the wireless network 2, each user device must register with the hub device. With reference to Figure 3(a), a process at the hub device 26 for establishing the wireless network and registering a user device with the network is illustrated, and with reference to Figure 3(b) a process for a user device to register with the hub is illustrated.
In a step 50 of Figure 3(a), the hub device 26 is enabled. On being enabled, as denoted by step 52 of Figure 3(a), the wireless network 2 is established. It can be noted that the hub device 26 is not required to be additionally connected to any other network. The purpose of the network established by the hub device is to interconnect the user devices of Figure 2 (and Figure 1).
With reference to Figure 3(b), in a step 51 a user device having wireless capability is switched on and set to detect the presence of wireless networks as known in the art. In a step 53 it is determined whether a network is detected. If not, then the process iterates through steps 51 and 53. If a wireless network is detected, then in a step 55 the user device sends a request to join the network to the hub device.
With reference to Figure 3(a), the hub device receives the request to join in a step 54.
In accordance with known techniques, the steps 51 to 55 of Figure 3(b) may be initiated manually under the control of the user of the user device. In an alternative arrangement the steps 51 to 55 of Figure 3(b) may occur automatically without user control. In a particularly preferred embodiment, the identity of the wireless network established by the hub device 26 may include an identifier identifying the wireless network as being of a certain type. The user device may be adapted to automatically look for wireless networks of that type, and attempt to register with wireless networks of that type.
In a step 56 of Figure 3(a) and a step 57 of Figure 3(b), the hub device and the user device communicate to allow an appropriate procedure for the user device to join and/or register with the wireless network 2, in accordance with a particular implementation. As known in the art, this may involve the user device being required to provide a password. As known in the art, the user device may have registered with the network previously and have a pre-stored password, or the user may have to manually provide a password. The mechanism by which a user device connects with the wireless network is outside the scope of the present invention.
On successful registration of the user device with the network, the hub device transmits to the user device an acknowledgement that the request has been accepted and the user device has joined the network, as denoted by step 58 in Figure 3(a). As denoted by step 59 in Figure 3(b), the user device receives the acknowledgement from the hub device as confirmation that it has joined the network.
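The join/registration exchange of Figures 3(a) and 3(b) can be sketched as follows. The class and method names are illustrative assumptions, and the password/authentication step is omitted, since the patent places that mechanism outside its scope:

```python
class Hub:
    """Minimal sketch of the hub-side flow of Figure 3(a):
    establish the network, then accept and acknowledge join requests."""
    def __init__(self, network_id):
        self.network_id = network_id  # may include a network-type identifier
        self.registered = set()

    def handle_join_request(self, device_id):
        # Steps 54-58: register the device and return an acknowledgement.
        self.registered.add(device_id)
        return "ack"

class UserDevice:
    """Minimal sketch of the device-side flow of Figure 3(b)."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.joined = False

    def try_join(self, hub):
        # Steps 51-59: having detected the network, request to join and
        # treat the hub's acknowledgement as confirmation of joining.
        if hub.handle_join_request(self.device_id) == "ack":
            self.joined = True

hub = Hub("classroom-pan")
device = UserDevice("user device #2")
device.try_join(hub)
print(device.joined)  # → True
```

A real implementation would run the request/acknowledgement over the wireless network rather than as direct method calls; the shape of the handshake is the point here.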
In accordance with the invention and its embodiments, functionality is provided to allow each user device to make available content and/or control of applications running on the user device to other user devices in the network. Each user device is associated with one or more applications, and each user device may allow access to its applications by other user devices, and may access the applications of other user devices. Such access is preferably controlled, such that there are access settings associated with each application or application content which permit/restrict access.
Thus, an application is run on at least one of the computing devices; and access to an application running on a device by other devices is controlled by defining access settings for each application running on each device, as will now be further described.
An example is described.
It is assumed that 'user device #1', in the exemplary scenario the interactive electronic whiteboard 20 of Figure 2, is registered with the network 2. A number of applications are running on the interactive electronic whiteboard 20, under the control of the computer 45 with which it is associated. For example, two applications 'IEW application #1' and 'IEW application #2' are running on the interactive electronic whiteboard 20.
With reference to Figures 4(a) and 4(b), there are illustrated tables associated with the interactive electronic whiteboard 20 and reflecting in particular the application status of the interactive electronic whiteboard 20.
As shown in Figure 4(a), a first table 60 represents applications running on the interactive electronic whiteboard 20 itself.
In a first column 62 headed 'own applications' there is listed an identity of the current applications running on the interactive electronic whiteboard 20. This column lists in two rows 'IEW application #1' and 'IEW application #2'.
In a second column 64 of the first table there is illustrated the access permissions associated with each of the interactive electronic whiteboard applications. The interactive electronic whiteboard 20 may set the access permissions according to any number of criteria. For example access may be public, requiring no authorisation. Access may be restricted by password. Access may be restricted by type of device. Access may be restricted by type of user.
In a third column 66 of the first table there is illustrated the user devices that are currently provided with access to each of the applications.
In a fourth column 68 of the first table there is illustrated the type of access that the user devices in the third column have. The access may, for example, be restricted to 'read only', or may allow control of the application.
Access to an application running on a device may allow one device to assume control of another.
As shown in Figure 4(b), a second table 70 represents applications running on other user devices to which the interactive electronic whiteboard has been granted access. In a first column 72 headed 'shared applications' there is listed an identity of the current applications running on other user devices and to which the interactive electronic whiteboard 20 has access.
In a second column 74 of the second table there is illustrated the identities of the user device associated with each of the shared applications for the interactive electronic whiteboard 20.
In a third column 76 of the second table there is illustrated the type of access which the interactive electronic whiteboard 20 is permitted for each application.
The access may, for example, be restricted to 'read only', or may allow control of the application.
It will be understood that each user device connected to the wireless network will typically have tables as shown in Figures 4(a) and 4(b) associated therewith. Whilst the exact tables shown may not be provided, functionality consistent with these tables will be provided.
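One possible way a device could hold functionality consistent with the tables of Figures 4(a) and 4(b) is as simple keyed records. All field names and example values below are illustrative assumptions, keyed to the columns described above:

```python
# Figure 4(a): applications this device runs itself.
own_applications = {
    "IEW application #1": {
        "access_permission": "public",               # column 64
        "devices_with_access": ["user device #2"],   # column 66
        "access_type": "read only",                  # column 68
    },
    "IEW application #2": {
        "access_permission": "password",
        "devices_with_access": [],
        "access_type": "control",
    },
}

# Figure 4(b): applications on other devices this device may access.
shared_applications = {
    "table application #1": {
        "host_device": "user device #3",             # column 74
        "access_type": "read only",                  # column 76
    },
}

print(own_applications["IEW application #1"]["access_type"])  # → read only
```

The hub would hold mirrored copies of these records for every registered device, as described below for Figure 5(a).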
The creation and management of exemplary tables as shown in Figures 4(a) and 4(b) is now further described with reference to Figures 5(a) and 5(b). It is assumed that each of the user devices shown in Figure 2 is present in the vicinity of the network and connected to the network. Figure 5(a) illustrates the process, once registered with the network, associated with accessing applications running on devices in the network from the perspective of a user device requesting access. Figure 5(b) illustrates the process from the perspective of a user device running the application to which access is requested.
In a step 80 of Figure 5(a), following registration in the network, the user device receives a list of devices connected in the network and applications running on those devices which are available for access. In a preferred arrangement the hub device stores a mirror of the tables of Figures 4(a) and 4(b) of each user device, and provides the list of available devices and applications based on the mirrored tables.
Preferably the information provided to a user device also includes the access information associated with each application. In one embodiment the hub device provides a list of all the applications on all the devices to the user devices. In other embodiments the hub device may only provide a list of the applications to which a user device receiving the list has access.
Preferably the information provided to a user device also includes the type of access permitted.
In a step 82 of Figure 5(a), the user device then selects a desired application. In a typical implementation, the selection will take place by a user selecting the application from a list or information displayed in a user interface of the user device.
In a step 81 of Figure 5(b), a user device on which an application is running, which may be referred to as a host device, monitors for requests to access the application. It is assumed for this example that the user device associated with the process of Figure 5(a) selects in step 82 an application which is running on a user device associated with the process of Figure 5(b), a request for access to which is received in step 83.
In a step 85 of Figure 5(b) the host device determines whether the application to which access is requested has access settings such that access is public, i.e. unrestricted. If so, then access is allowed in step 87.
If access is not public, then in a step 89 of Figure 5(b) it is determined whether access to the application is allowed for devices of a particular type or characteristic. If so, and the device making the request meets the characteristic, then access is allowed in step 87. If access is not determined based on a device type or characteristic, or if the device making the request is not of the correct type or does not possess the correct characteristic, in step 93 of Figure 5(b) it is determined whether access is password protected.
If so, then in step 95 of Figure 5(b) the host device transmits a request for the password to the requesting device. In a step 84 of Figure 5(a), the requesting device determines that a request for a password has been received, and then in step 86 transmits a password.
In a step 97 of Figure 5(b) the host device determines if the requesting device has transmitted a password. If so, then in step 99 it is determined whether the password is correct.
If so, the process in the host device in Figure 5(b) progresses to step 87, and if not the process progresses to step 101.
If in step 93 it is determined that the application is not associated with password access, then the process in the host device moves on to step 101.
In step 101 it is determined to reject the access request. In step 87 it is determined to allow the access request. Following either of steps 101 and 87 the process in the host device proceeds in Figure 5(b) to step 103, and a notification concerning the request is transmitted to the requesting device. If the request has been allowed, following step 87 the host device also updates its own applications table, corresponding to Figure 4(a).
In step 88 of Figure 5(a), the requesting device awaits notification from the host device, and upon receipt in step 90 determines if the request has been allowed. If the request has been allowed, the requesting device updates its shared applications table, corresponding to Figure 4(b). If the request is not allowed, then following step 90 in Figure 5(a) the process in the requesting device is terminated in step 92.
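The host-side decision sequence of Figure 5(b) (steps 85, 89, and 93 to 101) might be sketched as a cascade of checks. This is a simplified reading of the flow; the field names and the exact fall-through behaviour are illustrative assumptions:

```python
def decide_access(app, requester, password=None):
    """Simplified sketch of the host device's decision in Figure 5(b).
    `app` and `requester` are illustrative records, not the patent's
    data structures."""
    if app["permission"] == "public":                       # step 85
        return "allow"                                      # step 87
    if app["permission"] == "device_type":                  # step 89
        if requester["type"] in app["allowed_types"]:
            return "allow"
        return "reject"                                     # step 101
    if app["permission"] == "password":                     # step 93
        if password is None:
            return "password required"                      # step 95
        if password == app["password"]:                     # steps 97-99
            return "allow"
        return "reject"
    return "reject"                                         # step 101

app = {"permission": "password", "password": "s3cret"}
print(decide_access(app, {"type": "tablet"}))            # → password required
print(decide_access(app, {"type": "tablet"}, "s3cret"))  # → allow
```

Whatever the outcome, the host would then send the step 103 notification and, on an allow, update its Figure 4(a) table as described above.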
The process described with reference to Figures 5(a) and 5(b) may be dynamic, with each user device receiving updated information as other user devices open and close applications.
A user device may receive a list of currently available applications by sending a request to the hub device, allowing the selection of a further application at any time following the process described hereinabove.
In the event that any change is made to a table of any user device corresponding to the tables of Figures 4(a) and 4(b), the user device preferably transmits an update message to the hub device to notify the hub device of the change.
When a user device is first connected in the network, a list of applications running on the user device is provided to the hub device so that it may be made available to the other user devices.
Figure 6(a) illustrates an exemplary architecture of a hub device in accordance with an embodiment of the invention, such as hub device 4 of Figure 1.
Figure 6(b) illustrates an exemplary architecture of a user device in accordance with an embodiment of the invention, such as one of the user devices 6a, 6b, 6c of Figure 1.
In the foregoing embodiment there has been described an arrangement in which a distinct hub is provided. This is described for illustration purposes, and in other embodiments a distinct hub is not provided. The functionality of the hub may be provided in a user device, or may be distributed between multiple user devices. In a particular embodiment the functionality of the hub may be dynamic, being provided by different user devices at different times, and/or spread amongst different user devices at different times. In practice the function of the hub may be achieved by a server, and the operation of the server may be provided on one user device or distributed on several user devices.
In the foregoing embodiment an arrangement is described in which applications are associated with the physical user devices on which they run. This is achieved, in the foregoing embodiment, by the use of tables associated with individual user devices. In practice, such an association may be onerous.
In general, an application which may be controlled or accessed by one or more users may be termed an object, and control of the application or object may be termed a session.
In an alternative embodiment the object may be hidden or not hidden on the server. The session which controls the object can be moved between user devices. For example, if a current host leaves the classroom, and hence the wireless area, another user device may take responsibility for hosting the session: i.e. the session moves to another user device. In such a scenario, each session has a unique identifier which identifies which user device is currently the host for the session.
This may involve the user device that is leaving sending a message to the network requesting another user device to assume responsibility for the session. When a user device is to leave a classroom, for example, whichever user device within the classroom receives and responds to the message may take over the session for the device that is leaving.
However the default operation may be that the session will 'die'. The message from a user device that a user device is leaving may go to all user devices, to those user devices that have access to the session, or to one particular user device according to implementation requirements. For example, there may be a master server within the network which will receive the message. This master server may define rules for the session.
These rules may state that a user device should always pass on a session to the next available server. In a preferred embodiment, a master user device for a session is the user device which currently hosts the session, so the master user devices are distributed. When a user device leaves, the rules for that user device may determine the user device to which the session is to be transferred.
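The handover behaviour described above may be sketched as follows. This is a hedged illustration of one possible policy, "pass the session to the next available device with access rights, else let it die"; the session structure and device identifiers are assumptions, not part of the patented method.

```python
# Illustrative sketch of session handover when the hosting device leaves.
# The session dictionary layout and identifiers are assumptions.

def transfer_session(session, leaving_device, available_devices):
    """Move the session to another device, or let it 'die' if none can take it."""
    candidates = [d for d in available_devices
                  if d != leaving_device and d in session["access"]]
    if not candidates:
        session["host"] = None           # default behaviour: the session dies
        session["alive"] = False
    else:
        session["host"] = candidates[0]  # next available device with access rights
    return session

# Each session carries a unique identifier naming the current host.
session = {"id": "session-42", "host": "device-2", "alive": True,
           "access": {"device-1", "device-2", "device-3"}}
transfer_session(session, "device-2", ["device-2", "device-3"])
```

Under this rule the session identifier stays stable while the host field changes, which matches the requirement above that each session identifies its current host.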
A current host device has the ability to change the rules. The rules may define any characteristics of the session.
The server does not have to be a host device.
The functionality of a server session and a user device are preferably separated.
In a preferred embodiment the servers may have the ability to see each other, but sessions may not. Sessions can only be exchanged from one user device to another if access rights permit.
Sessions can be hierarchical. Sessions are an efficient way to manage the connections of user devices and the joining of existing devices.
In embodiments, there may be provided multiple sessions for each user, with different access levels for each user.
Figures 7 and 8 describe a further embodiment of the invention consistent with the foregoing. However, one skilled in the art will appreciate that the implementation of the invention may be achieved in a number of ways.
With reference to Figure 7, in this illustrative example there are three user devices 6a, 6b, 6c labelled 'user device #1', 'user device #2' and 'user device #3'. Each user device potentially has one or more applications associated with it. The applications may be any software which can run on the device, including software which controls an interactive whiteboard. In the example shown, the first user device 6a has one application termed 'appl 1' denoted by reference numeral 1a, the second user device 6b has two applications 'appl 2' and 'appl 4' denoted by reference numerals 1b and 1d, and the third user device 6c has one application 'appl 3' denoted by reference numeral 1c.
In accordance with this embodiment of the invention, each user device is the host for the session associated with an application (object) running on that user device, and has a table which defines the access parameters for that application. In addition, the table for each user device defines the permissions for that user device to access objects of other user devices. This is illustrated in Figure 8 with reference to the example of user device #2, denoted by reference numeral 6b.
Figure 8 shows an exemplary table for the second user device 6b. It can be seen that the table has headings 'object', 'session host', 'user(s)', and 'type of access'. The 'object' column lists every application to which that user device has access. There may be other applications in the network, but only those applications are listed to which the user device has access. For each application or object, the 'session host' column defines the user device which is the host for that object. In accordance with this embodiment, the host will be the user device on which the application is running. The 'user(s)' column defines those user devices which are associated with that application, and the 'type of access' column defines what type of access is permitted for that user.
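A table of the kind shown in Figure 8 may be represented as a simple data structure. The sketch below is illustrative only: the row contents are invented examples, and the lookup helper is an assumption about how such a table might be queried, not part of the patented method.

```python
# Illustrative sketch of the per-device table of Figure 8, with the four
# columns 'object', 'session host', 'user(s)' and 'type of access'.
# Row contents are invented examples.

table_device_2 = [
    {"object": "appl 2", "session host": "user device #2",
     "user(s)": ["user device #1"], "type of access": "full control"},
    {"object": "appl 4", "session host": "user device #2",
     "user(s)": ["user device #3"], "type of access": "read-only"},
    {"object": "appl 1", "session host": "user device #1",
     "user(s)": ["user device #2"], "type of access": "read-only"},
]

def host_of(table, obj):
    """Look up the session host for an object this device has access to."""
    for row in table:
        if row["object"] == obj:
            return row["session host"]
    return None  # not listed: this device has no access to that object
```

Note that, as in the description, only objects to which the device has access appear at all; an object absent from the table is simply invisible to the device.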
As shown, the type of access may vary, and will be determined by the access requested and the host device. For example, only one device may have full control of an application at any time, and once full control is given any further user device requesting access is given only read-only rights.
The type of access may be more sophisticated. For example, the type of access may define that a user device is allowed full access once full access has been completed by another device. The type of access may also define sending messages to user devices to advise that a type of access for the user has changed or become available. An important aspect of the present invention is that rules are defined within a session for an object. These rules may be reflected in the 'type of access' column.
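The simple arbitration rule described above, in which only one device may hold full control at a time and later requesters are demoted to read-only rights, may be sketched as follows. The function and the grants dictionary are illustrative assumptions.

```python
# Illustrative sketch of the access arbitration described above:
# one full-control holder at a time, later requesters get read-only.

def grant_access(grants, device, requested="full control"):
    """Return the access type actually granted, recording it in `grants`."""
    if requested == "full control" and "full control" in grants.values():
        granted = "read-only"          # full control already held by another device
    else:
        granted = requested
    grants[device] = granted
    return granted

grants = {}
first = grant_access(grants, "device-1")    # first requester: full control
second = grant_access(grants, "device-2")   # later requester: demoted
```

A more sophisticated policy, such as promoting a waiting device when full access is relinquished, would extend this by watching for the removal of the full-control entry and notifying the waiting devices.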
A user device preferably periodically broadcasts its presence. At the same time, a user device listens for broadcast messages. This allows user devices to join networks and sessions, and for sessions to he created, without specific requests to establish sessions. When a user device first joins a network, the user may be provided with a list of objects which it can request an association with. The implementation of this will be system specific.
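The presence broadcasting described above may be sketched with datagram sockets. This is a hedged, self-contained illustration: a real deployment would broadcast on the local network and repeat periodically, whereas here a single announcement is sent over the loopback interface, and the message fields are assumptions.

```python
# Illustrative sketch of presence announcement and listening.
# Sent over loopback so the example is self-contained; message fields are assumed.
import json
import socket

def make_presence(device_id, objects):
    """Datagram announcing this device and the objects it hosts."""
    return json.dumps({"type": "presence", "device": device_id,
                       "objects": objects}).encode()

listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))          # ephemeral port, for the example only
listener.settimeout(2.0)
port = listener.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(make_presence("user device #2", ["appl 2", "appl 4"]),
              ("127.0.0.1", port))

data, _ = listener.recvfrom(4096)
announcement = json.loads(data)
sender.close()
listener.close()
```

A joining device listening for such announcements can build its initial list of objects it may request an association with, without any specific session-establishment request.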
The invention is described herein in the context of the system architecture described in the foregoing. Whilst the invention may be advantageously implemented in such a system architecture as described above, the invention is not limited to such an architecture. Whilst the system architecture described in the foregoing provides a particularly advantageous architecture in which to implement the invention and maximise the benefits associated with the invention, one skilled in the art will appreciate that the invention may be implemented using other system architectures which allow for communication between devices as required by this invention and set out in the following description.
The invention concerns the use of a first user device for manipulating a second user device, based on the spatial manipulation of the first user device. The invention is described, for the purposes of understanding, in the context of a non-limiting example. The described example is adjusting or controlling the display of the interactive electronic whiteboard 20 in dependence upon its spatial relationship with the handheld user device 22.
This invention relates to controlling an image on a handheld display in dependence on the spatial positioning of the handheld display with respect to the main display, the image on the handheld display being related to the image displayed on the main display.
The handheld user device 22 is adapted to include circuitry which allows its movement to be determined. This may include the handheld user device 22 being adapted to include a gyroscope. As illustrated in Figure 9, the handheld user device 22 includes in its system architecture a gyroscope 124, an accelerometer 126, and a compass 128, each of which provides a signal to the device processor 122. The provision of circuitry to determine movement within the handheld user device 22 is known in the art. It is known to provide handheld devices, such as mobile telephones for example, which allow movement of the device to be detected in order to control applications running on the handheld device.
The circuitry provided in the handheld user device can be used to determine movement of the handheld device. For example, a gyroscope and an accelerometer provide a six-axis interpretation of movement through space. This is especially useful in small handheld devices such as mobile telephones, as it can filter the unintended ambient movement and vibration of a user's hand, allowing a more accurate measurement of intentional movements. An accelerometer is used to measure sudden acceleration within a certain range of motion. A gyroscope works by interpreting the shift in positioning from a set of rotations within the X, Y, Z axes. When a gyroscope and accelerometer are combined, it is possible to simultaneously measure acceleration and gravitational placement in the X, Y, Z axes. This combination results in a total of six orientation measurements at all times.
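One common way to combine gyroscope and accelerometer readings as described above is a complementary filter, which blends the gyroscope's short-term rate integration with the accelerometer's long-term gravity reference. The patent does not prescribe a specific algorithm, so the sketch below is an illustrative assumption.

```python
# Illustrative complementary filter blending gyro and accelerometer data.
# One common fusion technique; not prescribed by the patent.
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend integrated gyro rate (short-term) with accelerometer angle (long-term)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def accel_tilt(ax, az):
    """Tilt angle (degrees) implied by the gravity direction the accelerometer measures."""
    return math.degrees(math.atan2(ax, az))

# Device at rest and level: gyro reports no rotation,
# accelerometer reports gravity entirely on the z axis.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=accel_tilt(0.0, 9.81), dt=0.01)
```

The gyroscope term rejects the hand's ambient vibration between samples, while the accelerometer term prevents the integrated angle from drifting, which is the filtering benefit the passage above describes.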
In the described example, an application is running on the interactive electronic whiteboard 20 which controls the display of images on the display 40. In the described example, an application is running on the handheld user device 22 for determining movement and spatial positioning of the handheld user device.
In the described example, the interactive electronic whiteboard 20 and the handheld user device 22 are registered in the network 2, and the interactive electronic whiteboard 20 has allowed the handheld user device access to control an application running on the interactive electronic whiteboard which controls the display.
There is provided a method of providing a collaborative display, which comprises displaying an image on a first display associated with a first computing device; providing a second computing device having a second display; mapping the coordinates of the second display to the first display; and, in dependence on a current location of the second display relative to the first display, controlling the display of content on the second display.
The invention works on the principle of providing applications with the ability to communicate coordinate changes to each other to facilitate novel ways to reveal or trigger information.
In accordance with an embodiment of the invention, the spatial positioning between the handheld user device 22 and the display of the interactive electronic whiteboard is calibrated in order to allow the relative spatial positioning of the two, and changes therein, to be monitored and used to control one or the other device.
In a preferred embodiment, the computer controlling the electronic whiteboard projects a series of coloured rectangles onto the display 40 at predetermined locations in a predetermined sequence. By projecting the calibration markers as coloured rectangles, a camera fitted to the handheld user device 22 can be utilised in the calibration process.
With reference to Figure 10(a), there is illustrated the display 40 of the interactive electronic whiteboard 20, on which four rectangles 130a, 130b, 130c and 130d are shown, one in each corner of the display. The rectangles may be displayed in a sequence, one at a time, or may be displayed simultaneously.
In order to carry out the calibration process, the user must align the handheld user device 22 with each displayed rectangle 130a to 130d. As shown in Figure 10(b) the handheld user device 22 is aligned with each rectangle 130a to 130d in turn. Although the handheld device is shown in each of the four locations in Figure 10(b), the handheld device can only be in one position at a time. The handheld device may be required to be located at each position in a particular sequence, or may be positioned at each location in any sequence.
The handheld device 22 is preferably intended to be aligned with the displayed rectangles such that a camera on the handheld device is aligned with them and may capture an image of the displayed rectangle.
To aid the alignment process, the interactive electronic whiteboard may transmit the relative calibration coordinates, the rectangle dimensions, and the screen resolution to the handheld user device. The user will then position the handheld user device at each of the calibration points, and line up the coloured rectangles using guide lines provided by the application.
The first point sampled will be defined as the origin of the spatial axes. The remaining points will then be tracked using the built-in motion sensors, and define locations based on the origin. Once this exercise is complete, a suitable projection matrix can be built using the combined IWB calibration coordinates and the device coordinates. This projection matrix will form the basis of the mapping.
The handheld user device's on-board motion sensors keep track of the movement of the handheld device, and feed these movements back to the projection matrix. An inverse transform provides a 1:1 mapping back to the interactive electronic whiteboard.
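The mapping and its inverse can be illustrated for the simplest, axis-aligned case. The patent describes a general projection matrix built from all four calibration points; the sketch below reduces this to a scale-and-offset mapping between device space and whiteboard pixels, which is enough to show the forward mapping and the inverse transform. All coordinates and dimensions are invented examples.

```python
# Simplified, axis-aligned sketch of the calibration mapping described above.
# A full implementation would build a projection matrix (homography) from
# all four corner correspondences; values here are invented examples.

def build_mapping(device_origin, device_far, screen_w, screen_h):
    """Map device-space positions to whiteboard pixels.

    device_origin: device coordinates sampled at the first calibration point
                   (defined as the origin of the spatial axes).
    device_far:    device coordinates sampled at the opposite corner.
    """
    sx = screen_w / (device_far[0] - device_origin[0])
    sy = screen_h / (device_far[1] - device_origin[1])

    def to_screen(p):
        """Forward mapping: device position -> whiteboard pixel."""
        return ((p[0] - device_origin[0]) * sx, (p[1] - device_origin[1]) * sy)

    def to_device(q):
        """Inverse transform: whiteboard pixel -> device position."""
        return (q[0] / sx + device_origin[0], q[1] / sy + device_origin[1])

    return to_screen, to_device

# Origin sampled at the top-left rectangle; opposite corner 0.4 m right, 0.3 m down.
to_screen, to_device = build_mapping((0.0, 0.0), (0.4, 0.3), 1920, 1080)
centre = to_screen((0.2, 0.15))   # device held at the middle of the board
```

When the device is between two sampled corners, the mapping lands at the corresponding intermediate whiteboard pixel, which is the 1:1 correspondence the passage above describes.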
As noted above, the invention works on the principle of providing applications with the ability to communicate coordinate changes to each other to facilitate novel ways to reveal or trigger information. An exemplary application of the spatial positioning control and manipulation is now described.
This relates to an example where a teaching session is taking place in a classroom relating to human biology.
In such a teaching example, a lesson activity is running which involves the interactive electronic whiteboard 20, and one or more handheld user devices 22. As can be seen in Figure 11(a), an image of a male human torso 132 is being displayed on the display 40. Lesson details 134 are being displayed on the display 42 of the handheld device 22.
The applications running on the handheld device and the interactive electronic whiteboard communicate with each other, having been configured to allow respective access.
A spatial mapping service application will additionally be running. This may be running on the interactive electronic whiteboard 20, the handheld device 22, or the hub device 26.
Coordinates provided by the spatial mapping service will be sent to the electronic interactive whiteboard application controlling the display on the interactive electronic whiteboard, and it in turn will send commands to the application running on the handheld device. Thus the application running on the interactive electronic whiteboard is the parent (or master) application, and the application running on the handheld device 22 is the child (or slave) application.
In the described example, the application is configured such that when it is determined that the handheld device is positioned over the image on the display of the interactive electronic whiteboard, a command is sent to the handheld device to reveal the relevant rectangle of another image.
In the example of Figure 12(a) the handheld device displays the internal organs associated with the position of the torso with which the handheld device is currently overlaid. As the user moves the handheld device around, certain trigger points may be hit. These trigger points may invoke an interactive segment.
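The trigger-point behaviour may be sketched as a simple hit test on the device's mapped whiteboard position. The region name and coordinates below are invented for illustration; they do not come from the patent.

```python
# Illustrative sketch of trigger points: when the handheld device's mapped
# position enters a defined region, an interactive segment is invoked.
# Region names and coordinates are invented examples.

trigger_points = {
    "identify-organs": (800, 400, 1100, 700),  # left, top, right, bottom (pixels)
}

def hit_trigger(position):
    """Return the trigger hit by the mapped device position, if any."""
    x, y = position
    for name, (left, top, right, bottom) in trigger_points.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

hit = hit_trigger((960, 540))   # device overlaid on the torso's mid region
miss = hit_trigger((10, 10))    # device near a corner: no trigger
```

On a hit, the parent application would send the handheld device a command to invoke the corresponding interactive segment, such as the identification question of Figure 12(b).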
With reference to Figure 12(b), the handheld device has been located over a trigger point where the display asks the user to identify the displayed internal organs. The user then provides input on their device in response.
Figure 13 illustrates the principle of the present invention, showing in broad outline a first user device 20 and a second user device 22. In accordance with preferred embodiments, the first user device 20 is taken to be an interactive whiteboard, and the second user device 22 is taken to be a handheld user device.
As shown in Figure 13, each corner of the handheld user device 22 is mapped to a corner of the interactive whiteboard 20, as denoted by reference numerals 100a to 100d. As such the corners of the handheld device are "attached" to the interactive whiteboard, such that as the handheld device 22 is moved, the interactive whiteboard device detects that movement and maps it to an associated part of the display.
The mappings from the corners of the handheld user device 22 to the interactive whiteboard device 20, as denoted by reference numerals 100a to 100d, effectively constitute "strings" attached between the devices, such that any movement of the handheld device 22 is mapped to the display of the interactive whiteboard 20.
There has thus been described a collaborative interactive system in accordance with various embodiments. One skilled in the art will appreciate that different aspects of different embodiments may be combined in order to achieve the present invention. The present invention is not limited to aspects of the foregoing embodiments as set out. One skilled in the art will appreciate that the invention may be implemented in different ways. The protection afforded by the present invention is set out in the appended claims.

Claims (10)

  1. A method of providing a collaborative display, comprising: displaying an image on a first display associated with a first computing device; providing a second computing device having a second display; mapping the coordinates of the second display to the first display; in dependence on a current location of the second display relative to the first display, controlling the display of content on the second display.
  2. The method of claim 1 wherein the step of controlling the display of content on the second display is dependent on the three-dimensional position of the second display.
  3. The method of claim 1 or claim 2 wherein when the second display is located within a predetermined distance of the first display and orientated in the same orientation as the first display such that the coordinates of the second display coincide with at least part of the coordinates of the first display, the content displayed on the second display is related to the content at those coordinates displayed on the first display.
  4. The method of any preceding claim further comprising calibrating the second display with the first display.
  5. The method of any preceding claim wherein the first and second computing devices are connected by a network, the first and second displays being associated with the first and second applications running on the first and second computer devices, where access to an application running on a device by another device is controlled by defining an access setting for the application running on the device.
  6. A system for providing a collaborative display, the system adapted to: display an image on a first display associated with a first computing device; provide a second computing device having a second display; map the coordinates of the second display to the first display; in dependence on a current location of the second display relative to the first display, control the display of content on the second display.
  7. The system of claim 6 wherein the step of controlling the display of content on the second display is dependent on the three-dimensional position of the second display.
  8. The system of claim 6 or claim 7 wherein when the second display is located within a predetermined distance of the first display and orientated in the same orientation as the first display such that the coordinates of the second display coincide with at least part of the coordinates of the first display, the content displayed on the second display is related to the content at those coordinates displayed on the first display.
  9. The system of any one of claims 6 to 8 further comprising calibrating the second display with the first display.
  10. The system of any one of claims 6 to 9 wherein the first and second computing devices are connected by a network, the first and second displays being associated with the first and second applications running on the first and second computer devices, where access to an application running on a device by another device is controlled by defining an access setting for the application running on the device.
GB1220647.0A 2012-11-16 2012-11-16 Collaborative interactive devices with display content dependent on relative position Withdrawn GB2507997A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
GB1220647.0A GB2507997A (en) 2012-11-16 2012-11-16 Collaborative interactive devices with display content dependent on relative position
EP13792360.3A EP2920681A1 (en) 2012-11-16 2013-11-15 Collaborative interactive devices
PCT/EP2013/073985 WO2014076256A1 (en) 2012-11-16 2013-11-15 Collaborative interactive devices
US14/443,232 US20150331489A1 (en) 2012-11-16 2013-11-15 Collaborative interactive devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1220647.0A GB2507997A (en) 2012-11-16 2012-11-16 Collaborative interactive devices with display content dependent on relative position

Publications (2)

Publication Number Publication Date
GB201220647D0 GB201220647D0 (en) 2013-01-02
GB2507997A true GB2507997A (en) 2014-05-21

Family

ID=47521281

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1220647.0A Withdrawn GB2507997A (en) 2012-11-16 2012-11-16 Collaborative interactive devices with display content dependent on relative position

Country Status (4)

Country Link
US (1) US20150331489A1 (en)
EP (1) EP2920681A1 (en)
GB (1) GB2507997A (en)
WO (1) WO2014076256A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10304037B2 (en) 2013-02-04 2019-05-28 Haworth, Inc. Collaboration system including a spatial event map
US11861561B2 (en) 2013-02-04 2024-01-02 Haworth, Inc. Collaboration system including a spatial event map
JP6439801B2 (en) * 2014-08-26 2018-12-19 株式会社リコー Session control system, communication system, session control method, and program
WO2016179401A1 (en) 2015-05-06 2016-11-10 Haworth, Inc. Virtual workspace viewport follow mode and location markers in collaboration systems
CN104834448B (en) * 2015-05-27 2018-12-14 联想(北京)有限公司 A kind of control method and electronic equipment
US10255023B2 (en) 2016-02-12 2019-04-09 Haworth, Inc. Collaborative electronic whiteboard publication process
US11934637B2 (en) 2017-10-23 2024-03-19 Haworth, Inc. Collaboration system including markers identifying multiple canvases in multiple shared virtual workspaces
US11126325B2 (en) * 2017-10-23 2021-09-21 Haworth, Inc. Virtual workspace including shared viewport markers in a collaboration system
WO2020176517A1 (en) 2019-02-25 2020-09-03 Haworth, Inc. Gesture based workflows in a collaboration system
US11750672B2 (en) 2020-05-07 2023-09-05 Haworth, Inc. Digital workspace sharing over one or more display clients in proximity of a main client

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070273609A1 (en) * 2006-05-25 2007-11-29 Fujifilm Corporation Display system, display method, and display program
US20090213032A1 (en) * 2008-02-21 2009-08-27 Newport William T Computer System Having Shared Display Devices
US20100287485A1 (en) * 2009-05-06 2010-11-11 Joseph Bertolami Systems and Methods for Unifying Coordinate Systems in Augmented Reality Applications
US20100287513A1 (en) * 2009-05-05 2010-11-11 Microsoft Corporation Multi-device gesture interactivity
US20110050544A1 (en) * 2009-08-25 2011-03-03 Brother Kogyo Kabushiki Kaisha Image display system including a plurality of electronic paper display devices

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7313120B2 (en) * 2003-09-16 2007-12-25 Nokia Corporation Application control in peer-to-peer ad-hoc communication networks
US20130080932A1 (en) * 2011-09-27 2013-03-28 Sanjiv Sirpal Secondary single screen mode activation through user interface toggle


Also Published As

Publication number Publication date
GB201220647D0 (en) 2013-01-02
US20150331489A1 (en) 2015-11-19
WO2014076256A1 (en) 2014-05-22
EP2920681A1 (en) 2015-09-23

Similar Documents

Publication Publication Date Title
GB2507997A (en) Collaborative interactive devices with display content dependent on relative position
US10642567B2 (en) Multiplatform based experience generation
US9852546B2 (en) Method and system for receiving gesture input via virtual control objects
Pradhan et al. Websigns: Hyperlinking physical locations to the web
US20180096450A1 (en) Shared virtual reality
JP2020039880A (en) Information processing method, terminal, and computer storage medium
EP2579128B1 (en) Portable device, virtual reality system and method
US8593535B2 (en) Relative positioning of devices based on captured images of tags
US20140062874A1 (en) Client device orientation
WO2009102138A2 (en) Tabletop, mobile augmented reality system for personalization and cooperation, and interaction method using augmented reality
WO2014119098A1 (en) Information processing device, terminal device, information processing method, and programme
EP3547083B1 (en) Information processing program, information processing method, and information processing system
US20120327112A1 (en) Multi-Modal, Geo-Tempo Communications Systems
US9313290B2 (en) Data transfer between devices
JP2012168646A (en) Information processing apparatus, information sharing method, program, and terminal device
CN108762482A (en) Data interactive method and system between a kind of large screen and augmented reality glasses
EP2946307A2 (en) Appliance control system and method
WO2019117583A1 (en) Spatial messaging and content sharing method, and system therefor
GB2472406A (en) Controlling user input in a computers system (e.g. an interactive learning system)
JP5661835B2 (en) Terminal device, display processing method, and display processing program
WO2014076255A1 (en) Network based collaborative interactive activity
CN102770836B (en) Method for controlling motions of an object in a 3-dimensional virtual environment
US11189097B2 (en) Simulated reality transition element location
US11430193B1 (en) Resilient interdependent spatial alignment to improve and maintain spatial alignment between two coordinate systems for augmented reality and other applications
JP2019020849A (en) Server device, electronic content management system and control method

Legal Events

Date Code Title Description
S30Z Assignments for licence or security reasons

Free format text: APPLICANT PROMETHEAN LIMITED SECURITY AGREEMENT BURDALE FINANCIAL LIMITED

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)