WO2008007260A2 - NFC enabled pointing with a mobile device - Google Patents


Info

Publication number
WO2008007260A2
WO2008007260A2 (PCT/IB2007/052293)
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
md1
near field
communication unit
system
Prior art date
Application number
PCT/IB2007/052293
Other languages
French (fr)
Other versions
WO2008007260A3 (en)
Inventor
Hans M. B. Boeve
Teunis J. Ikkink
Kornelis J. Wouda
Original Assignee
Nxp B.V.
Priority date
Filing date
Publication date
Priority to EP06115937.2
Application filed by Nxp B.V.
Publication of WO2008007260A2
Publication of WO2008007260A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02 Terminal devices
    • H04W88/06 Terminal devices adapted for operation in multiple networks or having at least two operational modes, e.g. multi-mode terminals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W84/00 Network topologies
    • H04W84/02 Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/10 Small scale networks; Flat hierarchical networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02 Terminal devices

Abstract

A system comprises a communication terminal (CT1) which has a near field communication unit (10) for near field communication with a mobile device (MD1) when in range of the near field communication unit (10). An output device (OD1; OD2) is positioned outside the range of the near field communication unit (10) to provide output to a user of the mobile device (MD1). A controller (11) controls the output device (OD1; OD2) to provide the output dependent on an orientation (D1) of the mobile device (MD1).

Description

NFC ENABLED POINTING WITH A MOBILE DEVICE

The invention relates to a system comprising a communication terminal, an output device and a controller. The invention further relates to a method of communicating.

The recently introduced Near Field Communication (further also referred to as NFC) technology evolved from a combination of contactless identification and connectivity technologies. It offers an easy, intuitive way to connect devices and share data between them by just bringing them close together. Operating over short distances only, and not requiring any configuration by the user, NFC is the ideal way for consumers to interact with the connected world around them via their cell phones. NFC is standardized under ECMA-340 (ISO/IEC 18092) and operates at 13.56 MHz. It is fully compatible with the Philips MIFARE and Sony FeliCa contactless smart card platforms.

NFC is a proximity-based, ultra low-power wireless technology which allows information to be transmitted between devices over a few centimeters. Therefore, devices must be placed intentionally close together in order to communicate. Although the transactions are inherently secured due to the short distance between the communicating devices, even more secure transactions can be guaranteed by combining NFC with a SIM card or a smart card controller like Philips' SmartMX. This allows the cell phone to perform, for example, 3-DES encryption. Known applications of NFC include: tapping two NFC-enabled phones together to start a conference call or to swap phone numbers instantly; holding a phone against an interesting DVD while shopping to see a short trailer; or touching a poster for a concert with the phone to put a reminder in the phone or to buy tickets.

However, the applications involved with NFC are limited due to the inherently proximity-based operation. It is an object of the invention to provide an NFC system in which a mobile device is used, but in which the output provided to the user in response to NFC activity with the mobile device need not be in the proximity of the mobile device.

A first aspect of the invention provides a system as claimed in claim 1. A second aspect of the invention provides a method as claimed in claim 16. Advantageous embodiments are defined in the dependent claims.

A system in accordance with the first aspect of the invention comprises a communication terminal with a near field communication unit for near field communication with a mobile device when the mobile device is in range of the near field communication unit. Usually, the range wherein the mobile device and the near field communication unit are able to communicate is less than 20 cm. The system further comprises an output device positioned outside the range of the near field communication unit. Thus, the distance between the communication terminal and the output device is relatively large. A mobile device which is sufficiently near to the communication terminal to start NFC is too far away from the output device to start NFC with the output device. In fact, usually there is no need for the output device to have NFC capability. The output device provides output to the user of the mobile device, and a controller controls the output device to provide the output dependent on an orientation of the mobile device. For example, the output device may provide visual and/or audible feedback to the user. In this system the mobile device, which is manipulated by the user, usually by keeping it in a hand, is brought near to the communication terminal such that the NFC starts. The freedom to arbitrarily orient the mobile device is used to control the output dependent on the orientation of the mobile device. The mobile device is, for example, a mobile phone or a PDA (personal digital assistant). Such a system is often referred to as a host system. Mobile devices which provide orientation information are known as such. For example, the Bluewand is a small, pen-like device that can be used to control Bluetooth enabled devices by hand movements. A six-axis accelerometer and gyroscope system detects the pen's full orientation and movement in space. A Bluetooth wireless link transmits this data to any cooperating device which then executes associated commands.
Possible applications include remote control of various consumer electronics, access control to machines and buildings, a virtual laser pointer, and mobile gaming.

In an embodiment as claimed in claim 2, the output device is a display apparatus which displays display information as displayed information. The displayed information may be displayed on a screen of the display apparatus. Alternatively, the displayed information may be displayed on a projection screen or a wall. The display information may be an image, a film or any other visual information. The controller receives orientation information on the orientation of the mobile device via the communication terminal. For example, the orientation information may be the heading or azimuth of the mobile device, a tilt angle in one or more dimensions, a roll, or any combination thereof. Alternatively, the orientation information may be raw data, for example indicating the orientation of the mobile device with respect to the earth's surface, in which case the controller has to process this raw data to obtain the orientation of the mobile device.

The controller generates a visual indicator which indicates a position in the displayed information dependent on the orientation of the mobile device. For example, a marker is shown at a position in the display which is selected by changing the orientation of the mobile device or by keeping the mobile device in a particular orientation. Alternatively, the orientation of the mobile device may select a particular area in the displayed information. This selection is made visible to the user, for example by highlighting the area. The highlighting may be, for example: increasing the brightness, changing the color, drawing a boundary around the area, or any other manner of indicating to the user which position in the displayed information corresponds to the actual orientation of the mobile device.

In an embodiment as claimed in claim 3, the visual indicator indicates a position in the displayed information defined by a pointing direction of the mobile device to a particular area of the displayed information. Now, the mobile device which has a pointing direction is pointing towards the displayed information, which is the most intuitive manner to use the orientation of the mobile device to indicate a position on the displayed information. Usually, the mobile device is shaped to clearly indicate to the user what its pointing direction is. For example, the mobile device may have an elongated shape which has a pointing axis.

In an embodiment as claimed in claim 4, the controller retrieves a personal identifier of the mobile device to allocate a visual indicator to this mobile device which has an appearance selected to uniquely distinguish it from a further visual indicator associated with another mobile device communicating with the communication terminal or with another communication terminal of the system. The allocation of different indicators to different mobile devices allows different users to communicate with the system at the same time. Each one of the users has a particular indicator which is recognizable to the user. If the indicators are markers, the shape and/or color of the markers may differ. If the indicator is a highlighting of a particular area, for example representing a button, different colors may be used for different users.
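
The allocation described above can be sketched in a few lines of Python. This is purely illustrative and not part of the patent: the class name, the marker palette and the identifier strings are all assumptions.

```python
# Illustrative sketch: allocate a visually distinct marker to each mobile
# device based on its personal identifier, so that several users can point
# at the same displayed information simultaneously. The palette is made up.

MARKER_STYLES = [
    ("circle", "red"),
    ("square", "green"),
    ("triangle", "blue"),
    ("diamond", "yellow"),
]

class IndicatorAllocator:
    def __init__(self):
        self._by_device = {}  # personal identifier -> (shape, color)

    def indicator_for(self, personal_id):
        """Return the indicator for a device, allocating one on first use."""
        if personal_id not in self._by_device:
            style = MARKER_STYLES[len(self._by_device) % len(MARKER_STYLES)]
            self._by_device[personal_id] = style
        return self._by_device[personal_id]

alloc = IndicatorAllocator()
print(alloc.indicator_for("MD1"))  # → ('circle', 'red')
print(alloc.indicator_for("MD2"))  # → ('square', 'green')
print(alloc.indicator_for("MD1"))  # same device keeps its indicator
```

A fixed palette cycles when more devices log on than there are styles; a real system would presumably also free an indicator when a device logs off.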

In an embodiment as claimed in claim 5, the output device is a sound transducer, such as for example a loudspeaker. The controller receives information on the orientation of the mobile device via the communication terminal. The controller activates the sound transducer dependent on the orientation of the mobile device.

In an embodiment as claimed in claim 6, the sound transducer is activated when the mobile device is pointing towards the sound transducer. For example, a story which elucidates what is displayed starts when the user points at the loudspeaker for a predetermined period of time. The loudspeaker may be positioned in the neighborhood of displayed information such that the sound is automatically associated with the displayed information. Alternatively, the loudspeaker may be part of a headphone.

Alternatively, the orientation of the mobile device may be used to control lighting. For example, the orientation of the mobile device may be used to dim a lamp, or to control the color of a lamp. For example, in solid-state lighting the color of an LED lamp or of a pixelated wall can be adjusted. In a pixelated wall, the pointed-at position can, for example, be indicated by blinking of the LED or by changing the color of the LED. Alternatively, the orientation of the mobile device may be used to control the volume/pitch/balance of the sound, the intensity/color of the light (or pixel), the temperature of the room, the zoom function of a camera, or any other conceivable quantity. Such quantities may be controlled with a 1D rotation around the pointing axis of the mobile device.
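
Such 1D control can be sketched as a simple mapping from the roll angle to a normalized level. The angle range, clamping behavior and function name below are illustrative assumptions, not taken from the patent:

```python
# Illustrative sketch: map a 1D rotation around the pointing axis (roll)
# to a controlled quantity such as volume, brightness, or zoom level.
# The +/-90 degree range is an assumed, made-up convention.

def roll_to_level(roll_deg, lo=-90.0, hi=90.0):
    """Map a roll angle in degrees to a level in [0.0, 1.0], clamping."""
    roll = max(lo, min(hi, roll_deg))
    return (roll - lo) / (hi - lo)

print(roll_to_level(0.0))    # → 0.5 (neutral wrist position, mid level)
print(roll_to_level(-90.0))  # → 0.0
print(roll_to_level(120.0))  # → 1.0 (out-of-range input is clamped)
```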

In a further embodiment, the output device is a user interface. Information from the system, as referred to by the user interface, is then transmitted to the mobile device to be displayed to the user on an output element of the mobile device itself. Such an output element is for instance the display. In one application, the user interface is a screen with names or options. More detailed information regarding the names or options is then transmitted to the mobile device and displayed on the display. This allows the user to read the information more comfortably. This may be handy for disabled or elderly people, but also in relatively dark environments, or in the evening and at night. The display will then light up so as to be correctly visible. In one preferred modification hereof, with the list of names, the user may select one of the names to set up a wireless connection, such as a telephone call. The user may then request to open a door near the user interface or may request other information. In another application, a selection in the user interface will start an action. The result of this will be shown on the output element of the mobile device. This may be a telephone call from a helpdesk to the mobile device. This may further be more detailed information on the way to find a selected location within an office. In a further application, the user interface may be a security feature. For instance, after identification with the NFC protocol, the user may be requested to point with his mobile device to a selected area.

In an embodiment as claimed in claim 7, after a communication is established with the near field communication unit, the controller determines a pointing direction of the mobile device based on the position of the near field communication unit and the orientation of the mobile device. It has to be noted that the (relative) positions of the communication terminal and the output device are system defined and thus known. Because, at least at the start of the communication, the mobile device must be positioned near the communication terminal, the position of the mobile device with respect to the output device is also known. Consequently, it is not required to know the position of the mobile device; it suffices to know its angle with respect to the earth's surface, or with respect to the earth's magnetic field, or with respect to the earth's gravity field, or any combination thereof. This embodiment has the advantage that the mobile device need not be able to sense its position, nor need it communicate its position to the communication terminal.
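
The patent states this geometry in words only; the following Python sketch makes the idea concrete under assumed conventions (the coordinate frame, degree units and all names are illustrative, not from the disclosure). The mobile device is taken to sit at the known terminal position, and the pointed-at location is the intersection of its pointing ray with a known vertical display plane:

```python
import math

# Illustrative sketch of the claim-7 idea: the pointed-at location is the
# intersection of the pointing ray, anchored at the known NFC terminal
# position, with the known display plane. Axes (assumed): x right, y towards
# the display, z up; azimuth 0 points straight at the display, elevation is
# the upward tilt of the pointing axis.

def pointing_position(terminal_pos, azimuth_deg, elevation_deg, display_y):
    """Intersect the pointing ray with the vertical display plane y = display_y."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Unit direction vector of the pointing axis.
    d = (math.sin(az) * math.cos(el), math.cos(az) * math.cos(el), math.sin(el))
    if d[1] <= 0:
        return None  # pointing away from the display plane
    t = (display_y - terminal_pos[1]) / d[1]
    x = terminal_pos[0] + t * d[0]
    z = terminal_pos[2] + t * d[2]
    return (x, z)  # horizontal and vertical position on the display

# Pointing straight ahead from the terminal hits the plane dead centre.
print(pointing_position((0.0, 0.0, 0.0), 0.0, 0.0, 2.0))  # → (0.0, 0.0)
```

Tilting the device up by 45 degrees moves the hit point up by exactly the plane distance, which matches the intuition that only angles, not the device position, need to be sensed.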

In an embodiment as claimed in claim 8, the host system further comprises a further communication unit for communication with the mobile device. A range of the further communication unit is larger than the range of the near field communication unit. This further communication unit may provide Bluetooth, WiFi or any other wireless communication over a larger distance than the NFC is able to cover. This has the advantage that after the NFC communication has been initiated, the communication between the mobile device and the host system may be taken over by a longer range communication, allowing the user to move more freely. It is cumbersome for the user to keep the mobile device for a long time in the direct neighborhood of the communication terminal such that the NFC is kept intact. People tend to move their arms to a resting position after the NFC is initiated. Communication is now upheld by using the further communication unit. This further communication unit may or may not be located in the communication terminal. It is possible to use a single further communication unit to support multiple NFC communication terminals. The longer range communication may be established after it is detected that NFC is no longer possible, but this may have the disadvantage that the communication is interrupted.

In an embodiment as claimed in claim 9, the controller switches over the communication between the host system and the mobile device from the near field communication unit to the further communication unit after a near field communication is established. Thus, at least during the start of the NFC, the mobile device needs to be sufficiently near to the communication terminal. Once the communication is started, the longer range communication takes over from the NFC, almost immediately after the NFC has started. Thus, when the mobile device moves away from the communication terminal and NFC is no longer possible, the communication goes on because it was already active via the longer range communication unit.

In an embodiment as claimed in claim 10, the controller monitors the average orientation of the mobile device while its position changes during pointing towards the output device. For example, if a user moves the mobile device away from the communication terminal while still pointing at the same area of the displayed image, it is assumed that the user is still pointing at the same location of the displayed image. Consequently, it is possible to correct for the changing orientation of the mobile device due to its position change. This monitoring mode may be activated when it is detected that NFC is no longer possible.

In an embodiment, the mobile device comprises an electronic compass as the orientation sensor which indicates the orientation of the mobile device. An example of such an electronic compass is the "TruePoint Compass Module" of Honeywell, California, USA. A description of this electronic compass can be found on the website of Honeywell. The TruePoint Compass Module is a true three axis digital compass module that can be used in any orientation. Data from the three silicon magnetometers and three accelerometers are combined to provide compass azimuth as well as pitch and roll angle. Consequently, such a compass is well suited to provide the orientation information. Another example of such an electronic compass comprises a three-dimensional accelerometer and a two-dimensional magnetometer as disclosed in PCT/IB2006/051314 (PH000613) and PCT/IB2006/051317 (PH000319). The fact that the magnetometer need not be a three-dimensional sensor but can be a two-dimensional sensor is a great advantage, because two-dimensional magnetometer sensors can be produced more easily and at lower cost, and can be more durable and of a smaller size. These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
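
The patent gives no formulas for combining the magnetometer and accelerometer data. The following is a standard tilt-compensated compass computation (a common textbook formulation, not taken from the cited Honeywell module or PCT applications), under assumed conventions: body axes x forward, y left, z up, and an accelerometer that reads the gravity reaction.

```python
import math

# Illustrative sketch of a tilt-compensated electronic compass: pitch and
# roll are estimated from the accelerometer, the magnetometer reading is
# de-rotated accordingly, and the heading is the angle of the horizontal
# field components. Conventions and numbers below are assumptions.

def heading_from_sensors(accel, mag):
    """Tilt-compensated compass heading in degrees, clockwise from north.

    accel: gravity reaction in body axes (x forward, y left, z up), any scale.
    mag:   magnetic field measured in the same body axes.
    """
    ax, ay, az = accel
    mx, my, mz = mag
    # Pitch and roll from the measured gravity direction.
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    # De-rotate the magnetometer reading by roll and pitch to recover the
    # horizontal field components, then take the heading angle.
    xh = mx * math.cos(pitch) + (my * math.sin(roll) + mz * math.cos(roll)) * math.sin(pitch)
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(yh, xh)) % 360.0

# Level device pointing north: horizontal field 0.2, downward component 0.4.
print(heading_from_sensors((0.0, 0.0, 1.0), (0.2, 0.0, -0.4)))  # → 0.0
```

With only a two-dimensional magnetometer, as in the cited PCT applications, the missing vertical field component has to be reconstructed from the tilt and the known local field, which is precisely why the accelerometer data is needed.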

The invention relates in another aspect to a mobile device which is suitable for use in a system as claimed in claim 1. According to the invention, the mobile device comprises means for near field communication with a near field communication unit. The mobile device further comprises an electronic compass which indicates an orientation of the mobile device. The electronic compass is herein calibrated with the help of position data obtained from the near field communication unit. This calibration may be achieved by the controller in the system. Thereafter, information to the user may be provided on any output device of the mobile device itself. The user may send information to the system with any input means of his mobile device. Such information is advantageously communicated between the communication terminal and the mobile device using another communication protocol, such as W-LAN, Bluetooth and the like.

In the drawings:

Fig. 1 schematically shows a setup comprising communication terminals for NFC, mobile devices which communicate with the communication terminals to control output devices to provide output dependent on an orientation of the mobile devices, and Fig. 2 shows a block diagram of a mobile device and a communication terminal.

It should be noted that items which have the same reference numbers in different Figures, have the same structural features and the same functions, or are the same signals. Where the function and/or structure of such an item has been explained, there is no necessity for repeated explanation thereof in the detailed description.

Fig. 1 schematically shows a setup comprising communication terminals for NFC, and mobile devices which communicate with the communication terminals to control output devices to provide output dependent on an orientation of the mobile devices. The system shown comprises three communication terminals CT1, CT2, and CT3. In the example shown, the output devices comprise a display apparatus OD1 with a display screen on which the display information DI1 is displayed, a projector (not shown) which projects the display information DI2 on a wall of a room, and two speakers OD2 and OD3. Alternatively, the display could be formed by a video-wall comprising several display screens or projectors which together create a large image. For example, the display information may be sales articles, advertisements, music play lists, polls, game input, and museum information. A mobile device MD1, MD2, MD3 is in NFC (near field communication) with the communication terminal CT1, CT2, CT3, respectively. A first system comprises the communication terminals CT1, CT2, and the output devices OD1, OD2, OD3. A second system comprises the communication terminal CT3 and the projector. In both systems the relative positions of the communication terminals CT1 to CT3 and the output devices OD1 to OD4, and thus the positions of the displayed information DI1, DI2, are known. In the first system, a first user holding the mobile device MD1 points with the mobile device to a position on the screen of the display apparatus OD1 which is indicated by a marker 30. Because the mobile devices MD1, MD2 are near to the communication terminals CT1, CT2, respectively, their positions with respect to the displayed information DI1 are well known. It suffices to communicate the orientation of the mobile devices MD1, MD2 to the communication terminals CT1, CT2, respectively. From the communicated orientation and the assumed position of the mobile devices MD1, MD2, the communication terminals CT1, CT2 are able to determine where the mobile devices MD1, MD2 are pointing at. The communication terminals CT1, CT2 insert a marker 30, 31 in the display information DI (see Fig. 2) to display the marker 30, 31 in the displayed information DI1 at the respective positions the mobile devices MD1, MD2 are pointing at.

These markers 30, 31 are used as feedback to the user on where he is pointing. A particular action may be taken, for example, by pointing longer than a predetermined period of time at the same area. For example, if the marker is within a displayed button, the action defined by this button will be performed. Alternatively, the user may press a button on the mobile device MD1 to indicate that the action related to the area pointed at should be performed.
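
The dwell-based selection described above (pointing at the same area longer than a predetermined time) can be sketched as follows. The class, the threshold value and the re-arm behavior are illustrative assumptions:

```python
# Illustrative sketch: an action fires when the marker stays on the same
# target area longer than a predetermined dwell time. Times are in seconds.

DWELL_SECONDS = 1.5  # assumed threshold, for illustration only

class DwellSelector:
    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self._target = None
        self._since = None

    def update(self, target, now):
        """Feed the currently pointed-at target; return it once dwell elapses."""
        if target != self._target:
            self._target, self._since = target, now
            return None
        if target is not None and now - self._since >= self.dwell:
            self._since = now  # re-arm so the action does not repeat at once
            return target
        return None

sel = DwellSelector()
print(sel.update("button", 0.0))  # → None (just arrived on the target)
print(sel.update("button", 2.0))  # → button (dwell time exceeded)
```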

In the same manner, the user may point the mobile device MD1 to one of the loudspeakers OD2, OD3 to indicate that the sound should start or stop playing.

It is not essential that the two mobile devices MD1, MD2 are each communicating with associated respective communication terminals CT1, CT2 which are at different positions. More than one NFC transceiver may be present in the same communication terminal CT1, or two or more mobile devices MD1, MD2 can be sufficiently near to a single NFC transceiver and thus have NFC with the single communication terminal CT1. A single communication terminal CT1 may interact with multiple output devices. Although very intuitive, it is not required that the mobile device MD1 should point towards the position at which the marker should be inserted. For example, it might be assumed that, independently of the orientation of the mobile device MD1 when establishing the NFC, the marker must be at a predetermined position on the screen. All changes of the orientation of the mobile device MD1 with respect to its orientation when establishing NFC will move the marker in the indicated direction.

In the second system, the projector projects the display information DI2 which shows buttons B1 to B4. The communication terminal CT3 determines the position where the mobile device MD3 is pointing at and highlights one of the buttons B1 to B4 if the position where the mobile device MD3 is pointing at is within the area covered by this one of the buttons B1 to B4. Again, although very intuitive, it is not required that the mobile device MD3 should point towards the position at which the button should be highlighted. For example, it might be assumed that, independently of the orientation of the mobile device MD3 when establishing the NFC, the button B1 is highlighted. All changes of the orientation of the mobile device MD3 will be interpreted as starting from the center of the button B1.
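
Highlighting one of the buttons B1 to B4 amounts to a hit test of the pointed-at position against the button areas. A minimal sketch, with made-up screen rectangles (the layout and function name are assumptions, not from the patent):

```python
# Illustrative sketch: return the button whose screen area contains the
# pointed-at position (x, y). Rectangles are (x, y, width, height) in
# made-up screen coordinates.

BUTTONS = {
    "B1": (0, 0, 100, 50),
    "B2": (120, 0, 100, 50),
    "B3": (0, 70, 100, 50),
    "B4": (120, 70, 100, 50),
}

def highlighted_button(x, y, buttons=BUTTONS):
    """Return the name of the button containing (x, y), or None."""
    for name, (bx, by, bw, bh) in buttons.items():
        if bx <= x < bx + bw and by <= y < by + bh:
            return name
    return None

print(highlighted_button(10, 10))    # → B1
print(highlighted_button(130, 80))   # → B4
print(highlighted_button(110, 10))   # → None (in the gap between buttons)
```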

Fig. 2 shows a block diagram of a mobile device and the host system. The mobile device MD1 comprises a near field communication unit 20, a communication unit 21, an orientation sensor 22 and personal identifier storage 23. The host system HS comprises a near field communication unit 10, a communication unit 12, and a controller 11. Although Fig. 2 shows the near field communication unit 10, the communication unit 12, and the controller 11 as physically located in the communication terminal CT1, other configurations are possible wherein the controller 11 and/or the communication unit 12 are not physically located in the communication terminal CT1. As indicated in Fig. 1, the host system may support multiple near field communication units 10, located either in the same communication terminal CT1, or in different communication terminals CT1, CT2, CT3. It is assumed, but not necessary, that a single communication unit 12 and a single controller 11 are sufficient for the host system to be able to communicate with the different mobile phones that may be logged onto the host system.

The near field communication unit 20 of the mobile device MD1, which for example is a mobile phone or a PDA, provides a near field communication path CP1 with the near field communication unit 10 of the communication terminal CT1. Usually, the near field communication path CP1 is activated automatically when the mobile device MD1 is moved sufficiently near to the communication terminal such that both elements are in each other's near field communication range. Usually, this relatively short near field range is in the order of centimeters, for example 20 cm. Optionally, the mobile device MD1 may comprise a longer range communication unit 21. This communication unit 21, together with the optional communication unit 12 in the communication terminal CT1, provides a communication path CP2 with a longer range than the communication path CP1. The addition of the longer range communication path CP2 has the advantage that once the communication is started via the near field communication path CP1, by simply bringing the mobile device MD1 sufficiently close to the communication terminal CT1, the communication can be upheld via the communication path CP2 even when the mobile device MD1 moves out of the range of the near field communication. For example, the communication path CP2 may be provided according to the Bluetooth or WiFi standard. However, any wireless communication protocol can be implemented.

The orientation sensor 22 provides orientation information SD on the orientation of the mobile device MD1 to the relatively short range near field communication unit 20 and/or the longer range communication unit 21. The orientation information SD may be provided in many ways. For example, the heading and/or vertical tilt may be provided, or a vector defined by three values in an orthogonal coordinate system related to the earth's gravity and magnetic field may be provided. The combination of the azimuth and the elevation information allows for pointing with two degrees of freedom, e.g. on a display represented by a two-dimensional X-Y coordinate frame. The orientation information SD may be raw or averaged sensor data. It suffices that the mobile device MD1 supplies orientation information only; it is not required that the orientation information SD provides the position of the mobile device MD1. The position is implicitly known when starting the near field communication.

The optional personal identifier storage 23 stores a unique identifier PI which uniquely identifies the mobile device MD1. By forwarding this unique identifier PI to the host system via the near field communication path CP1 and/or the communication path CP2, the host system is aware with which mobile device MD1 it is communicating.

The near field communication unit 10 supplies the received orientation information OI1 to the controller 11. The longer range communication unit 12 supplies the received orientation information OI2 to the controller 11. Usually, the received orientation information OI1 and OI2 is identical to the orientation information SD. However, the received orientation information OI1 and OI2 may be a coded version of the orientation information SD which has to be decoded by the controller 11. The coding of the orientation information may be required for optimal communication via the communication paths CP1 and CP2. Dependent on the mode of use of the mobile device MD1, either the communication unit 10 or the communication unit 12 may be active, or both communication units 10 and 12 are active. During start up, when the mobile device MD1 is entering near field communication with the communication terminal CT1, the near field communication unit 10 should be active until the near field communication path CP1 is active. Then, the communication units 12 and 21 may become active to take over the communication. It is also possible to activate the communication units 12 and 21 only when it is detected that the signal strength in the near field communication path CP1 is below a predetermined value, or when it is detected that the mobile device MD1 is out of the near field communication range by a breakdown of the communication path. The controller 11 controls the on/off switching of the communication units 10 and 12 with the control signals CS1, CS2, respectively. The NFC is used to authenticate the mobile device, or in fact the mobile device user, after which a unique, preferably secure data connection, for example a Bluetooth connection, is set up. Alternatively, the NFC unit 10 may be continuously active to be able to detect any approaching mobile devices MD1.
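
The two switching policies described above can be sketched as follows. This is a hypothetical illustration; the RSSI threshold, state names and method names are assumptions, not part of the patent:

```python
# Illustrative sketch of the controller's link management: NFC brings the
# device on-line; the longer range unit is enabled either immediately after
# the NFC path is established, or only once the NFC signal weakens, so that
# moving the arm to a resting position does not interrupt the session.

RSSI_THRESHOLD = -60  # dBm, made-up threshold for illustration

class LinkController:
    def __init__(self):
        self.nfc_active = True          # NFC unit stays on to detect devices (CS1)
        self.long_range_active = False  # longer range unit off until needed (CS2)

    def on_nfc_established(self):
        # Policy 1: take over almost immediately after NFC starts.
        self.long_range_active = True

    def on_nfc_rssi(self, rssi_dbm):
        # Policy 2: enable the longer range path only once NFC weakens.
        if rssi_dbm < RSSI_THRESHOLD:
            self.long_range_active = True

lc = LinkController()
lc.on_nfc_rssi(-70)
print(lc.long_range_active)  # → True (signal dropped below the threshold)
```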
The controller 11 generates display information DI which is supplied to the output device OD1 to display the display information as displayed information DI1. For example, the display information DI may be a computer generated image, or a photograph or film.

The controller 11 uses the received orientation information OI1 or OI2 to identify the pointing position indicated by the mobile device MD1. For example, the identification of the pointing position may be a marker 30 at the position in the displayed information DI1. For example, if an image of a group of animals is shown, the user may manipulate the orientation of the mobile device MD1 such that the marker 30 is on an animal of which the user would like to have more information. If the marker 30 is held longer than a predetermined time on this animal, a next image is shown giving more details on this animal. Instead of holding the marker 30 a predetermined time on a same object, the mobile device MD1 may have a user selector, such as for example a button, which when activated by the user indicates that a certain object is selected. Alternatively, the image may have predefined areas, such as for example buttons, which are highlighted by the controller 11 when it is detected from the received orientation information OI1, OI2 that the mobile device is held in an orientation which indicates one of the predefined areas.

In an intuitive embodiment, the mobile device MDl has an elongated form with an axis which is defined as the pointing direction of the mobile device MDl. The position pointed at in the displayed information DIl is the intersection of this axis and the displayed information DIl.
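This intersection can be computed with standard ray-plane geometry. The sketch below assumes the display is modelled as a plane given by a point and a normal vector, and that the device position and pointing direction are expressed in a common coordinate frame; all names are illustrative.

```python
# Illustrative sketch: intersect the device's pointing axis with the
# plane of the displayed information. The plane representation (point
# plus normal) is an assumption for the purpose of this example.
import numpy as np

def pointed_position(device_pos, direction, plane_point, plane_normal):
    """Return the 3-D point where the pointing axis meets the display
    plane, or None if the axis is parallel to the plane or the display
    lies behind the device."""
    p0 = np.asarray(device_pos, dtype=float)
    d = np.asarray(direction, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None  # pointing axis parallel to the display plane
    t = ((np.asarray(plane_point, dtype=float) - p0) @ n) / denom
    if t < 0:
        return None  # display plane is behind the device
    return p0 + t * d
```

For a device at the origin pointing along the z-axis towards a display in the plane z = 2, the computed position is (0, 0, 2).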

In an alternative setup, the controller 11 of the host system only provides the position in the displayed information DIl at which the mobile device MDl points. The indication CS3 of this position in the displayed information DIl is performed, for example, in the output device ODl. In this setup, the controller 11 need not supply the information to be displayed.

The system may operate according to the following three-step protocol. In the first step, the user holds the mobile device MDl in the immediate vicinity of the NFC unit 10 of the communication terminal CTl to start a log-on protocol.

The host system authenticates the mobile user by means of his mobile device MDl which, for example, is a mobile phone. During the authentication phase, the communication terminal initiates a secure and unique communication path CP2 between the communication terminal CTl and the mobile device MDl. User interaction may be required during authentication; for example, it might be required to input a password or a user number. Once the user is logged on, the communication terminal can retrieve necessary information either from data collected in the mobile device MDl, or stored in the communication terminal, or retrieved, for example, from the internet to allow further options such as, for example, billing in case of sales items, or personalization of the displayed information DIl.

In the optional second step, the communication path is changed from the NFC path CPl to the longer range communication path CP2. Such a transfer from NFC to, for example, Bluetooth communication can be performed seamlessly, in a way transparent to the user. The user need not be actively involved in this communication path transfer.

In a third step, the pointing function is activated. The electronic compass may comprise a 2D sensor, which has to be kept horizontal to obtain correct heading or azimuth information, or may comprise a tilt-compensated 3D sensor which provides both azimuth and elevation information. Because the position of the communication terminal CTl in the system is known, the position of the mobile device MDl is assumed to be identical to the position of the communication terminal CTl. The mobile device may further comprise a GPS receiver to determine its position. This is especially relevant, for example, if the position of the communication terminal CTl is not known in the system, or if the mobile device MDl is moved away from the communication terminal CTl and a high accuracy of the actual position of the mobile device MDl is relevant. To improve the correct positioning of the mobile device MDl, an extra indication can be given to the user to make clear where he should hold the mobile device MDl, for example by a marked region on the floor close to the communication terminal CTl. Alternatively, the position of the mobile device MDl may be estimated by the system by monitoring the average pointing direction during movement of the mobile device MDl while it points at substantially the same position, area or output device. However, if the user wants to change between output devices, e.g. from a first output device to a second output device, this should be detected by the system; otherwise a change in orientation of the mobile device MDl may still be interpreted as a change in the pointed-at position in the first output device. Such an automatic detection may be obtained if the user holds his mobile device MDl in the vicinity of a communication terminal CTl, CT2, CT3 to establish an NFC link. Alternatively, the user may indicate such an event by activating a particular user input, such as a button.
Monitoring a (moving) average pointing direction is only valid for a single output device or, in a more relaxed condition, for an ensemble of spatially disconnected output devices within a limited solid angle. In this case, the host system provides a pointer function for the multitude of output devices. An example would be a collection of displays on a wall, whereby the user can seamlessly roam over the different displays while pointing with his mobile phone, including the concept of monitoring the average pointing direction.
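The azimuth and elevation angles delivered by the compass sensor discussed above can be converted into a pointing direction vector for the geometric computations. The sketch below assumes a local East-North-Up frame with azimuth measured clockwise from North; the frame convention and names are illustrative assumptions.

```python
# Illustrative sketch: convert compass azimuth (clockwise from North)
# and elevation into a unit pointing vector in an assumed local
# East-North-Up coordinate frame. A 2D compass held horizontally
# yields elevation 0; a tilt-compensated 3D sensor supplies both angles.
import math

def pointing_vector(azimuth_deg, elevation_deg=0.0):
    """Return the unit pointing vector (east, north, up)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    east = math.cos(el) * math.sin(az)
    north = math.cos(el) * math.cos(az)
    up = math.sin(el)
    return (east, north, up)
```

For example, azimuth 0° with elevation 0° gives the vector (0, 1, 0), pointing due North.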

A user may interact with the output device ODl in the system by pointing towards it. If the output device provides displayed information DIl, the pointed at position can be visualized. A button of the mobile device MDl can be used as a click function.

Alternatively, tilt or rotation around the pointing axis of the mobile device MDl may be used as an analog or a digital input. The pointing function need not be limited to a single output device ODl, but may have a spatial coverage including multiple output devices ODl, OD2, OD3. The positions of the elements of the system have to be known. All positional information may be predefined in the controller 11 or may be entered into the controller 11 in a system calibration step. Preferably, the positional information is stored in a database format in the controller 11. The pointed-at output device, or a pointed-at area of an output device, can be calculated by using the collection of positional information of the elements of the system, the assumed or real position of the mobile device MDl, and the orientation of the mobile device MDl.
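Such a calculation of the pointed-at output device from the positional database, the assumed device position, and the device orientation could, for example, look as follows. The database format and the angular acceptance threshold are illustrative assumptions.

```python
# Illustrative sketch: look up which output device the mobile device
# points at, given a database {name: position} of device positions.
# The 10-degree acceptance cone is an assumed design parameter.
import math

def pointed_output_device(device_pos, pointing_dir, device_positions,
                          max_angle_deg=10.0):
    """Return the name of the output device whose bearing from the
    device position best matches the pointing direction, or None if
    none lies within max_angle_deg of the pointing axis."""
    def unit(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    p = unit(pointing_dir)
    best, best_angle = None, max_angle_deg
    for name, pos in device_positions.items():
        to_dev = unit(tuple(a - b for a, b in zip(pos, device_pos)))
        cosang = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, to_dev))))
        angle = math.degrees(math.acos(cosang))
        if angle <= best_angle:
            best, best_angle = name, angle
    return best
```

With devices at (5, 0, 0) and (0, 5, 0) and the user at the origin, pointing along the x-axis selects the first device, while pointing straight up selects nothing.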

Alternatively, a calibration procedure can be performed using a dedicated pointing device from at least one reference position to allow filling or updating of the database of calibrated pointing directions of the output devices ODl to OD3, with the position of the pointing device as origin. The combination of at least two reference positions and the corresponding pointing directions of the pointing device enables the system to calculate the 3D location of the output devices ODl to OD3 relative to the pointing device by using a triangulation or multi-angulation principle. Because the initial user location is implicitly provided by the position of the NFC terminal CTl when the user logs on, no additional calibration by the user is required during log-on.
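The triangulation principle mentioned above can be sketched as finding the midpoint of the shortest segment between the two pointing rays defined by the reference positions and their pointing directions. This is a minimal illustrative implementation; names and the exact formulation are assumptions.

```python
# Illustrative triangulation sketch: estimate a target's 3-D location
# from two reference positions p1, p2 and the pointing directions d1, d2
# measured there, as the midpoint of the shortest segment between the
# two pointing rays (least-squares intersection of two skew lines).
import numpy as np

def triangulate(p1, d1, p2, d2):
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        raise ValueError("pointing directions are parallel")
    # Ray parameters minimising |(p1 + t1*d1) - (p2 + t2*d2)|
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return (p1 + t1 * d1 + p2 + t2 * d2) / 2.0
```

For example, pointing at a target at (0, 0, 5) from the origin (direction (0, 0, 1)) and from (3, 0, 5) (direction (-1, 0, 0)) recovers the target location exactly.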

The use of the mobile device MDl which has a pointer function has the advantage that a single NFC terminal CTl allows the user to select between several items of the displayed information DIl. The pointing approach is easier to use than a numbered input. However, numbered input can be provided simultaneously, for example for mobile devices MDl which do not have the pointing function. The pointing function allows for more design flexibility of the system because the mobile device MDl points at the displayed information DIl without the need for close contact with the area where the displayed information DIl is displayed. Furthermore, thanks to the increased distance between the area where the displayed information DIl is displayed and the user, only the communication terminal CTl needs to be vandalism-proof.

It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

CLAIMS:
1. A system comprising:
- a communication terminal (CTl) having a near field communication unit (10) for near field communication with a mobile device (MDl) when in range of the near field communication unit (10),
- an output device (ODl; OD2) positioned outside the range of the near field communication unit (10) for providing output to a user of the mobile device (MDl), and
- a controller (11) for controlling the output device (ODl; OD2) to provide the output dependent on an orientation (Dl) of the mobile device (MDl).
2. A system as claimed in claim 1, wherein the output device (ODl; OD2) is a display apparatus (ODl) for displaying display information (DI) as displayed information (DIl), and wherein the controller (11) is arranged for receiving via the communication terminal (CTl) orientation information (SD) on the orientation (Dl) of the mobile device (MDl), and is constructed for generating a visual indicator (30) indicating a position in the displayed information (DIl) dependent on the orientation (Dl) of the mobile device (MDl).
3. A system as claimed in claim 2, wherein the visual indicator (30) indicates a position in the displayed information (DIl) defined by a pointing direction (Dl) of the mobile device (MDl) to a particular area of the displayed information.
4. A system as claimed in claim 2 or 3, wherein the controller (11) is constructed for retrieving a personal identifier (PI) of the mobile device (MDl), and for allocating to the visual indicator (30) an appearance which uniquely distinguishes it from a further visual indicator (31) associated with another mobile device (MD2) communicating with the communication terminal (CTl) or with another communication terminal (CT2) of the system.
5. A system as claimed in claim 1, wherein the output device (ODl; OD2) is a sound transducer (OD2), and wherein the controller (11) is arranged for receiving from the communication terminal (CTl) information on the orientation (Dl) of the mobile device (MDl), and is constructed for activating the sound transducer (OD2) dependent on the orientation (Dl) of the mobile device (MDl).
6. A system as claimed in claim 5, wherein the controller (11) is constructed for switching on or off the sound transducer (OD2) when the mobile device (MDl) is pointing towards the sound transducer (OD2).
7. A system as claimed in claim 3 or 6, wherein the controller (11) is constructed for determining a pointing direction (Dl) of the mobile device (MDl) when a communication is established with the near field communication unit (10) based on the position of the near field communication unit (10) and the orientation of the mobile device (MDl).
8. A system as claimed in any one of the preceding claims, further comprising a further communication unit (12) for communication with the mobile device (MDl), a range of the further communication unit (12) being larger than the range of the near field communication unit (10).
9. A system as claimed in claim 8, wherein the controller (11) is constructed for switching over communication with the mobile device (MDl) from the near field communication unit (10) to the further communication unit (12) after a near field communication is established.
10. A system as claimed in claim 9, wherein the controller (11) is constructed for estimating a changing position of the mobile device (MDl) by monitoring an average orientation or an average pointing direction of the mobile device (MDl) while changing its position during pointing towards the output device (ODl).
11. A system as claimed in any one of the claims 1 to 10, further comprising the mobile device (MDl) having a near field communication unit (20) for near field communication with the near field communication unit (10) of the communication terminal (CTl).
12. A system as claimed in claim 11 when dependent on claim 8, wherein the mobile device (MDl) further comprises a further communication unit (21) for communication with the further communication unit (12), a range of the further communication unit (21) of the mobile device (MDl) being larger than the range of the near field communication unit (20) of the mobile device (MDl).
13. A system as claimed in claim 11 or 12, wherein the mobile device (MDl) comprises an orientation sensor (22) for indicating the orientation (Dl) of the mobile device (MDl).
14. A system as claimed in claim 13, wherein the orientation sensor (22) is an electronic compass.
15. A system as claimed in any one of the claims 11 to 14, wherein the mobile device is or comprises a mobile phone or a PDA.
16. A method of communicating in a system comprising a communication terminal (CTl) having a near field communication unit (10) for near field communication with a mobile device (MDl) when in range of the near field communication unit, and an output device (ODl) positioned outside the range of the near field communication unit (10) for providing output to a user of the mobile device (MDl), the method comprising controlling (11) the output device (ODl) to provide the output dependent on an orientation (Dl) of the mobile device (MDl).
17. A mobile device suitable for use in a system as claimed in claim 1, comprising:
- means for near field communication with a near field communication unit of a communication terminal when in the range of the near field communication unit; and
- an orientation sensor for indicating the orientation of the mobile device.
PCT/IB2007/052293 2006-06-23 2007-06-15 Nfc enabled pointing with a mobile device WO2008007260A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP06115937 2006-06-23
EP06115937.2 2006-06-23

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP07825820A EP2078238A2 (en) 2006-06-23 2007-06-15 Nfc enabled pointing with a mobile device

Publications (2)

Publication Number Publication Date
WO2008007260A2 true WO2008007260A2 (en) 2008-01-17
WO2008007260A3 WO2008007260A3 (en) 2008-05-15

Family

ID=38923621

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2007/052293 WO2008007260A2 (en) 2006-06-23 2007-06-15 Nfc enabled pointing with a mobile device

Country Status (5)

Country Link
EP (1) EP2078238A2 (en)
KR (1) KR20090023735A (en)
CN (1) CN101479689A (en)
TW (1) TW200810395A (en)
WO (1) WO2008007260A2 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294841A (en) * 2012-03-05 2013-09-11 北京千橡网景科技发展有限公司 Computing method and system for information collecting
TWI561022B (en) * 2012-12-17 2016-12-01 Hon Hai Prec Ind Co Ltd Nfc terminal and power saving method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030063065A1 (en) * 2001-09-11 2003-04-03 Samsung Electronics Co., Ltd. Pointer control method, pointing apparatus, and host apparatus therefor
US20040095317A1 (en) * 2002-11-20 2004-05-20 Jingxi Zhang Method and apparatus of universal remote pointing control for home entertainment system and computer
EP1501038A1 (en) * 2003-07-22 2005-01-26 Sony Corporation Communication apparatus
US20050270494A1 (en) * 2004-05-28 2005-12-08 Banning Erik J Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8862052B2 (en) 2008-05-19 2014-10-14 Nxp, B.V. NFC mobile communication device and NFC reader
WO2009141764A3 (en) * 2008-05-19 2010-04-22 Nxp B.V. Nfc mobile communication device and nfc reader
CN102037499A (en) * 2008-05-19 2011-04-27 Nxp股份有限公司 NFC mobile communication device and NFC reader
US9607192B2 (en) 2008-05-19 2017-03-28 Nxp B.V. MIFARE push
WO2009141764A2 (en) * 2008-05-19 2009-11-26 Nxp B.V. Nfc mobile communication device and nfc reader
US8200246B2 (en) 2008-06-19 2012-06-12 Microsoft Corporation Data synchronization for devices supporting direction-based services
US9200901B2 (en) 2008-06-19 2015-12-01 Microsoft Technology Licensing, Llc Predictive services for devices supporting dynamic direction information
US8700301B2 (en) 2008-06-19 2014-04-15 Microsoft Corporation Mobile computing devices, architecture and user interfaces based on dynamic direction information
US8700302B2 (en) 2008-06-19 2014-04-15 Microsoft Corporation Mobile computing devices, architecture and user interfaces based on dynamic direction information
US10057724B2 (en) 2008-06-19 2018-08-21 Microsoft Technology Licensing, Llc Predictive services for devices supporting dynamic direction information
US8467991B2 (en) 2008-06-20 2013-06-18 Microsoft Corporation Data services based on gesture and location information of device
US9703385B2 (en) 2008-06-20 2017-07-11 Microsoft Technology Licensing, Llc Data services based on gesture and location information of device
US8868374B2 (en) 2008-06-20 2014-10-21 Microsoft Corporation Data services based on gesture and location information of device
US9661468B2 (en) 2009-07-07 2017-05-23 Microsoft Technology Licensing, Llc System and method for converting gestures into digital graffiti
US8243097B2 (en) 2009-10-21 2012-08-14 Apple Inc. Electronic sighting compass
US8395486B2 (en) 2010-08-27 2013-03-12 Q Street, LLC System and method for interactive user-directed interfacing between handheld devices and RFID media
US8068011B1 (en) 2010-08-27 2011-11-29 Q Street, LLC System and method for interactive user-directed interfacing between handheld devices and RFID media
US9858455B2 (en) 2010-08-27 2018-01-02 Q Street, LLC System and method for interactive user-directed interfacing between handheld devices and RFID media
CN103970263A (en) * 2013-02-04 2014-08-06 原相科技股份有限公司 Wireless peripheral device with multi-transmission capability

Also Published As

Publication number Publication date
TW200810395A (en) 2008-02-16
CN101479689A (en) 2009-07-08
EP2078238A2 (en) 2009-07-15
KR20090023735A (en) 2009-03-05
WO2008007260A3 (en) 2008-05-15

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780023460.4

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2007825820

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2009516036

Country of ref document: JP

NENP Non-entry into the national phase in:

Ref country code: DE

NENP Non-entry into the national phase in:

Ref country code: RU

WWE Wipo information: entry into national phase

Ref document number: 1020097001516

Country of ref document: KR