WO2010131067A1 - Ubiquitous interface device split - Google Patents

Ubiquitous interface device split Download PDF

Info

Publication number
WO2010131067A1
WO2010131067A1 (application PCT/IB2009/006047)
Authority
WO
WIPO (PCT)
Prior art keywords
hid
duc
user
controller
output
Prior art date
Application number
PCT/IB2009/006047
Other languages
French (fr)
Inventor
Mark Doll
Wolfgang Templ
Michael Tangemann
Original Assignee
Alcatel Lucent
Priority date
Filing date
Publication date
Application filed by Alcatel Lucent filed Critical Alcatel Lucent
Priority to PCT/IB2009/006047 priority Critical patent/WO2010131067A1/en
Publication of WO2010131067A1 publication Critical patent/WO2010131067A1/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0383Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0384Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface

Definitions

  • the invention relates to man-machine interfaces.
  • the invention relates to a man-machine interface which provides a ubiquitous communication between a user and a plurality of controlled devices.
  • human interfaces typically provide input capabilities which allow a user to provide information and/or control data to a so-called device under control (DUC), e.g. a TV receiver or a DVD player, as well as output capabilities which provide feedback or recommendations from the DUC to the user.
  • Examples of input interfaces comprise "learning remote controls" which are programmed either manually or automatically by receiving from the device under control the DUC's infrared or Bluetooth "command to function" mappings, which may possibly be accompanied by a suitable button layout for the remote control.
  • Another example for such input interfaces are configurable user interfaces which are based e.g. on soft keys.
  • Examples for output interfaces are feedback screens on a TV receiver or beep tones from a mobile telephone.
  • the interface should be applicable to practically all device control tasks, e.g. to control machines, one's PC/Desktop, consumer electronics or appliances in a smart home.
  • Examples of human interfaces, which are also referred to as human interface devices (HIDs), are gesture recognizing wrist bands, collars detecting sub-vocalization, retina projectors, in-ear speakers, gaze detecting barrettes or even brain computer interfaces.
  • HIDs are particularly well suited for the design of wearable unobtrusive wireless interfaces.
  • the present patent description provides a solution to the above mentioned problems known from the prior art.
  • a basic idea of this solution is to split the interface, i.e. the HID, from the device under control, i.e. the DUC. By doing this, the interface can be permanently customized and adapted to the user's preferences, habits, capabilities etc.
  • By means of middleware which handles the communication between the interface and the device, it is possible to allow arbitrary interfaces to control arbitrary devices.
  • In order to allow a large variety of HIDs to communicate with and control a large variety of DUCs, it is preferable that such middleware be standardized.
  • a first system providing ubiquitous interfacing between a user and a device under control (DUC) is described.
  • the system comprises a human interface device (HID) operable to detect a user input from the user and to transmit the user input to a HID controller.
  • the HID senses the user input without performing additional processing such as a translation of the user input into a command which is executable by the DUC.
  • the HID does not perform an interpretation of the user input, notably an interpretation with regards to an intended DUC command.
  • HIDs or HID components are a wearable unobtrusive wireless interface, a microphone, a speaker, a display, a keyboard, a mouse, a gesture recognizing wrist band, a collar detecting sub-vocalization, a retina projector, an in-ear speaker, a gaze detecting barrette, a body area network comprising a common relay device for communication with the HID controller, a Graphical User Interface executed on an electronic device, a smart phone or a brain computer interface.
  • DUCs are e.g. machines, personal computers, consumer electronics and smart home components, such as light dimmers, electrical stores, kitchen appliances, etc.
  • an HID may be implemented as a graphical user interface (GUI) on a smart phone or a consumer device such as an MP3 player.
  • Such a graphical user interface may be used to select a certain DUC, e.g. a stereo system to which a user input is directed.
  • the GUI provides a set of possible user inputs which are predefined by the user and which correspond to the user's input preferences.
  • a volume control may be represented as a rotary button that is turned on the touch screen of the smart phone in order to increase the volume.
  • the same rotary button may also be shown in conjunction with the control of light dimmers in a room or with the control of the volume of a TV receiver.
  • the rotary button may be the preferred user input to indicate a smooth increase/decrease of the controlled value.
  • such user input will control a different parameter or capability of the DUC.
  • the first system further comprises the HID controller which is physically separate from the HID and the DUC.
  • the HID controller is not co-located with the HID and the DUC, but preferably situated at a home server or a base station.
  • the HID controller may be implemented on a residential base station such as a WLAN access point or a femto base station.
  • Preferably, such access points are standardized HW (hardware) platforms or standardized execution environments, so that the HID controller may be implemented as HID software running on such a standardized HW platform.
  • the HID controller is operable to receive the user input from the HID, relate and translate the user input to a corresponding DUC command and transmit the DUC command to the DUC for execution.
  • the HID controller performs the interpretation of the bare user input and identifies the meaning of the user input which was intended by the user. Based on such interpretation a DUC command is generated and passed to the DUC.
  • the generation of the DUC command may be done by mapping the user input to a DUC command using a correspondence table.
  • a user input signaling a partial turn of the button may be received at the HID controller.
  • the HID controller may have knowledge, e.g. from a preceding selection of the DUC by the user, of which DUC the user input is directed at.
  • If the user input is directed at the stereo system, the HID controller maps the user input, i.e. the partial turn of the button, to a DUC command, i.e. to a command indicating to the stereo system that the volume is to be increased by a certain amount. If, on the other hand, the HID controller is informed that the user input relates to another DUC, e.g. the light dimmer, then the HID controller translates the user input into a command instructing the light dimmer to increase the light by a certain amount.
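The translation step above can be sketched as a simple correspondence-table lookup. All identifiers (DUC names, input names, command names) are invented for this example; the patent only specifies that a per-DUC mapping from user input to DUC command exists.

```python
# Illustrative sketch of the HID controller's correspondence-table
# translation; all names are assumptions, not taken from the patent.

# One table per DUC: the same bare user input maps to a different
# DUC command depending on which DUC the input is directed at.
COMMAND_TABLES = {
    "stereo_system": {"rotary_turn": "SET_VOLUME_DELTA"},
    "light_dimmer":  {"rotary_turn": "SET_BRIGHTNESS_DELTA"},
}

def translate(duc_id, user_input, magnitude):
    """Map a bare user input (e.g. a partial turn of the rotary button)
    to a command executable by the selected DUC."""
    command = COMMAND_TABLES[duc_id][user_input]
    return {"duc": duc_id, "command": command, "value": magnitude}

# The same gesture controls a different parameter on each DUC:
volume_cmd = translate("stereo_system", "rotary_turn", 0.25)
light_cmd = translate("light_dimmer", "rotary_turn", 0.25)
```

The HID itself never sees these tables; it only reports the bare input, and the controller resolves the intended DUC command.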
  • the first system may also comprise the DUC which is controlled.
  • the DUC is operable to receive the DUC command and to execute the DUC command.
  • the system may comprise elements of the system providing DUC output to a user, as outlined below.
  • the DUC, the HID and the HID controller are physically separate.
  • the HID, the HID controller and the DUC may be connected via a communication infrastructure, such as the Internet.
  • a portable/wearable HID is typically connected via a wireless access infrastructure, such as a WLAN, a Bluetooth, a WiMAX, or an IEEE 802.16m network, while a stationary HID, e.g. a web camera used for gesture recognition, may be wire-bound.
  • a second system providing ubiquitous interfacing between a DUC and a user is described.
  • the system provides a DUC output to the user in a way that is in accordance with user preferences.
  • this system is provided in combination with the system of providing a user input to a DUC, as outlined above.
  • the second system comprises an HID controller which is physically separate from an HID and the DUC and which is operable to receive a DUC output from the DUC, to determine a user output corresponding to the DUC output, to translate the DUC output into the corresponding user output and to transmit the user output to the HID.
  • the HID controller arranges/structures and transforms the DUC output according to the user's preferences.
  • the user may wish to receive the feedback from a DUC, e.g. his stereo system or TV receiver, on the touch screen of his smart phone.
  • A TV receiver might send interactive menu information to the user which is transformed by the HID controller into a user output adapted for output on the smart phone of the user.
  • Such user output may be based on a certain graphical user interface adapted to the user's preferences.
  • the second system further comprises the HID which is operable to receive the user output and to render the user output to the user.
  • the smart phone is able to provide the user output to the user, e.g. by displaying the interactive TV receiver menu on the touch screen, or by providing sound alerts, etc.
  • the system may further comprise the DUC which is operable to generate the DUC output and to transmit the DUC output to the HID controller.
  • the system may also be provided in combination with a system for providing user input to a DUC.
  • the interactive menu displayed on the touch screen on the smart phone may comprise soft buttons which are used to control the DUC.
  • the above mentioned systems may further comprise a storage entity which is operable to store a user profile comprising a mapping between the user input and the corresponding DUC command and/or a mapping between the DUC output and the corresponding user output.
  • the systems may comprise a storage entity which provides a central repository for the translation functionality of the HID controller.
  • the storage entity may also be used as a back-up system for the HID controller.
  • the storage entity may store user profiles which are used by the HID controller to perform the translation between user preferred input/output and DUC command/outputs. It may also store the user specific HID software which comprises the user profiles and also the specific logic for communication with the HID.
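A minimal sketch of such a stored user profile follows. The patent only requires the two mappings (user input to DUC command, DUC output to user output); the concrete key-value layout and all names are assumptions of this illustration.

```python
# Illustrative structure of a user profile as held by the storage
# entity; layout and identifiers are invented for this sketch.
user_profile = {
    "user": "alice",  # hypothetical identifier
    "input_map": {    # (DUC, user input) -> DUC command
        ("stereo_system", "rotary_turn"): "SET_VOLUME_DELTA",
        ("tv_receiver", "rotary_turn"): "SET_VOLUME_DELTA",
    },
    "output_map": {   # (DUC, DUC output) -> user output
        ("tv_receiver", "interactive_menu"): "touchscreen_menu",
        ("stereo_system", "track_changed"): "sound_alert",
    },
}

def to_duc_command(profile, duc, user_input):
    """Translation used on the input path (HID -> DUC)."""
    return profile["input_map"][(duc, user_input)]

def to_user_output(profile, duc, duc_output):
    """Translation used on the output path (DUC -> HID)."""
    return profile["output_map"][(duc, duc_output)]
```

Because the profile is plain data, any HID controller that downloads it from the storage entity can perform the same translations.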
  • the central repository functionality of the storage entity may be used to provide the user profile and/or the HID software to various different HID controllers and/or HW platforms/execution environments.
  • Such a different HID controller or HW platform may be operable to download the user profile or HID software from the storage entity and perform the translation based on the user profile or by executing the HID software.
  • any standardized HW platform, e.g. a base station, may be transformed into a user specific HID controller, thereby allowing a user to connect his HID to an arbitrary base station.
  • network coverage for a user's HID may be provided.
  • the systems comprise a plurality of HID controllers or a plurality of standardized HW platforms which may take on the function of a user and/or HID specific HID controller by downloading the user profile and/or HID software from a central repository.
  • the HID is operable to register with a first of the plurality of HID controllers in order to transmit the user input or receive the user output.
  • the HID may select an appropriate HID controller which is within wireless reach and establish a communication with this HID controller, in order to send a user input to a DUC or to receive a user output from the DUC.
  • the HID may register with the HID controller that provides the best reactivity of the HID to the user, e.g. the HID controller that has the lowest latency for communication between the HID and the HID controller and/or that has the highest available computing power.
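The registration choice described above, lowest latency first and highest available computing power as a tie-breaker, can be sketched as follows; the field names are assumptions.

```python
# Sketch of the HID's controller selection among reachable controllers:
# lowest latency wins, ties broken by highest available computing power.
def select_controller(controllers):
    return min(controllers, key=lambda c: (c["latency_ms"], -c["cpu_score"]))

reachable = [
    {"id": "living_room_bs", "latency_ms": 4, "cpu_score": 10},
    {"id": "kitchen_bs", "latency_ms": 2, "cpu_score": 7},
]
best = select_controller(reachable)  # kitchen_bs: lowest latency
```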
  • the network of HID controllers or standardized HW platforms/ standardized execution environments may also provide seamless handover when an HID moves from one HID controller to another HID controller. As the HID moves from one location to another, the HID is operable to register with a second of the plurality of HID controllers and the first and second HID controllers are operable to perform a smooth/seamless handover.
  • Such seamless handover from the first HID controller to the second HID controller may be implemented by passing the user profile and/or the HID software from the first base station to the second base station and thereby provide a copy of the first HID controller at the second location.
  • the handover may also be implemented by making use of the storage entity which comprises a back-up copy of the user profile and/or HID software.
  • the second base station may download the HID software from the storage entity and thereby prepare the second HID controller. By doing this, only a reduced amount of data, e.g. only the latest state or user profile updates, needs to be transferred from the first HID controller to the second HID controller at the actual moment of handover.
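The prepared handover can be sketched as follows: the second controller pre-loads the backed-up HID software, so only a small state delta crosses at the actual moment of handover. Class and field names are assumptions of this sketch, and the storage entity is modelled as a plain dictionary.

```python
# Sketch of a prepared handover between HID controllers; all names
# are illustrative, not from the patent.
class HIDController:
    def __init__(self, storage_entity):
        self.storage_entity = storage_entity  # back-up repository
        self.software = None
        self.state = {}

    def prepare(self, hid_id):
        """Pre-load the HID software and last backed-up state."""
        backup = self.storage_entity[hid_id]
        self.software = backup["software"]
        self.state = dict(backup["state"])

    def receive_delta(self, delta):
        """Apply only the latest state updates at handover time."""
        self.state.update(delta)

storage_entity = {"hid-42": {"software": "hid-sw-v1",
                             "state": {"volume": 5}}}
second_controller = HIDController(storage_entity)
second_controller.prepare("hid-42")             # done before handover
second_controller.receive_delta({"volume": 7})  # tiny transfer at handover
```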
  • the DUC is operable to execute a plurality of DUC commands and/or the DUC is operable to generate a plurality of DUC outputs.
  • the plurality of DUC commands and/or DUC outputs define a set of capabilities of the DUC.
  • the HID may be operable to detect a plurality of different user inputs and/or the HID is operable to render a plurality of different user outputs.
  • the plurality of user inputs and/or user outputs defines a set of capabilities of the HID.
  • the HID controller may be operable to provide a mapping between a subset of capabilities of the HID and a subset of capabilities of the DUC.
  • the HID controller provides a translation for a basic capability set which comprises the set of joint capabilities of the HID and the DUC.
  • the set of capabilities of the DUC and/or the set of capabilities of the HID may be organized using a hierarchical naming scheme, where similar capabilities are grouped in equivalence classes.
  • the equivalence classes may use the concept of inheritance comprising a parent capability and at least one child capability associated with the parent capability.
  • a parent capability may be a fast forward operation. Possible child operations may be different speeds of fast forward operations.
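One conceivable encoding of this hierarchical scheme uses dotted names, where a child capability extends its parent's name; the names below are invented for illustration.

```python
# Illustrative dotted-name hierarchy of capabilities; the naming
# scheme itself is an assumption, the patent only requires a
# hierarchical parent/child organization.
CAPABILITIES = {
    "media.playback.fast_forward",      # parent capability
    "media.playback.fast_forward.x4",   # child capabilities:
    "media.playback.fast_forward.x8",   # different fast-forward speeds
    "media.playback.fast_forward.x16",
}

def children_of(parent, capabilities):
    """All capabilities inheriting from the given parent capability."""
    return {c for c in capabilities if c.startswith(parent + ".")}

speeds = children_of("media.playback.fast_forward", CAPABILITIES)
```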
  • an HID controller which provides ubiquitous interfacing between an HID and a DUC.
  • the HID controller is physically separate from the HID and the DUC and it is operable to receive from the HID a user input. Such user input has been detected by the HID as outlined above.
  • the HID controller relates and translates the user input to a corresponding DUC command and transmits the DUC command to the DUC for execution.
  • the HID controller may be operable to receive a DUC output from the DUC and to determine a user output corresponding to the DUC output. It may further translate the DUC output into the corresponding user output and send the user output to the HID for rendering to a user.
  • the HID controller may comprise the aspects outlined in conjunction with the system and method of the present document.
  • a method for controlling a DUC by a user comprises the steps of detecting a user input at a HID and transmitting the user input to a HID controller which is physically separate from the HID and the DUC. The method proceeds in receiving the user input at the HID controller, as well as in relating and translating the user input to a corresponding DUC command. Then, the DUC command is transmitted for execution to the DUC. Finally, the DUC may receive and execute the DUC command. Furthermore, a method for providing output from a DUC to a user is described. The method comprises the steps of receiving a DUC output generated by the DUC at a HID controller.
  • a user output corresponding to the DUC output is determined and the DUC output is translated into the corresponding user output.
  • the method proceeds in transmitting the user output to a HID which is physically separate from the HID controller. Then, the user output is received at the HID and rendered to the user.
  • Fig. 1 illustrates the communication between a human interface device and a device under control according to an embodiment of the invention
  • Fig. 2 shows a flow diagram describing an embodiment of the communication between HID, HID controller and DUC;
  • Fig. 3 illustrates an embodiment of the registration and handover process of an HID.
  • the human interface device senses the human input and/or renders the output for the human without further processing and/or interpretation of the user input and/or user output. It thereby only requires low computational power and consumes little energy. By consequence, the HID can be made small, wearable and unobtrusive.
  • the HID sends input and/or receives output via a standardized energy efficient medium range, i.e. in the order of tens of meters, wireless access infrastructure.
  • a possible wireless access infrastructure is a power reduced IEEE 802.16m, i.e. evolved WiMAX, access network in a femto cell deployment that connects the HID to the communication infrastructure, e.g. to the Internet.
  • other wireless communication infrastructures may be used, such as power-controlled WLAN, Bluetooth or LTE Advanced.
  • the HID may be part of a body area network comprising a separate human interface device and a separate component for communication with the base station.
  • the HID only requires ultra low power near field, i.e. in the order of tens of centimeters, communication, e.g. ultra wide band (UWB).
  • a slightly larger wearable relay device or communication component then interfaces the body area network with the medium range wireless access infrastructure.
  • The HID communicates with an HID controller, which may be, for example, the femto base station of the wireless access infrastructure or a home server.
  • Such an HID controller does not suffer from the severe restrictions on size, energy and computational power that limit the HID. Therefore, such HID controllers may enable a good adaptation to the user preferences through the use of complex adaptation algorithms.
  • Such an algorithm may recognize and re-adjust recognition parameters as the user changes his particular way to control DUCs.
  • the HID controller may also evaluate usage patterns in order to identify the preferences of a user to provide commands to a DUC and/or to receive feedback from a DUC.
  • the HID controller may perform the actual translation between the perceived user input and the intended command to the DUC.
  • the HID controller may rearrange and/or filter the displayed output content according to the preferences, limits and/or guidelines set out by the user.
  • the HID controller is positioned close to the HID itself, e.g. directly within a femto base station.
  • the physical proximity between the HID and the HID controller keeps communication latencies low and makes the HID as reactive as if the HID controller and its HID software were executed directly within the HID itself.
  • the HID controller which typically comprises a software program that interprets input and/or renders the output, is directly related to the particular HID used by the user. In other words, the HID controller is directly associated with the HID and may therefore be adapted to its technical characteristics.
  • the HID controller may comprise software which allows for the communication between the HID and the HID controller and which performs the interpretation of the HID inputs and the rendering of the HID outputs.
  • HID software may be downloaded from a storage entity into the HID controller for execution in its standardized runtime environment.
  • the storage entity may be of any appropriate form, e. g. centralized on a server like a home location register or distributed as a peer-to-peer network implementing a distributed hash table like Chord. The latter also allows the HID controller to be part of the storage entity itself.
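The peer-to-peer storage variant can be illustrated with a toy successor lookup on a hash ring, in the spirit of Chord; this omits finger tables, node joins and replication, and all identifiers are invented.

```python
import hashlib

# Toy hash-ring successor lookup (Chord-style): the node whose ring
# position first follows the key's position stores that key.
def ring_position(key, ring_bits=16):
    digest = hashlib.sha1(key.encode()).hexdigest()
    return int(digest, 16) % (2 ** ring_bits)

def responsible_node(key, node_positions):
    """Return the ring position of the node storing the key."""
    pos = ring_position(key)
    nodes = sorted(node_positions)
    for node in nodes:
        if node >= pos:
            return node
    return nodes[0]  # wrap around the ring

# The HID software for a given HID always resolves to the same node:
nodes = [1024, 20000, 50000]
owner = responsible_node("hid-42", nodes)
```

In the full scheme, an HID controller could itself be one of these nodes, making it part of the storage entity as the passage notes.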
  • the execution environment of the HID controller should allow the HID software to communicate with its HID and with the device that the user wishes to control through this HID, i.e. the DUC.
  • the execution environment of the HID controller may additionally allow the HID software to communicate directly with the storage entity for performing a back up of its internal state, e. g. for performing a back up of adaptations to user preferences or newly added control and/or output capabilities.
  • the DUC status may be stored in the HID controller and possibly backed up in the storage entity.
  • mobility may be provided through a network of base stations, wherein each base station comprises an HID controller.
  • When a user moves or travels, he/she may move away from the base station and the HID controller that his/her HID is currently connected to. Eventually another base station and HID controller may become the nearest, and it would be desirable that the HID can connect to the now nearest base station.
  • the HID software including its current internal state, may be transferred from the current HID controller to the new HID controller, in order to keep communication latency between HID and its HID controller and the respective HID software small.
  • a continuous network of HID controllers associated to the respective HID can be established.
  • This will provide an enlarged coverage, as well as improved connectivity between the HID and the associated HID controller and/or HID software.
  • locating the nearest HID controller and transferring HID software can be integrated into the handover process. Otherwise, a standardized HID controller location discovery and context transfer protocol may need to be implemented by the base stations and HID controllers.
  • An example of the man-machine interface architecture 100 according to the invention is illustrated in Fig. 1.
  • a user 101 uses a preferred HID 102.
  • This HID 102 may for example be a microphone through which the user 101 may provide voice input.
  • the user 101 may use a small speaker as an output HID 102 in order to receive feedback from devices under control (DUCs).
  • the HID 102 connects with a base station 105 which is associated with an HID controller 103 running on a possibly standardized runtime environment.
  • the HID controller 103 comprises HID software 104 which communicates with the HID 102 via a HID communication protocol 130.
  • the physical communication between the HID 102 and the base station 105 is performed by a wireless access communication protocol 121, such as WLAN or WiMAX.
  • the base station 105 may for example be positioned in the user's living room. As the user moves in a direction indicated by the arrow 120, another base station 108, e.g. provided in the kitchen, may come into reach and may eventually be closer than the initial base station 105. In such a case, a wireless connection 122 may be established between the HID 102 and the base station 108.
  • a handover process 123 takes place which provides the HID software 104 in the former HID controller 103 as HID software 107 to the new HID controller 106. By providing such handover process 123, it is assured that the user 101 can make use of the HID 102 in an identical manner, regardless of his position.
  • Such handover may also comprise the handover of the current state of the HID, the HID controller and the DUC.
  • the HID software 104 may be backed up in a central repository or storage entity 110.
  • the backed-up version of the HID software 104 is illustrated by the box with the reference sign 111.
  • Such storage entity 110 may have the function of a home location register (HLR) known from GSM networks and may comprise a master version of the HID software used for a particular HID 102.
  • the back-up process may be performed using a communication protocol 131.
  • the HID software 104 communicates with a DUC 109, e.g. via a middleware like UPnP (Universal Plug and Play) 132.
  • Such middleware should typically be standardized, e.g. as an extension to UPnP.
  • the HID controllers 103 and 106 which comprise the HID software 104 and 107, respectively, are connected to an ideally global network 112 through which the handover communication 123 takes place.
  • the DUC 109 is connected to this network 112 through which the middle ware communication 132 takes place.
  • the storage entity 110 is connected to this network 112 through which the back-up communication 131 takes place.
  • Examples of the network 112 are the Internet, a corporate intranet or a home automation network.
  • Another aspect of the invention relates to the communication between the HID software 104 and the DUC 109, which is preferably performed via a standardized middleware.
  • the DUC 109 is connected to the same network 112, e.g. the Internet, as the HID controller. Such interconnection may be physically performed across a WLAN or a wireline network like Ethernet.
  • the middleware may incorporate means for feature negotiation between the HID 102 and the DUC 109. In other words, the middleware may allow the HID 102 and the DUC 109 to inform the respective other party about their available features.
  • the HID 102 and the DUC 109 may describe their features as capabilities and semantics in a machine-processable way, e.g. by using XML (Extensible Markup Language).
  • the capabilities are a description of the features provided by the devices, i.e. the HID 102 and the DUC 109.
  • the semantics are a description of such features for the user.
  • the semantics associated to a device capability provide a description of such capability in a way that is understandable to a user.
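A machine-processable capability description with attached human-readable semantics might look as follows; the element and attribute names are invented for this sketch, the patent only specifies that XML may be used.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML capability description: machine-readable capability
# names paired with human-readable semantics.
description = """
<device id="tv-109">
  <capability name="media.playback.fast_forward">
    <semantics lang="en">Skips forward through the current recording.</semantics>
  </capability>
</device>
"""

root = ET.fromstring(description)
capabilities = {
    cap.get("name"): cap.findtext("semantics").strip()
    for cap in root.iter("capability")
}
```

An HID controller parsing such a description can use the capability names for negotiation and show the semantics text to the user when a capability is unknown.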
  • If a capability is unknown to either one of the HID 102 or the DUC 109, then such capability cannot be used for communication between the HID 102 and the DUC 109.
  • artificial intelligence may be available that allows the HID 102 and the DUC 109 to understand such unknown capabilities and act on behalf of the user. Otherwise, the unknown capabilities may be brought to the user's attention by showing him the associated semantics. These semantics allow the user to understand the feature and to adapt the HID 102, and possibly also the DUC 109, accordingly, in order to make the capability available for use.
  • each HID 102 and DUC 109 is provided with a basic set of capabilities that every HID 102 and DUC 109 understands. Such basic set of capabilities may be the basis for allowing the involvement and interaction of the user. Examples are textual outputs for receiving instructions from the DUC or an on/off function for activating a DUC. As a further example, it is assumed that a DUC 109 offers a new feature. Such new feature may e.g. be a new functionality provided by a TV receiver. Before such a feature can be used, the user needs to be aware of the availability of the feature and he must understand what the feature does and how it may be operated.
  • the DUC 109 would utilize the basic set of capabilities of one of the user's output HIDs, e.g. to print or to voice output information regarding the semantics of the newly available feature and regarding the DUC 109 on which such new feature is available.
  • the semantics might e.g. consist of a simple iconic description, a text with or without illustrations, pointers to further background information like a manual, tutorial, encyclopedia entry, etc.
  • the user may import the new capability from the DUC 109 or from a database associated with the DUC 109 into the HID controller 103. Consequently, the HID controller 103 knows how to activate a certain capability on the DUC 109, or inversely, how to receive a certain output from the DUC 109.
  • the adaptation to a new feature further comprises assigning a new input HID command, e.g. a gesture, and/or a new output HID representation, like a display menu item, to the new capability. These assignments may be based on defaults included in the capability of the DUC 109 feature, e.g. a default icon for the feature or a default name usable as voice command.
  • the HID controller 103 is aware of the new capability of the DUC 109 and no further engagement by the user is required. The user may use the new capability in accordance to the preferences defined by the user.
  • the described process of adaptation to a new DUC 109 capability has to be done only once. After installation, the feature or capability is known by the user's HID 102 and requires no further engagement by the user when interacting with this kind of DUC or with this kind of DUC model in the future. Furthermore, when other vendors adopt the particular feature into their products, the feature is simply available in the same manner without requiring the user to perform another learning and remembering process to a different way of activating a similar capability in another DUC. This benefit is achieved through the fact that the user can perform exactly the same HID command and/or receive exactly the same HID output, whereas the HID controller 103 performs the necessary translation to and from the instruction set required by the respective DUC. By way of example, a user may perform similar or identical func- tions, such as the access of a web page, on different DUCs, e.g. a PC, a mobile phone or an internet capable TV receiver, using identical HID commands.
  • the HID controller 103 performs a back-up of the HID software 104 or of the changed internal state of the HID software 104 at the storage entity 110. This can be performed by the HID controller 103 when triggered by the HID software, thereby eliminating the need for the HID software to communicate directly with the storage entity 110. This may be beneficial from a security point of view.
  • the capabilities of the HID 102 and the DUC 109 may be unambiguously identified by using a hierarchical naming scheme. Similar device capabilities from different vendors may then be identified by maintaining mappings from the specific device capabilities into generically applicable equivalence classes of capabilities. Known capabilities are thereby classified according to equivalence classes of capabilities. New capabilities may typically not be classified within these equivalence classes, but when a new capability becomes de-facto standard, this new capability may form a new equivalence class and its hierarchical name may become the name of the new capability and its future variants.
  • a basic form of automatism in adaption between the HID 102 and the DUC 109 capabilities may be achieved by using the concept of inheritance, i.e. the organization of capabilities in a hierarchical parent-child manner.
  • the capabilities are then organized in groups of parent or base capabilities, such as the capability of performing a fast forward operation.
  • additional child or sub-capabilities are provided, e.g. the capability of performing a 4 times, an 8 times and a 16 times fast forward operation.
  • An HID 102 that knows the parent or base capability of a DUC 109 is then able to use unknown capabilities at least to the extent known to it by its base capability or base capability equivalence class.
  • the HID 102 would therefore be able to perform a normal fast forward operation, but may not be able to perform the accelerated fast forward operations which are available as sub-capabilities within the capability equivalence class.
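The inheritance-based adaptation described in the preceding bullets can be sketched as a small capability hierarchy. The class names, the hierarchical capability names and the lookup function below are hypothetical illustrations, not part of the patent; they only show how an HID that knows a base capability can use an unknown sub-capability at its base level:

```python
class Capability:
    """A node in a hierarchical capability tree (hypothetical naming scheme)."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

    def ancestors(self):
        """Return this capability's name followed by its parent chain."""
        node, chain = self, []
        while node is not None:
            chain.append(node.name)
            node = node.parent
        return chain

def usable_as(hid_known, duc_capability):
    """Return the most specific capability name of the DUC that the HID
    knows, falling back along the parent chain (inheritance)."""
    for name in duc_capability.ancestors():
        if name in hid_known:
            return name
    return None  # no common base capability: the feature is unusable

# Example tree: "av.playback.ff" is the base fast-forward capability,
# "av.playback.ff.x8" a vendor-specific 8x sub-capability.
ff = Capability("av.playback.ff")
ff_x8 = Capability("av.playback.ff.x8", parent=ff)

# The HID only knows the base class, so the 8x variant degrades to
# a normal fast forward operation.
print(usable_as({"av.playback.ff"}, ff_x8))  # -> av.playback.ff
```

An HID that knows neither the sub-capability nor its base class would receive `None` and could then trigger the training process described below.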
  • the user is still required to perform the above mentioned training and engagement process, but the full adaptation process could be delayed as the user considers appropriate.
  • by way of example, consider a new TV receiver, i.e. a new DUC 109, which comes with a new remote control with a multitude of buttons, each equipped with several functions.
  • a user is able to control the device through a preferred HID 102 and through the use of a preferred set of commands, e.g. per voice command.
  • the HID 102 knows these preferred commands and their associated features or capabilities because it has learned them from the user in the context of training sequences performed when enabling similar capabilities of other DUCs. New features of such DUCs have been included in the available set of capabilities when they became available. Through the use of equivalence classes such capabilities are also available for the control of the new TV receiver.
  • the base station 105 in the hotel room can access the storage entity 110 and download the current HID software 104 and thereby provide the appropriate HID controller 103 for the user's HID 102.
  • the HID controller 103 and/or the DUC 109 may inform the user about unavailable functionalities through a process as outlined above.
  • Another example is the control of an audio/video/room installation in an auditorium or in a video conference room, e.g. the input selection, the volume control, the light control and/or the control of the jalousies (window blinds).
  • the present invention allows a presenter to control such DUCs with the HID 102 and using the commands that he is used to.
  • Fig. 2 shows an embodiment of the communication between an HID 102, an HID controller 103 and a DUC 109.
  • the flow chart 200 describes the communication of a user input from the HID to the DUC, whereas the flow chart 210 shows the provisioning of a DUC output to a user.
  • both communication flows may be provided within the same HID, HID controller and DUC.
  • in step 201, the user performs a user input which is sensed by the HID.
  • the HID does not provide any interpretation of such user input, but only performs the task of measuring such user input, e.g. by registering that a soft button has been pressed on the touch screen of a smart phone.
  • in step 202, the HID transmits the user input to a HID controller which is physically separate from the HID and typically positioned a few meters away from the HID.
  • the HID controller receives the user input in step 203 and identifies its intended meaning in step 204. In other words, the interpretation of a user input is performed by the HID controller. This requires the user input to be unambiguously related to an intended meaning.
  • the user may be prompted in an additional step to specify the user input.
  • the user may be prompted to specify if the turning of a rotary button on the HID is directed at a volume increase of the TV receiver or at an increase of the brightness of the TV screen.
  • the HID may anticipate a user requirement and as a consequence it may anticipate a user input.
  • the HID may register increasing background noise, e.g. a car passing by, and as a consequence, it may trigger the volume increase of the TV receiver.
  • the HID controller performs a mapping of the user input to a corresponding DUC command. This mapping is done based on the identified meaning of the user input and a user profile available at the HID controller.
  • the DUC command is transmitted to the DUC, which is also physically separate from the HID controller.
  • the DUC may be positioned in the vicinity of the HID and/or the HID controller, but it may also be positioned at very remote places. By way of example, a user may program his hard disk recorder at home while being at work. In such cases, the DUC may be far away from the HID controller and the communication between the DUC and the HID controller may be performed via wide area networks, such as the Internet.
  • the DUC receives the DUC command and executes the command in step 208.
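The input flow of flow chart 200 amounts to a lookup through the user profile held by the HID controller. The sketch below is a hypothetical minimal illustration; the profile entries, device names and command strings are invented, and a real HID controller would of course involve far richer interpretation logic:

```python
# Hypothetical user profile: (sensed HID input, target DUC) -> DUC command.
# In the architecture described here, this mapping lives in the HID
# controller (and is backed up at the storage entity), not in the HID.
USER_PROFILE = {
    ("rotary_clockwise", "tv"):     "TV_VOLUME_UP",
    ("rotary_clockwise", "dimmer"): "DIMMER_BRIGHTER",
}

def hid_controller_map(user_input, target_duc):
    """Steps 203-205: receive the bare user input, identify its intended
    meaning for the selected DUC, and map it to a DUC command."""
    try:
        return USER_PROFILE[(user_input, target_duc)]
    except KeyError:
        # The meaning is ambiguous or unknown: prompt the user to
        # specify the input (the additional step described above).
        return "PROMPT_USER_TO_SPECIFY"

# The same HID gesture controls different parameters on different DUCs.
print(hid_controller_map("rotary_clockwise", "tv"))      # -> TV_VOLUME_UP
print(hid_controller_map("rotary_clockwise", "dimmer"))  # -> DIMMER_BRIGHTER
```

Note how the HID itself never interprets the rotary turn; the same raw input is disassociated from the DUC-specific command by the controller.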
  • the flow chart 210 illustrates the reverse case of a communication from the DUC to the user.
  • the DUC generates a DUC output, e.g. a menu provided by a TV receiver.
  • This DUC output is sent to the HID controller in step 212.
  • the HID controller receives the DUC output in step 213 and determines the user output which corresponds to the DUC output in step 214.
  • the HID controller may know that the user prefers to receive menus from the DUCs as output on the touch screen of his smart phone. Consequently, the HID controller knows that the DUC output needs to be translated into a GUI adapted for the user's smart phone.
  • the translation is performed in step 215, i.e. a GUI is built up from the menu information received from the DUC.
  • Such GUI may comprise soft buttons which allow the user to send commands back to the DUC, simply by pressing the soft button on the touch screen of the smart phone.
  • the user output, i.e. the GUI in the present example, is transmitted to the HID, which receives the user output in step 217.
  • the user output is rendered to the user.
  • the HID controller may be aware of certain usage scenarios, e.g. of a usage scenario when traveling and another usage scenario when being at home. The user input and the user output would then be interpreted/rendered depending on the situation.
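The reverse flow of flow chart 210 can be sketched in the same style: the HID controller receives a DUC output and builds a user output matching the user's preferences. The data structures below (menu lists, preference dictionary, soft-button records) are hypothetical illustrations, not anything the patent prescribes:

```python
def translate_duc_output(duc_menu, user_prefs):
    """Steps 213-215: receive the DUC output (here a list of menu item
    labels) and translate it into the user output preferred by the user."""
    if user_prefs.get("render") == "touchscreen_gui":
        # Build a GUI of soft buttons; each soft button, when pressed,
        # would send a command back to the DUC (the reverse flow 200).
        return [{"type": "soft_button", "label": item, "sends": item.upper()}
                for item in duc_menu]
    # No preference known: pass the raw DUC menu through unchanged.
    return duc_menu

# A TV receiver menu rendered as soft buttons on the user's smart phone.
gui = translate_duc_output(["Play", "Record"], {"render": "touchscreen_gui"})
print(gui[0]["label"])  # -> Play
```

A usage-scenario-aware controller, as mentioned above, could simply select a different `user_prefs` dictionary depending on whether the user is traveling or at home.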
  • Fig. 3 shows a flow chart of an embodiment of the registration and handover process between an HID 102 and an HID controller 103.
  • the HID is in search mode and tries to identify a base station and/or a standardized execution environment to which it can connect. If such a base station has been identified, then in step 302 it is determined whether the base station may act as a HID controller to the HID.
  • the standardized execution environment of the base station can act as a HID controller to the HID, if the respective HID software can be downloaded from a central repository and be executed on the standardized execution environment. If this is not the case (step 311), then the HID goes back to step 301 in order to identify other possible base stations.
  • if the base station can act as a corresponding HID controller (step 312), then the method proceeds to step 303, where the HID controller is installed on the base station and where the HID registers with the HID controller.
  • in step 304, the HID communicates with DUCs via the HID controller as outlined in relation to Fig. 2.
  • the HID regularly verifies its connection to the current base station and scans for other available base stations.
  • a second base station is identified. It is determined whether this base station can implement the functionality of a second HID controller and whether the latency to such second HID controller would be better than the latency to the current HID controller. If the second base station cannot take on the function of a second HID controller, or if the current latency is still sufficient, the method proceeds to verification step 308 and the HID remains registered with the current HID controller.
  • if, on the other hand, it is determined that the second HID controller would improve latency (step 315), then a handover is performed to the second HID controller in step 307, which includes the steps of installing the HID controller on the second base station, transferring context from the HID controller on the first base station to the HID controller on the second base station, switching communication between HID and HID software from the first to the second base station, and unloading the HID software from the execution environment of the first base station after some time.
  • the HID registers with the second HID controller in step 304.
  • if it is determined in step 308 that the connection to the current HID controller becomes too weak (step 314), the HID is disconnected from the HID controller, the HID controller is saved back to the central repository in step 306, and the HID returns to the search mode of step 301. Otherwise the HID remains connected to the current base station (step 313).
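The registration and handover logic of Fig. 3 reduces to a small decision rule around the latency comparison of step 315. The sketch below is a hypothetical simplification: the latency threshold, its value, and the function signature are invented for illustration only:

```python
def should_handover(current_latency_ms, candidate_latency_ms,
                    candidate_can_host_controller, sufficient_ms=50):
    """Decide whether to hand over to a second base station (steps 305/315).

    Hand over only if the candidate base station can run the HID
    controller AND the current latency is no longer sufficient AND the
    candidate would actually improve it. The 50 ms threshold is an
    assumed, illustrative value, not one given in the description.
    """
    if not candidate_can_host_controller:
        return False  # step 302 analogue: cannot act as HID controller
    if current_latency_ms <= sufficient_ms:
        return False  # latency sufficient: stay registered (step 308)
    return candidate_latency_ms < current_latency_ms  # step 315

print(should_handover(120, 30, True))   # -> True  (handover, step 307)
print(should_handover(40, 10, True))    # -> False (current latency sufficient)
```

Keeping such a hysteresis-style rule in the HID avoids ping-ponging between two base stations of similar latency while the user moves around.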
  • the present invention comprises various aspects, in particular
  • a standardized communication between the HID, i.e. its HID controller and/or HID software 104, and the controlled device through the use of a middleware. This makes the HID reusable for many DUCs and thus avoids the time-consuming adaptation through the personalization process, notably the training phase.
  • a standardized mobile execution environment for the use of HID controllers.
  • by running the HID software on a standardized mobile execution environment, the computational limits of the HID are compensated through outsourcing of the computationally complex and energy consuming tasks from the HID into the execution environment and/or into the network.
  • a standardized execution environment allows the execution environment comprising the HID controller and the respective HID software to follow the HID as the user moves around, in order to keep communication latency low and interface reactivity high.
  • an unobtrusive wearable apparatus to be used as a human interface device, which allows the control of DUCs by e.g. gestures, gaze, speech or thought.
  • Such control can involve awareness of context and/or situation.
  • the use of femto cells as wireless interface access points.
  • a radio access network, i.e. femto base stations with access to a storage entity and mobility support, for the provision of a mobile execution environment.
  • an architecture for a man-machine interface has been described which allows the provision of small, possibly wearable and unobtrusive, human interface devices.
  • Such a reduction in size is achieved by limiting the human interface device to the sensing of an input performed by the user and/or the rendering of an output to the user.
  • the actual interpretation of the user input and/or the designing of the output according to user preferences are performed in a separate HID controller typically comprising a HID software.
  • the HID and the HID controller communicate via wireless access technologies. Due to the fact that the HID controller is not carried around by the user, it may comprise significant computing capacities in order to implement a ubiquitous man-machine interface, which disassociates the user input/output from the commands sent to and received from the DUC.
  • the HID software corresponding to a user's HID may be centrally stored in a storage entity. Furthermore, the HID software may be provided in a handover process to other wireless access points, in order to provide mobility of the user and his HID.
  • with the man-machine interface architecture outlined in the present document, the shortcomings of the prior art referred to in the introductory section may be overcome.
  • the logical separation of the HID and its commands and outputs from the DUC and its commands and outputs allows for a man- machine interface which is mobile and adaptable to the user preferences.

Abstract

The invention relates to man-machine interfaces. In particular, the invention relates to a man-machine interface which provides a ubiquitous communication between a user and a plurality of controlled devices. A system (100) is described which provides ubiquitous interfacing between a user (101) and a device under control, DUC (109). The system (100) comprises a human interface device, HID (102), operable to detect a user input from the user (101) and to transmit the user input to a HID controller (103). The system (100) further comprises the HID controller (103) which is physically separate from the HID (102) and the DUC (109) and operable to receive the user input from the HID (102), to relate and translate the user input to a corresponding DUC command and to transmit the DUC command to the DUC (109) for execution. The separation of HID and DUC makes the man-machine interface mobile and adaptable to the user preferences.

Description

Ubiquitous Interface Device Split
The invention relates to man-machine interfaces. In particular, the invention relates to a man-machine interface which provides a ubiquitous communication between a user and a plurality of controlled devices.
Today, every device provides its own human interface. By consequence, a user is faced with different remote controls for e.g. a TV receiver, a stereo, a DVD player or a home server. In addition, various human interfaces are provided for other appliances, such as kitchen appliances, communication systems, e.g. telephones, and computing systems, e.g. desktop PCs or laptops. Although there are de-facto standards, e.g. the functions used to control players/recorders like record, play, forward, rewind, skip forward, skip back and their associated icons, most interfaces differ to some extent, e.g. in the placement of the buttons for these functions on a remote control. Furthermore, the more complex and/or special the functions become, the more the interfaces differ. In consequence, a user has to learn and remember, again and again separately for every device, how to control it. This becomes especially tedious, if interfaces only differ slightly so that mixing up commands and misconceiving outputs happens regularly.
Therefore it is desirable to provide ubiquitous man-machine interfaces which are adapted to the control habits of a user, so that a user does not need to adapt to the different interfaces of different controlled devices. This removes the need for a user to re-learn the principles of how to use similar or same functions on different devices. It is furthermore desirable to provide a human interface that is small and easy to carry for a user. In particular, it might be beneficial that such human interfaces are wearable and/or unobtrusive. A proposed solution therefore should take into account that battery capacity and computational power of such small human interfaces are severely limited and artificial intelligence is typically not available. By consequence, the proposed solution of a human interface should fulfill the two opposing requirements of flexible user adaptability and limited size.
A further aspect to be considered is that human interfaces typically provide input capabilities which allow a user to provide information and/or control data to a so-called device under control (DUC), e.g. a TV receiver or a DVD player, as well as output capabilities which provide feedback or recommendations from the DUC to the user. Examples for input interfaces comprise "learning remote controls" which are programmed either manually or automatically by receiving from the device under control the DUC's infrared or Bluetooth "command to function" mappings, which may eventually be accompanied by a suitable button layout for the remote control. Another example for such input interfaces are configurable user interfaces which are based e.g. on soft keys. Examples for output interfaces are feedback screens on a TV receiver or beep tones from a mobile telephone. However, only very few DUCs allow customization of their output interface in layout and content.
It is therefore also desirable to provide a human interface that allows for adaptability to the user preferences both for input to the DUC, as well as for output from the DUC. Furthermore, the interface should be applicable to practically all device control tasks, e.g. to control machines, one's PC/Desktop, consumer electronics or appliances in a smart home.
Examples for human interfaces which are also referred to as human interface devices (HID) are gesture recognizing wrist bands, collars detecting sub- vocalization, retina projectors, in-ear speakers, gaze detecting barrettes or even brain computer interfaces. Such HIDs are particularly well suited for the design of wearable unobtrusive wireless interfaces.
The present patent description provides a solution to the above mentioned problems known from the prior art. A basic idea of this solution is to split the interface, i.e. the HID, from the device under control, i.e. the DUC. By doing this, the interface can be permanently customized and adapted to the user's preferences, habits, capabilities etc. Through the use of middleware which handles the communication between the interface and the device, it is possible to allow arbitrary interfaces to control arbitrary devices. In order to allow for a large variety of HIDs to communicate with and control a large variety of DUCs, it is preferable that such middleware be standardized.
According to a first aspect of the invention a first system providing ubiquitous interfacing between a user and a device under control (DUC) is described. The system comprises a human interface device (HID) operable to detect a user input from the user and to transmit the user input to a HID controller. In other words, the HID senses the user input without performing additional processing such as a translation of the user input into a command which is executable by the DUC. In particular, the HID does not perform an interpretation of the user input, notably an interpretation with regards to an intended DUC command. Typical examples of HIDs or HID components are a wearable unobtrusive wireless interface, a microphone, a speaker, a display, a keyboard, a mouse, a gesture recognizing wrist band, a collar detecting sub-vocalization, a retina projector, an in-ear speaker, a gaze detecting barrette, a body area network comprising a common relay device for communication with the HID controller, a Graphical User Interface executed on an electronic device, a smart phone or a brain computer interface. Typical examples of DUCs are e.g. machines, personal computers, consumer electronics and smart home components, such as light dimmers, electrical stores, kitchen appliances, etc. By way of example, an HID may be implemented as a graphical user interface (GUI) on a smart phone or a consumer device such as an MP3 player. Such a graphical user interface may be used to select a certain DUC, e.g. a stereo system to which a user input is directed. The GUI provides a set of possible user inputs which are predefined by the user and which correspond to the user's input preferences. By way of example, a volume control may be represented as a rotary button that is turned on the touch screen of the smart phone in order to increase the volume.
The same rotary button may also be shown in conjunction with the control of light dimmers in a room or with the control of the volume of a TV receiver. As such, the rotary button may be the preferred user input to indicate a smooth increase/decrease of the controlled value. Depending on the DUC, such user input will control a different parameter or capability of the DUC.
The first system comprises further the HID controller which is physically separate from the HID and the DUC. In other words, the HID controller is not co-located with the HID and the DUC, but preferably situated at a home server or a base station. By way of example, the HID controller may be implemented on a residential base station such as a WLAN access point or a femto base station. Typically, such access points are standardized HW (hardware) platforms or standardized execution environments so that the HID controller may be implemented as HID software running on such standardized HW platform.
The HID controller is operable to receive the user input from the HID, relate and translate the user input to a corresponding DUC command and transmit the DUC command to the DUC for execution. In other words, the HID controller performs the interpretation of the bare user input and identifies the meaning of the user input which was intended by the user. Based on such interpretation a DUC command is generated and passed to the DUC. The generation of the DUC command may be done by mapping the user input to a DUC command using a correspondence table. In the example relating to the rotary button, a user input signaling a partial turn of the button may be received at the HID controller. Furthermore, the HID controller may have knowledge, e.g. from former user input, about which DUC the user input relates to, e.g. to the stereo system. The HID controller maps the user input, i.e. the partial turn of the button, to a DUC command, i.e. to a command indicating to the stereo system that the volume is to be increased by a certain amount. If, on the other hand, the HID controller is informed that the user input relates to another DUC, e.g. the light dimmer, then the HID controller translates the user input into a command instructing the light dimmer to increase the light by a certain amount.
In addition, the first system may also comprise the DUC which is controlled. The DUC is operable to receive the DUC command and to execute the DUC command. Furthermore, the system may comprise elements of the system providing DUC output to a user, as outlined below.
As outlined above, the DUC, the HID and the HID controller are physically separate. The HID, the HID controller and the DUC may be connected via a communication infrastructure, such as the Internet. A portable/wearable HID is typically connected via a wireless access infrastructure, such as a WLAN, a Bluetooth, a WiMAX, or an IEEE 802.16m network, while a stationary HID, e.g. a web camera used for gesture recognition, may be wire-bound.
According to another aspect of the invention, a second system providing ubiquitous interfacing between a DUC and a user is described. The system provides a DUC output to the user in a way that is in accordance with user preferences. Typically this system is provided in combination with the system of providing a user input to a DUC, as outlined above.
The second system comprises an HID controller which is physically separate from an HID and the DUC and which is operable to receive a DUC output from the DUC, to determine a user output corresponding to the DUC output, to translate the DUC output into the corresponding user output and to transmit the user output to the HID. In other words, the HID controller arranges/structures and transforms the DUC output according to the user's preferences. By way of example, if the user uses a smart phone as a HID, the user may wish to receive the feedback from a DUC, e.g. his stereo system or TV receiver, on the touch screen of his smart phone. A TV receiver might send interactive menu information to the user which is transformed by the HID controller into a user output adapted for output on the smart phone of the user. Such user output may be based on a certain graphical user interface adapted to the user's preferences.
The second system comprises further the HID which is operable to receive the user output and to render the user output to the user. In the context of the above example, the smart phone is able to provide the user output to the user, e.g. by displaying the interactive TV receiver menu on the touch screen, or by providing sound alerts, etc. It should be noted that the system may further comprise the DUC which is operable to generate the DUC output and to transmit the DUC output to the HID controller. Furthermore, the system may also be provided in combination with a system for providing user input to a DUC. By way of example, the interactive menu displayed on the touch screen on the smart phone may comprise soft buttons which are used to control the DUC.
According to a further aspect of the invention, the above mentioned systems may further comprise a storage entity which is operable to store a user profile comprising a mapping between the user input and the corresponding DUC command and/or a mapping between the DUC output and the corresponding user output. In other words, the systems may comprise a storage entity which provides a central repository for the translation functionality of the HID controller. As such the storage entity may also be used as a back-up system for the HID controller. The storage entity may store user profiles which are used by the HID controller to perform the translation between user preferred input/output and DUC command/outputs. It may also store the user specific HID software which comprises the user profiles and also the specific logic for communication with the HID.
The central repository functionality of the storage entity may be used to provide the user profile and/or the HID software to various different HID controllers and/or HW platforms/execution environments. Such different HID controller or HW platform may be operable to download the user profile or HID software from the storage entity and perform the translation based on the user profile or by executing the HID software. By doing this, any standardized HW platform, e.g. a base station, may be transformed into a user specific HID controller, thereby allowing a user to connect his HID to an arbitrary base station. By doing this, network coverage for a user's HID may be provided.
This functionality may be used to provide HID mobility. In an embodiment the systems comprise a plurality of HID controllers or a plurality of standardized HW platforms which may take on the function of a user and/or HID specific HID controller by downloading the user profile and/or HID software from a central repository. In such scenarios, the HID is operable to register with a first of the plurality of HID controllers in order to transmit the user input or receive the user output. In other words, the HID may select an appropriate HID controller which is within wireless reach and establish a communication with this HID controller, in order to send a user input to a DUC or to receive a user output from the DUC. If a plurality of HID controllers is within reach, the HID may register with the HID controller that provides the best reactivity of the HID to the user, e.g. the HID controller that has the lowest latency for communication between the HID and the HID controller and/or that has the highest available computing power.

According to a further aspect of the invention, the network of HID controllers or standardized HW platforms/standardized execution environments may also provide seamless handover when an HID moves from one HID controller to another HID controller. As the HID moves from one location to another, the HID is operable to register with a second of the plurality of HID controllers and the first and second HID controllers are operable to perform a smooth/seamless handover. Such seamless handover from the first HID controller to the second HID controller may be implemented by passing the user profile and/or the HID software from the first base station to the second base station and thereby provide a copy of the first HID controller at the second location. The handover may also be implemented by making use of the storage entity which comprises a back-up copy of the user profile and/or HID software.
Prior to handover, the second base station may download the HID software from the storage entity and thereby prepare the second HID controller. By doing this, only a reduced amount of data, e.g. only the latest state or user profile updates, need to be transferred from the first HID controller to the second HID controller at the actual moment of handover.
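The pre-download optimisation just described can be sketched numerically. All quantities below (the size of the HID software, the size of the state delta) are hypothetical illustrative values; the point is only that pre-downloading from the storage entity shrinks the data transferred at the actual moment of handover:

```python
class StorageEntity:
    """Central repository holding a back-up copy of the user's HID
    software (hypothetical size attribute for illustration)."""
    def __init__(self, hid_software_mb):
        self.hid_software_mb = hid_software_mb

def handover_transfer_mb(storage, state_delta_mb, predownloaded):
    """Data the first HID controller must transfer at the moment of
    handover to the second base station."""
    if predownloaded:
        # Second base station already fetched the HID software from the
        # storage entity: only the latest state/profile updates move.
        return state_delta_mb
    # Otherwise the full HID software must cross as well.
    return storage.hid_software_mb + state_delta_mb

repo = StorageEntity(hid_software_mb=25.0)
print(handover_transfer_mb(repo, 0.2, predownloaded=True))   # -> 0.2
print(handover_transfer_mb(repo, 0.2, predownloaded=False))  # -> 25.2
```

The smaller transfer at handover time is what keeps the interface reactive while the user moves between base stations.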
According to another aspect of the invention, the DUC is operable to execute a plurality of DUC commands and/or the DUC is operable to generate a plurality of DUC outputs. The plurality of DUC commands and/or DUC outputs define a set of capabilities of the DUC. Furthermore, the HID may be operable to detect a plurality of different user inputs and/or the HID is operable to render a plurality of different user outputs. The plurality of user inputs and/or user outputs defines a set of capabilities of the HID. The HID controller may be operable to provide a mapping between a subset of capabilities of the HID and a subset of capabilities of the DUC. In other words, the HID controller provides a translation for a basic capability set which comprises the set of joint capabilities of the HID and the DUC. The set of capabilities of the DUC and/or the set of capabilities of the HID may be organized using a hierarchical naming scheme, where similar capabilities are grouped in equivalence classes. Furthermore, the equivalence classes may use the concept of inheritance comprising a parent capability and at least one child capability associated with the parent capability. By way of example, a parent capability may be a fast forward operation. Possible child operations may be different speeds of fast forward operations.
According to another aspect of the invention an HID controller is described which provides ubiquitous interfacing between an HID and a DUC. The HID controller is physically separate from the HID and the DUC and it is operable to receive from the HID a user input. Such user input has been detected by the HID as outlined above. The HID controller relates and translates the user input to a corresponding DUC command and transmits the DUC command to the DUC for execution. In addition, the HID controller may be operable to receive a DUC output from the DUC and to determine a user output corresponding to the DUC output. It may further translate the DUC output into the corresponding user output and send the user output to the HID for rendering to a user. The HID controller may comprise the aspects outlined in conjunction with the system and method of the present document.
According to a further aspect of the invention a method for controlling a DUC by a user is described. The method comprises the steps of detecting a user input at a HID and transmitting the user input to a HID controller which is physically separate from the HID and the DUC. The method proceeds in receiving the user input at the HID controller, as well as in relating and translating the user input to a corresponding DUC command. Then, the DUC command is transmitted for execution to the DUC. Finally, the DUC may receive and execute the DUC command. Furthermore, a method for providing output from a DUC to a user is described. The method comprises the steps of receiving a DUC output generated by the DUC at a HID controller. Then a user output corresponding to the DUC output is determined and the DUC output is translated into the corresponding user output. The method proceeds in transmitting the user output to a HID which is physically separate from the HID controller. Then, the user output is received at the HID and rendered to the user.
It should be noted that the above mentioned aspects of the invention may be combined with one another or extracted from one another in various ways. In particular, all possible claim and feature combinations are considered to be disclosed by the present document. Furthermore, the aspects and features outlined in relation with a system are equally applicable in relation to the corresponding method.
The objects and features of the invention will become apparent from the following description of preferred embodiments. The present invention is described in the following by referring to exemplary embodiments illustrated schematically in the accompanying figures, wherein
Fig. 1 illustrates the communication between a human interface device and a device under control according to an embodiment of the invention;
Fig. 2 shows a flow diagram describing an embodiment of the communication between HID, HID controller and DUC; and
Fig. 3 illustrates an embodiment of the registration and handover process of an HID.
One of the aspects of the present invention is that the human interface device (HID) senses the human input and/or renders the output for the human without further processing and/or interpretation of the user input and/or user output. It thereby only requires low computational power and consumes little energy. As a consequence, the HID can be made small, wearable and unobtrusive. In one embodiment, the HID sends input and/or receives output via a standardized, energy-efficient, medium-range (i.e. in the order of tens of meters) wireless access infrastructure. A possible wireless access infrastructure is a power-reduced IEEE 802.16m, i.e. evolved WiMAX, access network in a femto cell deployment that connects the HID to the communication infrastructure, e.g. to the Internet. Other wireless communication infrastructures may also be used, such as power-controlled WLAN, Bluetooth or LTE Advanced.
In order to further reduce power consumption, the HID may be part of a body area network comprising a separate human interface device and a separate component for communication with the base station. As a consequence, the HID only requires ultra-low-power near-field communication, i.e. in the order of tens of centimeters, e.g. ultra wide band (UWB). A slightly larger wearable relay device or communication component then interfaces the body area network with the medium-range wireless access infrastructure.
The particular way in which the input is interpreted and/or what output is to be rendered is not determined by the HID itself, but is off-loaded into an HID controller which may be, for example, the femto base station of the wireless access infrastructure or a home server. Such an HID controller does not suffer from the severe restrictions on size, energy and computational power that limit the HID. Therefore, such HID controllers may enable a good adaptation to the user preferences through the use of complex adaptation algorithms. Such an algorithm may recognize changes in the user's particular way of controlling DUCs and re-adjust its recognition parameters accordingly. The HID controller may also evaluate usage patterns in order to identify the preferences of a user for providing commands to a DUC and/or for receiving feedback from a DUC. Furthermore, the HID controller may perform the actual translation between the perceived user input and the intended command to the DUC. For outputting data from the DUC to the user, the HID controller may rearrange and/or filter the displayed output content according to the preferences, limits and/or guidelines set out by the user.
In a preferred embodiment, the HID controller is positioned close to the HID itself, e.g. directly within a femto base station. The physical proximity between the HID and the HID controller keeps communication latencies low and makes the HID as reactive as if the HID controller and its HID software were executed directly within the HID itself. It should be noted that the HID controller, which typically comprises a software program that interprets input and/or renders the output, is directly related to the particular HID used by the user. In other words, the HID controller is directly associated with the HID and may therefore be adapted to its technical characteristics.
As mentioned above, the HID controller may comprise software which allows for the communication between the HID and the HID controller and which performs the interpretation of the HID inputs and the rendering of the HID outputs. Such HID software may be downloaded from a storage entity into the HID controller for execution in its standardized runtime environment. The storage entity may be of any appropriate form, e.g. centralized on a server like a home location register or distributed as a peer-to-peer network implementing a distributed hash table like Chord. The latter also allows the HID controller to be part of the storage entity itself. The execution environment of the HID controller should allow the HID software to communicate with its HID and with the device that the user wishes to control through this HID, i.e. the device under control (DUC). The execution environment of the HID controller may additionally allow the HID software to communicate directly with the storage entity for performing a back-up of its internal state, e.g. for performing a back-up of adaptations to user preferences or newly added control and/or output capabilities. Furthermore, the DUC status may also be stored in the HID controller and possibly backed up in the storage entity.
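The back-up of HID software state to such a storage entity can be sketched as follows. The class and method names are illustrative assumptions, and the SHA-1 key derivation merely hints at the consistent hashing a Chord-like distributed hash table would use; it is not a prescribed scheme:

```python
import hashlib

class StorageEntity:
    """Toy stand-in for the storage entity described above: HID software
    and its internal state are stored under a key derived from the HID
    identity. In a Chord-like peer-to-peer deployment, these keys would
    be distributed across the participating nodes."""

    def __init__(self):
        self._store = {}

    @staticmethod
    def key_for(hid_id):
        # Chord assigns identifiers via consistent hashing; SHA-1 is
        # used here only to illustrate the key derivation.
        return hashlib.sha1(hid_id.encode()).hexdigest()

    def backup(self, hid_id, software_state):
        """Store (or overwrite) the backed-up state for this HID."""
        self._store[self.key_for(hid_id)] = software_state

    def restore(self, hid_id):
        """Return the last backed-up state, or None if unknown."""
        return self._store.get(self.key_for(hid_id))
```

A back-up followed by a restore on a different HID controller yields the same software state, which is what allows an execution environment elsewhere to recreate the user's personalized HID software.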
In a further embodiment, mobility may be provided through a network of base stations, wherein each base station comprises an HID controller. When a user moves or travels, he/she may move away from the base station and the HID controller that his/her HID is currently connected to. Eventually, another base station and HID controller may become the nearest, and it would be desirable that the HID can connect to the now nearest base station. For this purpose, the HID software, including its current internal state, may be transferred from the current HID controller to the new HID controller, in order to keep the communication latency between the HID and its HID controller and the respective HID software small. Furthermore, through the use of such handover between base stations, a continuous network of HID controllers associated with the respective HID can be established. This will provide an enlarged coverage, as well as improved connectivity between the HID and the associated HID controller and/or HID software. In the case of femto base stations, locating the nearest HID controller and transferring the HID software can be integrated into the handover process. Otherwise, a standardized HID controller location discovery and context transfer protocol may need to be implemented by the base stations and HID controllers.
An example of the man-machine interface architecture 100 according to the invention is illustrated in Fig. 1. A user 101 uses a preferred HID 102. This HID 102 may for example be a microphone through which the user 101 may provide voice input. Alternatively or in addition, the user 101 may use a small speaker as an output HID 102 in order to receive feedback from devices under control (DUCs). The HID 102 connects with a base station 105 which is associated with an HID controller 103 running on a possibly standardized runtime environment. The HID controller 103 comprises HID software 104 which communicates with the HID 102 via an HID communication protocol 130. The physical communication between the HID 102 and the base station 105 is performed by a wireless access communication protocol 121, such as WLAN or WiMAX.
The base station 105 may for example be positioned in the user's living room. As the user moves in a direction indicated by the arrow 120, another base station 108, e.g. provided in the kitchen, may come into reach and may eventually be closer than the initial base station 105. In such a case, a wireless connection 122 may be established between the HID 102 and the base station 108. In parallel, a handover process 123 takes place which provides the HID software 104 in the former HID controller 103 as HID software 107 to the new HID controller 106. By providing such a handover process 123, it is assured that the user 101 can make use of the HID 102 in an identical manner, regardless of his position. Such handover may also comprise the handover of the current state of the HID, the HID controller and the DUC.
The HID software 104 may be backed up in a central repository or storage entity 110. The backed-up version of the HID software 104 is illustrated by the box with the reference sign 111. Such storage entity 110 may have the function of a home location register (HLR) known from GSM networks and may comprise a master version of the HID software used for a particular HID 102. The back-up process may be performed using a communication protocol 131.
The HID software 104 communicates with a DUC 109, e.g. via a middleware like UPnP (Universal Plug and Play) 132. In order to allow communication with a large variety of DUC types, such middleware should typically be standardized, e.g. an extension to UPnP.
As shown in Fig. 1, the HID controllers 103 and 106, which comprise the HID software 104 and 107, respectively, are connected to an ideally global network 112 through which the handover communication 123 takes place. The DUC 109 is also connected to this network 112, through which the middleware communication 132 takes place. In addition, the storage entity 110 is also connected to this network 112, through which the back-up communication 131 takes place. Examples of the network 112 are the Internet, a corporate intranet or a home automation network.
Another aspect of the invention relates to the communication between the HID software and the DUC 109, which is preferably performed via a standardized middleware. The DUC 109 is connected to the same network 112, e.g. the Internet, as the HID controller. Such interconnection may be physically performed across a WLAN or a wireline network like Ethernet. Besides exchanging control commands from the HID 102 to the DUC 109 and information to render output from the DUC 109 to the HID 102, the middleware may incorporate means for feature negotiation between the HID 102 and the DUC 109. In other words, the middleware may allow the HID 102 and the DUC 109 to inform the respective other party about their available features. In order to be able to determine the intersection of the features of the HID 102 and the DUC 109 which are available for control and/or output rendering, the HID 102 and the DUC 109 may describe their features as capabilities and semantics in a machine-processible way, e.g. by using XML (Extensible Markup Language). In this context, the capabilities are a description of the features provided by the devices, i.e. the HID 102 and the DUC 109. The semantics are a description of such features for the user. In other words, the semantics associated with a device capability provide a description of such capability in a way that is understandable to a user. If a capability is unknown to either one of the HID 102 or the DUC 109, then such capability cannot be used for communication between the HID 102 and the DUC 109. In certain cases, artificial intelligence may be available that allows the HID 102 and the DUC 109 to understand such unknown capabilities and act on behalf of the user. Otherwise, the unknown capabilities may be brought to the user's attention by showing him the associated semantics. These semantics allow the user to understand the feature and to adapt the HID 102, and possibly also the DUC 109, accordingly, in order to make the capability available for use.
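As a sketch of such feature negotiation, the following shows how an HID controller might parse XML capability descriptions from both devices and determine their intersection. The XML structure, element names and capability names are illustrative assumptions, not a standardized format:

```python
import xml.etree.ElementTree as ET

# Hypothetical capability descriptions of an HID and a DUC; each entry
# carries a hierarchical capability name plus user-readable semantics.
HID_XML = """<capabilities>
  <capability name="media.playback.play" semantics="Start playback"/>
  <capability name="media.playback.stop" semantics="Stop playback"/>
  <capability name="power.toggle" semantics="Switch device on or off"/>
</capabilities>"""

DUC_XML = """<capabilities>
  <capability name="media.playback.play" semantics="Start playback"/>
  <capability name="media.playback.fastforward" semantics="Fast forward"/>
  <capability name="power.toggle" semantics="Switch device on or off"/>
</capabilities>"""

def parse_capabilities(xml_text):
    """Return {hierarchical name: semantics} from a capability description."""
    root = ET.fromstring(xml_text)
    return {c.get("name"): c.get("semantics") for c in root.iter("capability")}

def negotiate(hid_xml, duc_xml):
    """Compute the capabilities usable between this HID and DUC; the
    remainder is unknown to one side and may be shown to the user via
    its semantics so he can decide whether to adopt it."""
    hid, duc = parse_capabilities(hid_xml), parse_capabilities(duc_xml)
    usable = sorted(set(hid) & set(duc))
    unknown_to_hid = {n: duc[n] for n in set(duc) - set(hid)}
    return usable, unknown_to_hid
```

With the descriptions above, `negotiate` would report play and power toggle as usable, while the fast forward capability is returned with its semantics as unknown to the HID.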
It may be beneficial that each HID 102 and DUC 109 is provided with a basic set of capabilities that every HID 102 and DUC 109 understands. Such a basic set of capabilities may be the basis for allowing the involvement and interaction of the user. Examples are textual outputs for receiving instructions from the DUC or an on/off function for activating a DUC. As a further example, it is assumed that a DUC 109 offers a new feature. Such a new feature may e.g. be a new functionality provided by a TV receiver. Before such a feature can be used, the user needs to be aware of the availability of the feature, and he must understand what the feature does and how it may be operated. In order to provide such information to the user, the DUC 109 would utilize the basic set of capabilities of one of the user's output HIDs, e.g. to print or voice output regarding the semantics of the newly available feature and regarding the DUC 109 on which such new feature is available. The semantics might e.g. consist of a simple iconic description, a text with or without illustrations, pointers to further background information like a manual, tutorial, encyclopedia entry, etc. When the user has understood the feature, he is able to decide and adapt his HID 102 in order to be able to use the new feature on the respective DUC 109. On the other hand, in case the user considers this feature to be unnecessary, he could ignore it and consequently would also let his HID 102 and the HID controller 103 ignore the feature in the future.
In order to make the new feature available, the user may import the new capability from the DUC 109 or from a database associated with the DUC 109 into the HID controller 103. Consequently, the HID controller 103 knows how to activate a certain capability on the DUC 109, or inversely, how to receive a certain output from the DUC 109. The adaptation to a new feature further comprises assigning a new input HID command, e.g. a gesture, and/or a new output HID representation, like a display menu item, to the new capability. These assignments may be based on defaults included in the capability of the DUC 109 feature, e.g. a default icon for the feature or a default name usable as a voice command. After such installation, the HID controller 103 is aware of the new capability of the DUC 109 and no further engagement by the user is required. The user may use the new capability in accordance with the preferences defined by the user.
The described process of adaptation to a new DUC 109 capability has to be done only once. After installation, the feature or capability is known by the user's HID 102 and requires no further engagement by the user when interacting with this kind of DUC or with this kind of DUC model in the future. Furthermore, when other vendors adopt the particular feature into their products, the feature is simply available in the same manner, without requiring the user to perform another learning and remembering process for a different way of activating a similar capability in another DUC. This benefit is achieved through the fact that the user can perform exactly the same HID command and/or receive exactly the same HID output, whereas the HID controller 103 performs the necessary translation to and from the instruction set required by the respective DUC. By way of example, a user may perform similar or identical functions, such as the access of a web page, on different DUCs, e.g. a PC, a mobile phone or an internet-capable TV receiver, using identical HID commands.
In order to make the adaptation to a new capability permanent, the HID controller 103 performs a back-up of the HID software 104, or of the changed internal state of the HID software 104, at the storage entity 110. This can be performed by the HID controller 103 when triggered by the HID software, thereby eliminating the need for the HID software to communicate directly with the storage entity 110. This may be beneficial from a security point of view.
In a further embodiment, the capabilities of the HID 102 and the DUC 109 may be unambiguously identified by using a hierarchical naming scheme. Similar device capabilities from different vendors may then be identified by maintaining mappings from the specific device capabilities into generically applicable equivalence classes of capabilities. Known capabilities are thereby classified according to equivalence classes of capabilities. New capabilities may typically not be classified within these equivalence classes, but when a new capability becomes a de-facto standard, this new capability may form a new equivalence class, and its hierarchical name may become the name of the new capability and its future variants.
In this context, a basic form of automatism in the adaptation between the HID 102 and the DUC 109 capabilities may be achieved by using the concept of inheritance, i.e. the organization of capabilities in a hierarchical parent-child manner. The capabilities are then organized in groups of parent or base capabilities, such as the capability of performing a fast forward operation. Within a group of a base capability, additional child or sub-capabilities are provided, e.g. the capability of performing a 4 times, an 8 times and a 16 times fast forward operation. An HID 102 that knows the parent or base capability of a DUC 109 is then able to use unknown capabilities at least to the extent known to it through its base capability or base capability equivalence class. For the above example, the HID 102 would therefore be able to perform a normal fast forward operation, but may not be able to perform the accelerated fast forward operations which are available as sub-capabilities within the capability equivalence class. In order to achieve a full adaptation between the HID 102 and the DUC 109, the user is still required to perform the above mentioned training and engagement process, but the full adaptation process could be delayed as the user considers appropriate.
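The fallback from an unknown sub-capability to its parent can be sketched as a walk up the hierarchical name. The dot-separated naming convention is an assumption chosen for illustration, matching the fast forward example above:

```python
def base_match(duc_capability, hid_known):
    """Return the nearest ancestor of duc_capability (including itself)
    that the HID knows, or None if no part of the hierarchy matches."""
    parts = duc_capability.split(".")
    while parts:
        name = ".".join(parts)
        if name in hid_known:
            return name
        parts.pop()  # strip the last segment and try the parent next
    return None

# An HID that only knows the base fast-forward capability can still map
# an unknown 16x sub-capability onto a plain fast forward operation:
hid_known = {"media.playback.fastforward"}
```

Here `base_match("media.playback.fastforward.x16", hid_known)` falls back to the plain fast forward capability, while a capability outside the known hierarchy yields no match and would trigger the user-facing adaptation process described above.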
In an example scenario for the above described HID architecture, a new TV receiver, i.e. a new DUC 109, equipped with the above cited features does not have a dedicated user interface, i.e. no new remote control with a multitude of buttons, each equipped with several functions. Instead, a user is able to control the device through a preferred HID 102 and through the use of a preferred set of commands, e.g. per voice command. The HID 102 knows these preferred commands and their associated features or capabilities because it has learned them from the user in the context of training sequences performed when enabling similar capabilities of other DUCs. New features of such DUCs have been included in the available set of capabilities when they became available. Through the use of equivalence classes, such capabilities are also available for the control of the new TV receiver.
In addition, when the user is on a business trip, he will be able to use the same commands to control the TV receiver in the hotel room, because the base station 105 in the hotel room can access the storage entity 110 and download the current HID software 104, thereby providing the appropriate HID controller 103 for the user's HID 102. Through the use of capability equivalence classes, features of the TV receiver in the hotel room will be mapped to the features known to the HID 102. On the other hand, features which are not supported by the hotel TV receiver will be dropped. The HID controller 103 and/or the DUC 109 may inform the user about non-available functionalities through a process as outlined above.
Another example is the control of an audio/video/room installation in an auditorium or in a video conference room, e.g. the input selection, the volume control, the light control and/or the control of the window blinds. There are currently vastly differing interfaces that make presenters struggle. The present invention allows a presenter to control such DUCs with the HID 102 and using the commands that he is used to.
Fig. 2 shows an embodiment of the communication between an HID 102, an HID controller 103 and a DUC 109. The flow chart 200 describes the communication of a user input from the HID to the DUC, whereas the flow chart 210 shows the provisioning of a DUC output to a user. As already highlighted above, both communication flows may be provided within the same HID, HID controller and DUC.
In step 201, the user performs a user input which is sensed by the HID. The HID does not provide any interpretation of such user input, but only performs the task of measuring such user input, e.g. by registering that a soft button has been pressed on the touch screen of a smart phone. In step 202, the HID transmits the user input to an HID controller which is physically separate from the HID and typically positioned a few meters away from the HID. The HID controller receives the user input in step 203 and identifies its intended meaning in step 204. In other words, the interpretation of a user input is performed by the HID controller. This requires the user input to be unambiguously related to an intended meaning. If this is not the case, the user may be prompted in an additional step to specify the user input. By way of example, the user may be prompted to specify whether the turning of a rotary button on the HID is directed at a volume increase of the TV receiver or at an increase of the brightness of the TV screen.
It is also contemplated that the HID may anticipate a user requirement and as a consequence it may anticipate a user input. By way of example, the HID may register increasing background noise, e.g. a car passing by, and as a consequence, it may trigger the volume increase of the TV receiver.
In step 205, the HID controller performs a mapping of the user input to a corresponding DUC command. This mapping is done based on the identified meaning of the user input and a user profile available at the HID controller. The DUC command is transmitted to the DUC, which is also physically separate from the HID controller. The DUC may be positioned in the vicinity of the HID and/or the HID controller, but it may also be positioned at very remote places. By way of example, a user may program his hard disk recorder at home while being at work. In such cases, the DUC may be far away from the HID controller, and the communication between the DUC and the HID controller may be performed via wide area networks, such as the Internet. In step 207, the DUC receives the DUC command and executes the command in step 208.
The flow chart 210 illustrates the reverse case of a communication from the DUC to the user. In step 211, the DUC generates a DUC output, e.g. a menu provided by a TV receiver. This DUC output is sent to the HID controller in step 212. The HID controller receives the DUC output in step 213 and determines the user output which corresponds to the DUC output in step 214. By way of example, the HID controller may know that the user prefers to receive menus from the DUCs as output on the touch screen of his smart phone. Consequently, the HID controller knows that the DUC output needs to be translated into a GUI adapted to the user's smart phone. The translation is performed in step 215, i.e. a GUI is built from the menu information received from the DUC. Such GUI may comprise soft buttons which allow the user to reversely send commands to the DUC, simply by pressing the soft button on the touch screen of the smart phone. In step 216, the user output, i.e. the GUI in the present example, is transmitted to the HID, which receives the user output in step 217. Eventually, in step 218, the user output is rendered to the user. It should be noted that the above mentioned user preferences may be situation dependent. The HID controller may be aware of certain usage scenarios, e.g. of a usage scenario when traveling and another usage scenario when being at home. The user input and the user output would then be interpreted/rendered depending on the situation.
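Both flow charts of Fig. 2 reduce to look-ups against a user profile held by the HID controller. The following is a minimal sketch; the profile entries, tuple encodings and command names are illustrative assumptions rather than a defined format:

```python
# Illustrative user profile: raw HID inputs map to DUC commands, and
# DUC output types map to the user's preferred rendering.
USER_PROFILE = {
    "input": {
        ("rotary", "clockwise"): "tv.volume.up",
        ("voice", "pause"): "tv.playback.pause",
    },
    "output": {
        "menu": "smartphone.touchscreen.gui",
        "text": "speaker.voice",
    },
}

def translate_input(user_input, profile=USER_PROFILE):
    """Steps 203-205: relate the sensed user input to a DUC command;
    an ambiguous input leads to a prompt instead of a command."""
    command = profile["input"].get(user_input)
    if command is None:
        return ("prompt_user", user_input)
    return ("send_to_duc", command)

def translate_output(duc_output_type, profile=USER_PROFILE):
    """Steps 213-215: choose the rendering the user prefers for this
    type of DUC output, with a fallback when none is configured."""
    return profile["output"].get(duc_output_type, "default.render")
```

Situation-dependent preferences, as mentioned above, could be modeled by selecting between several such profiles (e.g. one for home, one for traveling) before the look-up.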
Fig. 3 shows a flow chart of an embodiment of the registration and handover process between an HID 102 and an HID controller 103. In step 301, the HID is in search mode and tries to identify a base station and/or a standardized execution environment to which it can connect. If such a base station has been identified, then in step 302 it is determined whether the base station may act as an HID controller for the HID. By way of example, the standardized execution environment of the base station can act as an HID controller for the HID if the respective HID software can be downloaded from a central repository and be executed on the standardized execution environment. If this is not the case (step 311), then the HID goes back to step 301 in order to identify other possible base stations. If, on the other hand, the base station can act as a corresponding HID controller (step 312), then the method proceeds to step 303, where the HID controller is installed on the base station and where the HID registers with the HID controller. In step 304, the HID communicates with DUCs via the HID controller, as outlined in relation to Fig. 2.
The HID regularly verifies its connection to the current base station and scans for other available base stations. In step 305, a second base station is identified. It is determined whether this base station can implement the functionality of a second HID controller and whether the latency to such second HID controller would be better than the latency to the current HID controller. If the second base station cannot take on the function of a second HID controller, or if the current latency is sufficient (proceed to verification step 308), then the HID remains registered with the current HID controller. If, on the other hand, it is determined that the second HID controller would improve latency (step 315), then a handover is performed to the second HID controller in step 307, which includes the steps of installing the HID controller on the second base station, transferring context from the HID controller on the first base station to the HID controller on the second base station, switching communication between the HID and the HID software from the first to the second base station, and unloading the HID software from the execution environment of the first base station after some time. Once the handover is performed, the HID registers with the second HID controller in step 304.
If it is determined in step 308 that the connection to the current HID controller becomes too weak (step 314), the HID is disconnected from the HID controller, the HID controller is saved back to the central repository in step 306, and the HID returns to the search mode of step 301. Otherwise, the HID remains connected to the current base station (step 313).
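The decisions of Fig. 3 can be condensed into a small decision function. The return values and the idea of comparing raw latencies are simplifying assumptions; a real implementation would also weigh signal quality, load and hysteresis:

```python
def next_action(signal_ok, candidate_can_host,
                current_latency_ms, candidate_latency_ms):
    """Condensed decision logic for steps 305-308: return to search mode
    when the connection is lost, hand over when a hosting-capable base
    station with lower latency appears, otherwise stay registered."""
    if not signal_ok:
        # steps 314/306: save the HID controller back to the central
        # repository and return to search mode (step 301)
        return "save_and_search"
    if candidate_can_host and candidate_latency_ms < current_latency_ms:
        return "handover"        # steps 315/307
    return "stay"                # steps 313/308
```

Called on every scan cycle, this yields exactly the three outcomes of the flow chart: continue with the current HID controller, hand over to a better one, or fall back to the search mode.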
As outlined above the present invention comprises various aspects, in particular
• a physical split of the human interface device from the device under control. The adaptation of the man-machine interface to the user preferences by persistent personalization makes intelligence, and notably artificial intelligence, in the DUC unnecessary. Such intelligence is provided by the HID controller and the HID software.
• an adapting and learning behavior of the man-machine interface which allows for a "natural", human oriented man-machine communication.
• a standardized communication between the HID, i.e. its HID controller and/or HID software 104, and the controlled device through the use of a middleware. This makes the HID reusable for many DUCs and thus avoids the time-consuming adaptation through the personalization process, notably the training phase.
• a standardized wireless interface access which connects the HID to a communication network that also connects the DUC.
• a standardized mobile execution environment for the use of HID controllers. By running HID software on a standardized mobile execution environment, the computational limits of the HID are compensated through outsourcing of the computationally complex and energy consuming tasks from the HID into the execution environment and/or into the network. Furthermore, such a standardized execution environment allows the execution environment comprising the HID controller and the respective HID software to follow the HID as the user moves around, in order to keep communication latency low and interface reactivity high.
• an unobtrusive wearable apparatus to be used as a human interface device, which allows the control of DUCs by e.g. gestures, gaze, speech or thought.
• Such control can involve awareness of context and/or situation.
• the use of femto cells as wireless interface access points.
• the use of a radio access network, i.e. femto base stations with access to a storage entity and mobility support, for the provision of a mobile execution environment.
• the possibility to select only a subset of the device features to be controlled by the user's HID, which reduces the complexity and increases the usability of the device.
• a capability registry which allows control of similar devices without requiring a re-learning stage.
• the use of an inheritance model to hierarchically organize capabilities. This allows the partial control of successor generation models of DUCs without re-learning existing capabilities to the extent of the predecessor model and/or the base of inheritance.
• the reduction of expensive development of user interfaces, which is replaced by simple default assignments to capability equivalence classes, which are to be included in the device's capability descriptions.
In summary, an architecture for a man-machine interface has been described which allows the provision of small, possibly wearable and unobtrusive, human interface devices. Such a reduction in size is achieved by limiting the human interface device to the sensing of an input performed by the user and/or the rendering of an output to the user. The actual interpretation of the user input and/or the designing of the output according to user preferences are performed in a separate HID controller, typically comprising HID software. The HID and the HID controller communicate via wireless access technologies. Due to the fact that the HID controller is not carried around by the user, it may comprise significant computing capacities in order to implement a ubiquitous man-machine interface which disassociates the user input/output from the commands sent to and received from the DUC. The HID software corresponding to a user's HID may be centrally stored in a storage entity. Furthermore, the HID software may be provided in a handover process to other wireless access points, in order to provide mobility of the user and his HID. Using the man-machine interface architecture outlined in the present document, the shortcomings of the prior art referred to in the introductory section may be overcome. In particular, the logical separation of the HID and its commands and outputs from the DUC and its commands and outputs allows for a man-machine interface which is mobile and adaptable to the user preferences.

Claims

1. A system (100) providing ubiquitous interfacing between a user (101) and a device under control, DUC (109), the system (100) comprising:
- a human interface device, HID (102), operable to
-detect a user input from the user (101); and
-transmit the user input to a HID controller (103);
- the HID controller (103), physically separate from the HID (102) and the DUC (109) and operable to
-receive the user input from the HID (102);
-relate and translate the user input to a corresponding DUC command; and
-transmit the DUC command to the DUC (109) for execution.
2. A system (100) providing ubiquitous interfacing between a device under control, DUC, (109) and a user, the system (100) comprising:
- a human interface device, HID, controller (103), physically separate from an HID (102) and the DUC (109) and operable to
-receive a DUC output from the DUC;
-determine a user output corresponding to the DUC output;
-translate the DUC output into the corresponding user output; and
- the HID (102), operable to
-receive the user output from the HID controller (103); and
-render the user output to the user (101).
3. The system (100) of claim 1 or 2, further comprising the DUC (109), operable to
- receive the DUC command from the HID controller (103) and to execute the DUC command; and/or
- generate the DUC output and to transmit the DUC output to the HID controller (103).
4. The system (100) of claim 1 or 2, further comprising a storage entity (110), operable to store a user profile comprising
- a mapping between the user input and the corresponding DUC command; and/or
- a mapping between the DUC output and the corresponding user output.
5. The system (100) of claim 4, wherein the HID controller (103) is further operable to
- download the user profile from the storage entity (110); and
- perform the translation based on the user profile.
6. The system (100) of claim 1 or 2, comprising a plurality of HID controllers (103, 106) and wherein the HID (102) is operable to register with a first of the plurality of HID controllers (103) in order to transmit the user input or receive the user output.
7. The system (100) of claim 6, wherein the HID (102) registers with the first HID controller (103) which
- provides the best reactivity of the HID to the user; and/or
- is closest to the HID (102); and/or
- provides the lowest latency for communication between the HID (102) and the first HID controller (103); and/or
- has the highest available computing capacity.
8. The system (100) of claim 6, wherein as the HID (102) moves from one location to another, the HID (102) is operable to register with a second of the plurality of HID controllers (106); and the first and second HID controllers (103, 106) are operable to perform a handover of the control of the HID (102).
9. The system (100) of claim 1 or 2, wherein the HID (102) comprises at least one of
- a wearable unobtrusive wireless interface;
- a microphone;
- a speaker;
- a display;
- a keyboard;
- a mouse;
- a gesture recognizing wrist band;
- a collar detecting sub-vocalization;
- a retina projector;
- an in-ear speaker;
- a gaze detecting barrette;
- a body area network comprising a common relay device for communication with the HID controller;
- a Graphical User Interface executed on an electronic device;
- a smart phone;
- a brain computer interface.
10. The system (100) of claim 1 or 2, wherein
- the HID controller (103) and the DUC (109) are connected via a communication infrastructure (112); and
- the HID (102) and the HID controller (103) are connected via a wireless access infrastructure (105).
11. The system (100) of claim 1 or 2, wherein
- the DUC (109) is operable to execute a plurality of DUC commands;
- the DUC (109) is operable to generate a plurality of DUC outputs;
- the plurality of DUC commands and DUC outputs define a set of capabilities of the DUC (109);
- the HID (102) is operable to detect a plurality of different user inputs;
- the HID (102) is operable to render a plurality of different user outputs;
- the plurality of user inputs and user outputs define a set of capabilities of the HID (102); and
- the HID controller (103) is operable to provide a mapping between a subset of capabilities of the HID (102) and a subset of capabilities of the DUC (109).
12. The system (100) of claim 11, wherein
- the set of capabilities of the DUC (109) and/or the set of capabilities of the HID (102) are organized using a hierarchical naming scheme, where similar capabilities are grouped in equivalence classes; and
- the equivalence classes use the concept of inheritance, comprising a parent capability and at least one child capability associated with the parent capability.
13. The system (100) of claim 1 or 2, wherein the HID controller (103) comprises a standardized execution environment and a HID software (104).
14. The system (100) of claim 1 or 2, wherein the
- translation between the user input and the DUC (109) command; and/or
- translation between the DUC (109) output and the user output;
is context dependent.
15. A method for controlling a device under control, DUC, (109) by a user (101), comprising
- detecting a user input at a human interface device, HID (102);
- transmitting the user input to a HID controller (103) which is physically separate from the HID (102) and the DUC (109);
- receiving the user input at the HID controller (103);
- relating and translating the user input to a corresponding DUC command; and
- transmitting the DUC command for execution to the DUC (109).
16. A method for providing output from a device under control, DUC, (109) to a user (101), comprising:
- receiving a DUC output generated by the DUC at a human interface device, HID, controller (103);
- determining a user output corresponding to the DUC output;
- translating the DUC output into the corresponding user output;
- transmitting the user output to a HID (102) which is physically separate from the HID controller (103);
- receiving the user output at the HID (102); and
- rendering the user output at the HID (102) to the user (101).
17. A human interface device, HID, controller (103) providing ubiquitous interfacing between an HID (102) and a device under control, DUC (109), the HID controller (103) being physically separate from the HID (102) and the DUC (109) and being operable to
- receive from the HID (102) a user input, detected by the HID (102);
- relate and translate the user input to a corresponding DUC command; and
- transmit the DUC command to the DUC (109) for execution; and/or
- receive a DUC output from the DUC;
- determine a user output corresponding to the DUC output;
- translate the DUC output into the corresponding user output; and
- send the user output to the HID (102) for rendering to a user (101).
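The hierarchical capability naming of claims 11 and 12 can be sketched with path-like capability names, where a parent capability subsumes its children and the controller maps a subset of DUC capabilities onto matching HID capabilities. The naming scheme and function names below are hypothetical illustrations, not a scheme prescribed by the claims.

```python
# Hypothetical sketch of claims 11-12: capabilities as hierarchical names,
# equivalence classes via parent/child (inheritance) relationships, and a
# controller-side mapping between DUC and HID capability subsets.

def is_child_of(capability, parent):
    """True if `capability` equals `parent` or inherits from it."""
    return capability == parent or capability.startswith(parent + "/")

def match_capabilities(hid_caps, duc_caps):
    """Map each DUC capability to an HID capability in the same
    equivalence class (one is an ancestor of the other)."""
    mapping = {}
    for duc_cap in duc_caps:
        for hid_cap in hid_caps:
            if is_child_of(hid_cap, duc_cap) or is_child_of(duc_cap, hid_cap):
                mapping[duc_cap] = hid_cap
                break
    return mapping

hid_caps = ["input/gesture/wrist", "output/audio/in-ear"]
duc_caps = ["output/audio", "output/video"]
print(match_capabilities(hid_caps, duc_caps))
# "output/video" stays unmapped: the HID offers no capability in that class
```

Grouping similar capabilities this way lets a DUC requesting generic "output/audio" be served by any audio-rendering HID, which is the point of the equivalence classes in claim 12.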
PCT/IB2009/006047 2009-05-15 2009-05-15 Ubiquitous interface device split WO2010131067A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2009/006047 WO2010131067A1 (en) 2009-05-15 2009-05-15 Ubiquitous interface device split

Publications (1)

Publication Number Publication Date
WO2010131067A1 (en) 2010-11-18

Family

ID=41503625

Country Status (1)

Country Link
WO (1) WO2010131067A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005017712A2 (en) * 2003-08-15 2005-02-24 The Board Of Trustees Of The Leland Stanford Junior University Intelligent total access system
US6983418B1 (en) * 1995-03-24 2006-01-03 The Board Of Trustees Of The Leland Stanford Junior Varsity Devices and methods for interfacing human users with electronic devices
US20080010482A1 (en) * 2006-06-13 2008-01-10 Microsoft Corporation Remote control of a media computing device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102111315A (en) * 2010-12-20 2011-06-29 大唐移动通信设备有限公司 Method, system and equipment for fusing Internet of Things and telecommunication network
WO2012083825A1 (en) * 2010-12-20 2012-06-28 大唐移动通信设备有限公司 Method, system and device for integrating internet of things with telecommunication network
WO2016039653A1 (en) 2014-09-09 2016-03-17 Software 4E Spółka Z Ograniczoną Odpowiedzialnością System for controlling an electronic device
WO2018172197A1 (en) * 2017-03-20 2018-09-27 Audi Ag Method and device for controlling a terminal
CN109032332A (en) * 2018-06-21 2018-12-18 深圳市满心科技有限公司 Man-machine interaction method, system, storage medium and the intelligent terminal of intelligent terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09785967

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09785967

Country of ref document: EP

Kind code of ref document: A1