WO2017001931A1 - User input processing for controlling remote devices
- Publication number
- WO2017001931A1 (PCT/IB2016/001070)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- alternative
- user input
- device identifier
- user
- electronic device
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/20—Binding and programming of remote control devices
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/31—Voice input
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
User input processing for controlling remote devices. The invention relates to a method for managing a remote electronic device using a user interface on an electronic device, the electronic device being operatively connected to a database associating, for each given device identifier, alternative argument values and graphical elements, the method comprising: receiving (200) a user input through the user interface on the electronic device; determining (201) a first device identifier based on the user input; accessing (203) the database to retrieve (204) a first graphical element associated with a first alternative argument value of the first device identifier, the first alternative argument value matching the user input; rendering (205) the user input by outputting the retrieved first graphical element.
Description
User input processing for controlling remote devices
The present invention generally relates to the control of remote electronic devices, such as connected objects for example.
It finds applications in electronic devices such as laptops, touchpads, mobile phones, Smartphones, wearable devices or smart watches.
In what follows, we consider, for illustrative purposes only, the electronic devices to be mobile phones and the remote electronic devices to be connected objects.
Connected objects can be remotely controlled by mobile phones, for example by means of an application installed in the mobile phones providing a user interface.
It is meant by "connected object" any object able to perform a given function, according to optional parameters, and comprising a network interface to access an access point, thereby accessing a remote server (through a network such as the internet for example or a local Bluetooth network), or to receive commands from a mobile phone (or any electronic device).
For example, a connected coffee machine can perform basic functions such as "switch on" and "switch off". However, "switch on" can encompass several functions such as "prepare a latte coffee", "prepare an espresso", "prepare a cappuccino". In addition, each device and function can also be associated with parameters. For example, the parameters can be a volume of the coffee to be prepared. The parameters may therefore comprise "2 centilitres", "5 centilitres", etc.
Electronic devices such as mobile phones that are used to remotely control connected objects now provide several ways to receive a user input and to turn it into an instruction for a remote connected object: touch display, keypad, microphone coupled with a speech-to-text application, etc.
The user therefore needs to indicate information that enables the mobile phone to generate an instruction for the remote connected object.
Such information can be communicated using complementary arguments of different types:
- a device identifier to identify the connected object to be controlled or the group of connected objects to be controlled;
- a function identifier to identify the function to be performed by the identified connected object;
- optionally, a parameter identifier identifying a parameter of the identified function to be performed by the identified connected object. To indicate this information, argument values can be included in a user input such as the sentence "could you switch-on the bedroom light in blue". This sentence comprises three arguments of different types: a device identifier having the value "bedroom light", a function identifier having the value "switch-on" and a parameter identifier having the value "blue".
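As a rough illustration of how such a sentence can be split into typed argument values, the following Python sketch matches the input against small, purely illustrative vocabularies. The parsing technique and the value lists are assumptions made for illustration; the patent does not specify them.

```python
# A purely illustrative parser: the argument vocabularies below are assumptions,
# not values defined by the patent.

DEVICE_VALUES = {"bedroom light", "kitchen light", "car light", "coffee machine"}
FUNCTION_VALUES = {"switch-on", "switch-off", "switch on", "switch off"}
PARAMETER_VALUES = {"blue", "green", "white", "2 centilitres", "5 centilitres"}

def extract_arguments(user_input):
    """Return the device, function and parameter values found in the input."""
    text = user_input.lower()
    found = {"device": None, "function": None, "parameter": None}
    for value in DEVICE_VALUES:
        if value in text:
            found["device"] = value
    for value in FUNCTION_VALUES:
        if value in text:
            found["function"] = value
    for value in PARAMETER_VALUES:
        if value in text:
            found["parameter"] = value
    return found

print(extract_arguments("Could you switch-on the bedroom light in blue"))
# {'device': 'bedroom light', 'function': 'switch-on', 'parameter': 'blue'}
```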
However, as the size of these electronic devices is a growing concern, their screens tend to be smaller and smaller. In particular, areas intended to receive user inputs are particularly small. Moreover, some specific users (children, elders, etc.) can have trouble inputting data in these small and condensed input areas.
To solve these problems, methods have been proposed where graphical elements are used instead of words (characters) to render user inputs. Using these graphical elements provides an optimized interface, as it makes it possible to display more information on a smaller screen. Moreover, children or elders find it easier to distinguish what is input on the screen, even on small screens.
To replace user input with graphical elements, words are usually extracted from a user input and compared with large databases making one-by-one correspondences between words and graphical elements. As many words are present in these databases, the user input must be accurate so that the correct word is extracted and the correct graphical element is retrieved from the databases.
However, input errors are frequent, especially when it comes to small screens and children or elders. This is even more problematic in case of a user input comprising a speech-to-text conversion, which is not always reliable. These problems are particularly important when user inputs are used to remotely control
connected objects, as users are generally in a situation where they cannot be focused on the electronic device remotely controlling the connected object. Moreover, finding a given word in a large database requires a large amount of processing resources.
The document US 2011/0313775 discloses a computer-implemented method for information sharing between a portable computing device and a television system.
The document US 2012/0030712 discloses a method and system for network-integrated remote control that include voice activation of a user interface context on a remote control device.
To address this need, a first aspect of the invention concerns a method for managing a remote electronic device using a user interface on an electronic device, the electronic device being operatively connected to a database associating, for each given device identifier, alternative argument values and graphical elements, the method comprising: receiving a user input through the user interface on the electronic device;
determining a first device identifier based on the user input; accessing the database to retrieve a first graphical element associated with a first alternative argument value of the first device identifier, the first alternative argument value matching the user input; rendering the user input by outputting the retrieved first graphical element.
Therefore, the database from which graphical elements are retrieved stores associations between graphical elements and alternative argument values, for respective device identifiers, thereby reducing the processing resources that are required to retrieve a graphical element. Indeed, a preliminary "filtering" step can be performed by determining the first device identifier.
According to some embodiments of the invention, upon reception of the user input, the method may further comprise detecting an initial argument value in the user input, and the first alternative argument value may match the detected initial argument value.
Alternatively, contextual data of the user input can be used to determine the first alternative value. To this end, a current state of the remote electronic device identified by the first device identifier can be taken into account. For example, if the remote electronic device is a light, whose only functions are switch on and switch off, and if the light is currently off, it can be deduced from a user input such as "light please" that the function value (argument value) to be selected as a first alternative argument value is "switch on".
However, the accuracy of the selection of the first alternative argument value is improved when the user input comprises the initial argument value.
In some embodiments, the method may further comprise receiving contextual data with the user input, the alternative argument values of the first device identifier can be filtered using the contextual data, and the first alternative argument value can be chosen among the filtered alternative argument values.
This improves the robustness of the remote control as it is ensured that the alternative argument value respects some contextual conditions.
According to some embodiments, the method may further comprise receiving contextual data with the user input, and the first device identifier can be determined using the contextual data.
This improves the accuracy and robustness of the determination of the first device identifier. The remote control is therefore more accurate because it is ensured that the alternative argument values correspond to the right device identifier (the one intended by the user).
Alternatively or in complement, the contextual data may comprise at least one among:
- location of the user;
- weather forecast data;
- temperature data;
- clock data;
- presence data;
- a user profile; or
- user rights to access and control remote electronic devices.
According to some embodiments, the method may further comprise retrieving the other graphical elements that are associated with the alternative argument values of the first device identifier, other than the first alternative argument value, the first graphical element can be a selectable element, the selection by the user of the selectable element outputting at least some of the retrieved graphical elements, and upon selection of a second graphical element, the first graphical element can be replaced by the second graphical element.
Modification of the user input is thereby simplified and accelerated, as the user does not have to input a new argument by typing characters or acquiring a voice input.
According to some embodiments, the method may further comprise sending an instruction to a remote electronic device identified by the first device identifier, the instruction comprising the argument value associated with the outputted graphical element.
In complement, the instruction can be sent to the remote electronic device upon validation by the user.
This avoids error manipulations by the user.
According to some embodiments of the invention, the alternative argument values of a given device identifier comprise alternative function values, each alternative function value identifying a function to be performed by a device identified by the first device identifier, and the first alternative argument value is a function value.
This ensures that the instructions sent to the remote electronic device can be executed and also limits the complexity associated with the search for the first alternative argument value.
In complement, the alternative argument values of a given device identifier further comprise alternative parameter values identifying parameters of functions performed by the device identified by the first device identifier, and a second graphical element is retrieved from the database, the second graphical element being associated with a second alternative argument value of the first device identifier, the second alternative argument value matching the user input
and being an alternative parameter value. As a variant, instead of matching the user input, the second alternative argument value may be compatible with contextual data in a compatibility mesh.
This makes it possible to enrich the user input, as parameters of functions can be further specified.
In some embodiments, the method further comprises displaying the graphical element on the remote electronic device.
This makes it possible to verify that the instruction sent corresponds to the user input.
A second aspect of the invention concerns a computer program product recorded on a storage medium and executable by a computer in the form of software including at least one software module set up to implement the method according to the first aspect of the invention.
A third aspect of the invention concerns an electronic device for managing a remote electronic device using a user interface on an electronic device, the electronic device being operatively connected to a database associating, for each given device identifier, alternative argument values and graphical elements, the electronic device comprising:
a user interface arranged for receiving a user input;
a processor arranged for performing the following steps:
determining a first device identifier based on the user input; accessing the database to retrieve a first graphical element associated with a first alternative argument value of the first device identifier, the first alternative argument value matching the user input ;
rendering the user input by outputting the retrieved first graphical element.
A fourth aspect of the invention concerns a system comprising the electronic device according to the third aspect of the invention and a database associating, for each given device identifier, alternative argument values and graphical elements.
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements and in which:
- Figure 1 represents a system according to some embodiments of the invention;
- Figure 2 is a flowchart illustrating the steps of a method according to some embodiments of the invention;
- Figures 3a to 3d illustrate a graphical user interface of a user terminal according to some embodiments of the invention;
- Figure 4 illustrates a user terminal according to some embodiments of the invention.
Figure 1 illustrates a system according to some embodiments of the invention.
The system comprises an electronic device 1 such as a user terminal. As illustrated on Figure 1, the electronic device 1 can be a mobile phone such as a Smartphone. The invention is not limited to this illustrative example, and also encompasses touch pads, laptops, desktop computers, etc.
The electronic device 1 can be used by the user to control remote electronic devices such as connected objects. In what follows, "remote electronic devices" are called "connected objects", for illustrative purposes.
The user of the electronic device 1 (also called "user terminal 1" hereafter) may have several connected objects located in different places:
- a connected coffee machine 10.1 located at home;
- a bedroom light 10.2 located at home;
- a kitchen light 10.3 located at home;
- a car light 10.4 located in a car.
Connected objects 10.1-10.3 are configured to access a first access point 11.1 and connected object 10.4 is configured to access a second access point 11.2. No restriction is attached to the access technology used by the
connected objects: wired or wireless means can be envisaged. In what follows, we consider the first and second access points as being Wi-fi access points.
Through the access points 11.1 and 11.2, the connected objects can access a network 12, such as the internet, and communicate with a remote service platform 14, for example to upload events or to receive instructions. Alternatively, the network 12 can be a local Bluetooth network and the remote service platform 14 is replaced by a local management entity. Several respective service platforms can also be envisaged for the different connected objects 10.1 to 10.4, and a single service platform 14 has been represented for illustrative purposes only.
The service platform 14 can also be accessed by the user terminal 1. For example, the user terminal can connect to the access point 11.1 when the user is at home. Alternatively, the user may use a 3G/4G mobile network to access the service platform 14 with the user terminal 1.
An application dedicated to the service platform can be installed on the user terminal 1. Through this application, the user may enter user inputs that are converted to instructions. The instructions can then be used to control the connected objects 10.1-10.4. To this end, the instructions may be forwarded to the service platform 14, which then forwards (and optionally modifies or enriches) the instructions to the targeted connected objects 10.1-10.4.
The user terminal 1 is also operatively connected to a database 13. On figure 1, the database 13 can be accessed through the network 12. Alternatively, the database 13 can be stored in the user terminal 1 or can be integrated in the service platform 14.
The system illustrated on figure 1 is only provided for illustrative purposes. For example, the network used to control the connected objects 10.1-10.4 can be a local access network, such as a Bluetooth network, and, in that case, there is no need to access a remote service platform.
As detailed hereafter, the database 13 stores, for each device identifier among a plurality of device identifiers, alternative argument values and graphical elements. Therefore, for each connected object 10.1 -10.4, identified by a respective device identifier, alternative argument values correspond to graphical
elements. Alternative argument values can comprise alternative function values and alternative parameter values. For a given device identified by a first device identifier, each alternative function value identifies a function to be performed (and that can be performed) by the given device, and an alternative parameter value identifies parameters of functions to be performed by the given device.
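To make this association concrete, here is a minimal in-memory stand-in for database 13, written as a Python sketch; the schema and the icon names are assumptions made for illustration, not details taken from the patent.

```python
# Illustrative in-memory stand-in for database 13: for each device identifier,
# alternative function/parameter values are associated with graphical elements.
# The icon file names below are assumptions, not details from the patent.

DATABASE = {
    "bedroom light": {
        "functions": {
            "switch off": "icon_light_off.png",
            "switch on higher intensity": "icon_light_bright.png",
            "switch on lower intensity": "icon_light_dim.png",
        },
        "parameters": {
            "blue": "icon_blue.png",
            "green": "icon_green.png",
            "white": "icon_white.png",
        },
    },
    "coffee machine": {
        "functions": {
            "prepare an espresso": "icon_espresso.png",
            "prepare a latte coffee": "icon_latte.png",
        },
        "parameters": {"2 centilitres": "icon_2cl.png", "5 centilitres": "icon_5cl.png"},
    },
}

def retrieve_graphical_element(device_id, argument_value):
    """Look up the icon associated with an argument value of one device only."""
    entry = DATABASE.get(device_id, {})
    return (entry.get("functions", {}).get(argument_value)
            or entry.get("parameters", {}).get(argument_value))

print(retrieve_graphical_element("bedroom light", "blue"))  # icon_blue.png
```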
The database 13 may also store a compatibility mesh between the alternative argument values and contextual data. The alternative argument values can therefore be filtered by contextual data using the compatibility mesh.
It is meant by "compatibility mesh" any data structure that links together elements that are compatible with one another. Based on a given element, it is possible to retrieve all the other elements that are linked to this given element in the compatibility mesh.
No restriction is attached to the format of the compatibility mesh. The compatibility mesh will be better understood in the description hereafter.
Such a compatibility mesh can be configured by the user, for example using the user terminal 1 or any device that is adapted to access the database 13.
Alternatively, the compatibility mesh can be configurable by another person than the user (for example a parent in case the user is a child, or any person being entitled to manage the compatibility mesh of the user). The compatibility mesh may also be configurable by a manufacturer of some of the connected objects, or by the service platform 14.
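One possible way to picture such a mesh is a symmetric link structure between argument values and contextual data, as in the Python sketch below. Both the representation and the particular links declared (e.g. "switch off" being compatible with "sunny") are assumptions, not a format prescribed by the patent.

```python
# Sketch of a compatibility mesh as a symmetric link structure between
# argument values and contextual data. The links declared below are illustrative.

from collections import defaultdict

class CompatibilityMesh:
    def __init__(self):
        self._links = defaultdict(set)

    def link(self, a, b):
        """Declare two elements as compatible (links are symmetric)."""
        self._links[a].add(b)
        self._links[b].add(a)

    def compatible_with(self, element):
        """Retrieve all elements linked to the given element."""
        return self._links[element]

    def filter_values(self, argument_values, context):
        """Keep only the argument values that are compatible with the context."""
        return [v for v in argument_values if context in self._links[v]]

mesh = CompatibilityMesh()
mesh.link("switch off", "sunny")
mesh.link("switch on higher intensity", "cloudy")

print(mesh.filter_values(["switch off", "switch on higher intensity"], "sunny"))
# ['switch off']
```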
Figure 2 is a diagram illustrating the steps of a method for managing a remote electronic device according to some embodiments of the invention.
At step 200, a user input is received through a user interface of the user terminal 1. No restriction is attached to the first user input, which can be a pronounced sentence acquired by a microphone and converted to text, or a character string input using a touch pad or a keyboard. At step 200, the user terminal 1 may also receive contextual data. No restriction is attached to what is meant by contextual data. The contextual data may comprise at least one element among:
- location of the user acquired by a sensor, acquired for example through a GPS of the user terminal, or acquired using the location of the access point to which the user terminal 1 accesses the network 12;
- weather forecast data acquired via an application installed on the user terminal 1 for example; or
- temperature data acquired by a sensor of the user terminal 1 , or acquired via the network 12; or
- clock data indicating the time at which the user input is received. Clock data can be retrieved from a clock application of the user terminal 1 for example; or
- other contextual data acquired by a sensor, retrieved from an external entity or input by the user in the user input.
Based on the user input, a first device identifier is determined at step 201. For example, the user input may comprise an argument value of the device identifier type, such as "coffee machine", "light", "bedroom light", "car light", etc. The connected object corresponding to the argument value is retrieved by the user terminal 1. In case the argument value does not correspond to any of the connected objects 10.1-10.4 of the user, an error message can be displayed on the user terminal 1 or a list of device identifiers that are compatible with contextual data might be displayed for selection by the user.
In addition, the contextual data may be used to determine the first device identifier. For example, if the argument value in the user input is "light", and the user owns several lights ("car light", "bedroom light", "kitchen light"), contextual data indicating that the user is in the car can be used to deduce the device identifier "car light".
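A minimal Python sketch of this disambiguation step follows; the device list and the location hint table are illustrative assumptions, not values defined by the patent.

```python
# Illustrative disambiguation of the device identifier using contextual data:
# when the input value ("light") matches several owned devices, a location
# hint narrows the choice. The mapping below is an assumption.

OWNED_DEVICES = ["coffee machine", "bedroom light", "kitchen light", "car light"]
LOCATION_HINTS = {"car": "car light", "kitchen": "kitchen light", "bedroom": "bedroom light"}

def determine_device_identifier(input_value, location=None):
    candidates = [d for d in OWNED_DEVICES if input_value in d]
    if len(candidates) == 1:
        return candidates[0]                      # unambiguous match
    if location and LOCATION_HINTS.get(location) in candidates:
        return LOCATION_HINTS[location]           # resolved by contextual data
    return candidates                             # still ambiguous: offer a list to the user

print(determine_device_identifier("light", location="car"))  # 'car light'
print(determine_device_identifier("coffee machine"))         # 'coffee machine'
```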
At step 202, the method may also comprise detecting an initial argument value in the user input (other than an argument value of the device identifier type). For example, the initial argument value may be a parameter value or a function value, as will be better understood when referring to Figures 3a to 3d.
At step 203, the user terminal 1 accesses the database 13, and retrieves, at step 204, a first graphical element associated with a first alternative argument
value of the first device identifier, the first alternative argument value matching the user input.
For example, if the initial argument value has been detected at step 202, the first alternative argument value matches the initial argument value. If the initial argument value is the function value "switch on", then the first alternative argument value is also the function value "switch on". However, even when no initial argument value has been detected at step 202, the first alternative argument value matching the user input can be determined. For example, if the bedroom light is currently off, it can be deduced from the user input "bedroom light please" that the user wants to switch on the bedroom light, and the first alternative argument value is therefore the function value "switch on".
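The selection logic of this step can be sketched as follows; the two-function light and the state-based deduction rule are assumptions chosen to mirror the example above, not the patent's prescribed algorithm.

```python
# Sketch of choosing the first alternative argument value: prefer an explicit
# value detected in the user input; otherwise try to deduce it from the current
# state of the remote device (assumption: a simple two-function light).

def first_alternative_value(initial_value, alternative_values, device_is_on=None):
    # Case 1: the user input already contains a matching argument value.
    if initial_value in alternative_values:
        return initial_value
    # Case 2: deduce the function value from the device's current state.
    if device_is_on is True and "switch off" in alternative_values:
        return "switch off"
    if device_is_on is False and "switch on" in alternative_values:
        return "switch on"
    return None  # nothing can be deduced; the user must be asked

print(first_alternative_value("switch on", ["switch on", "switch off"]))
# switch on
print(first_alternative_value(None, ["switch on", "switch off"], device_is_on=False))
# switch on
```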
In addition, if contextual data has been received by the user terminal 1 , the alternative argument values of the first device identifier can be filtered using the contextual data and the compatibility mesh, and the first graphical element associated with the first alternative argument value is then retrieved from the filtered alternative argument values.
For example, the contextual data can be weather forecast data. When the weather forecast data indicates that the weather is sunny, the function values of the kitchen light can be filtered. For example, only the function "switch off" can be compatible with the contextual data "sunny" in the compatibility mesh, whereas "switch on" is not compatible with it. Therefore, only the graphical element associated with the function "switch off" can be retrieved in the database 13 for the device identifier "kitchen light".
At step 205, the user input is rendered by outputting the retrieved first graphical element on the user terminal 1.
Therefore, the method makes it possible to perform the search for an argument value in a subpart of the database 13: indeed, only the argument values of the device identified by the first device identifier are taken into account as alternatives when determining the first alternative argument value. The required processing resources are therefore reduced compared to the solutions of the prior art. In addition, as the retrieved first graphical element is chosen among alternative argument values of a determined device identifier, erroneous user inputs can be detected and avoided, because only the parameter and function values that can be performed by the identified device can be selected as the first alternative argument value.
According to an embodiment of the invention, at step 204, all the other graphical elements that are associated with the alternative argument values (optionally filtered by the contextual data and the compatibility mesh) other than the first alternative argument value can also be retrieved from the database. In addition, at step 205, the outputted first graphical element can be a selectable element.
In that case, at an optional step 206, upon selection of the first graphical element, all the retrieved graphical elements are outputted, so that the user can select one of the outputted graphical elements. Upon selection of a second graphical element by the user, the first graphical element is replaced by the second graphical element on the display of the user terminal 1.
At step 207, the user may validate the user input rendered using the first graphical element (or the second graphical element replacing the first graphical element). This validation step by the user is optional. For example, the instruction can be automatically validated after expiration of a timer, or can be directly validated after replacement of the first graphical element by the second graphical element.
At step 208, optionally after validation by the user at step 207, an instruction is sent to the connected object identified by the first device identifier, the instruction comprising the argument value associated with the outputted graphical element (first or second graphical element). For example, the instruction can be sent to the service platform 14 with the first device identifier, and the service platform 14 then forwards (and optionally modifies or enriches) the instruction to the identified connected objects.
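The instruction itself could be as simple as a small structured payload carrying the device identifier and the argument value(s) behind the rendered graphical element(s). The payload shape and the platform URL in the Python sketch below are assumptions for illustration only; the patent does not define a wire format.

```python
# Sketch of the instruction sent at step 208: a small payload carrying the
# device identifier and the argument values behind the rendered icons.
# The payload shape and the platform URL are assumptions, not a defined format.

import json

def build_instruction(device_id, function_value, parameter_value=None):
    instruction = {"device": device_id, "function": function_value}
    if parameter_value is not None:
        instruction["parameter"] = parameter_value
    return instruction

def send_instruction(instruction, platform_url="https://example.invalid/instructions"):
    payload = json.dumps(instruction)
    # A real client would perform an HTTP POST to the service platform (or a
    # local Bluetooth write); here we only show the serialized payload.
    print(f"POST {platform_url}\n{payload}")

send_instruction(build_instruction("bedroom light", "switch off", "blue"))
```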
Figures 3a to 3d illustrate a user interface on the user terminal 1 , according to some embodiments of the invention.
As illustrated on figure 3a, the first user input can be displayed on the touch screen as a graphical instruction 300, while the user is typing it (or while the
user is speaking into a microphone). In the specific example shown on figures 3a to 3d, the first input "Could you switch off the bedroom light in blue?" is received by the user terminal 1, at step 200.
After the acquisition of the first user input, argument types can be detected at step 202 as shown on Figure 3b. In this example, three argument types having respective initial argument values are detected in the first input:
- a function identifier 301.1 having the initial function value "switch-off";
- a device identifier 301.2 having the initial device value "bedroom light";
- a parameter identifier 301.3 having the initial parameter value "blue". The device identifier 301.2 is used to determine the first device identifier at step 201. The first device identifier is therefore "bedroom light", identifying the connected object 10.2.
As explained, the present invention proposes to retrieve graphical elements for the argument values detected in the user input 300.
To this end, the database 13 can be accessed by the user terminal 1 .
When accessing the database 13, the user terminal indicates the determined first device identifier "bedroom light", so that only the alternative argument values of the bedroom light 10.2 are considered. The initial function value "switch off" is then used to determine the first alternative argument value among the plurality of alternative argument values. For example, the alternative argument values of the bedroom light identifier comprise:
- three alternative function values: switch off, switch on higher intensity, switch on lower intensity; and
- three alternative parameter values: blue, green, white.
"Switch off is therefore identified as the first alternative argument value, and the associated first graphical element is retrieved at step 204. The first graphical element is then outputted on the user terminal 1 . Referring to figure 3c, the first graphical element is referenced 302.1 .
A second graphical element can also be retrieved. Indeed, the initial parameter value "blue" is detected in the user input, and matches the second alternative argument value "blue" in the database 13. The second graphical
element associated with the second alternative argument value "blue" is also retrieved at step 204, and outputted on the user terminal 1 .
The second graphical element "blue" is referenced 302.3 on figure 3c.
As illustrated on figure 3c, a graphical element 302.2 can also be retrieved from the database 13 for the first device identifier "bedroom light".
As previously explained, the first graphical element 302.1 (and also the second graphical element) can be a selectable element.
Upon selection by the user of the selectable element 302.1, and as illustrated on figure 3d, a window 303 is outputted on the user terminal 1, the window comprising the first graphical element and the graphical elements of the alternative function values other than the first alternative argument value.
A third graphical element 304.1 represents the function value "switch on higher intensity" and a fourth graphical element 304.2 represents the function value "switch on lower intensity". Selection by the user of an alternative argument value is therefore facilitated, especially for elders or children, and it avoids having to input a new argument value using a keyboard or a touch screen.
Figure 4 shows an electronic device 1 (or user terminal 1) according to some embodiments of the invention.
The user terminal 1 comprises a random access memory 403, which can store instructions for performing the steps of a method as described above with reference to figure 2, and a processor 402 for executing these instructions.
The user terminal 1 may also comprise a database 404 for storing data resulting from the method according to the invention. For example, the database 404 may store the data that are retrieved from the database 13 (associations between the alternative argument values and the graphical elements, compatibility mesh) and can store user identifiers. According to an embodiment, the database 13 can be stored in the user terminal, in database 404.
The user terminal 1 comprises a user interface 401 for receiving selections and user inputs by the user. The user interface 401 can for example comprise a touch display, a virtual or physical keyboard, press buttons, a camera and/or a microphone coupled to a speech-to-text application. The user terminal 1
also comprises a network interface 405 to communicate with the network 12 and in particular to transmit the user entries to the selected connected objects. The network interface can be a wired interface (Ethernet) or a wireless interface (Bluetooth, 2G, 3G, 4G, Wi-Fi, etc.).
Claims
1. A method for managing a remote electronic device (10.1-10.4) using a user interface (401) on an electronic device (1), the electronic device being operatively connected to a database (13) associating, for each given device identifier, alternative argument values and graphical elements, the method comprising:
receiving (200) a user input through the user interface on the electronic device; determining (201) a first device identifier based on the user input;
accessing (203) the database to retrieve (204) a first graphical element associated with a first alternative argument value of the first device identifier, said first alternative argument value matching the user input;
rendering (205) the user input by outputting the retrieved first graphical element.
2. The method according to claim 1, wherein, upon reception of the user input, the method further comprises detecting (202) an initial argument value in the user input, and wherein the first alternative argument value matches the detected initial argument value.
3. The method according to claim 1 or 2, wherein the method further comprises receiving (200) contextual data with the user input;
wherein the alternative argument values of the first device identifier are filtered using the contextual data, and wherein the first alternative argument value is chosen among the filtered alternative argument values.
4. The method according to one of the preceding claims, wherein the method further comprises receiving (200) contextual data with the user input, and wherein the first device identifier is determined using the contextual data.
5. The method according to claim 3 or 4, wherein the contextual data comprises at least one among:
- location of the user;
- weather forecast data;
- temperature data;
- clock data;
- presence data;
- a user profile; or
- user's rights to access and control remote electronic devices.
6. The method according to the preceding claims, wherein the method further comprises retrieving the other graphical elements that are associated with the alternative argument values of the first device identifier, other than the first alternative argument value; wherein the first graphical element is a selectable element, the selection by the user of said selectable element outputting at least some of the retrieved graphical elements, and wherein upon selection (206) of a second graphical element, the first graphical element is replaced by the second graphical element.
7. The method according to one of the preceding claims, further comprising sending (208) an instruction to a remote electronic device identified by the first device identifier, said instruction comprising the argument value associated with the outputted graphical element.
8. The method according to claim 7, wherein the instruction is sent to the remote electronic device upon validation (207) by the user.
9. The method according to one of the preceding claims, wherein the alternative argument values of a given device identifier comprise alternative function values, each alternative function value identifying a function to be performed by a device identified by the first device identifier, wherein the first alternative argument value is a function value.
10. The method according to claim 9, wherein the alternative argument values of a given device identifier further comprise alternative parameter values identifying parameters of functions performed by the device identified by the first device identifier, and wherein a second graphical element is retrieved from the database, said second graphical element being associated with a second alternative argument value of the first device identifier, said second alternative argument value matching the user input and being an alternative parameter value.
11. The method according to one of the preceding claims, further comprising displaying the graphical element on the remote electronic device.
12. A computer program product recorded on a storage medium and executable by a computer in the form of software including at least one software module set up to implement the method according to any of claims 1 to 11.
13. An electronic device for managing a remote electronic device using a user interface (401), the electronic device being operatively connected to a database associating, for each given device identifier, alternative argument values and graphical elements, wherein the user interface is arranged for receiving a user input, the electronic device comprising a processor (402) arranged for performing the following steps:
determining a first device identifier based on the user input; accessing the database to retrieve a first graphical element associated with a first alternative argument value of the first device identifier, said first alternative argument value matching the user input;
rendering the user input by outputting the retrieved first graphical element.
14. A system comprising the electronic device according to claim 13 and a database (13) associating, for each given device identifier, alternative argument values and graphical elements.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16751000.7A EP3317866B1 (en) | 2015-06-30 | 2016-06-27 | User input processing for controlling remote devices |
US15/740,212 US10466870B2 (en) | 2015-06-30 | 2016-06-27 | User input processing for controlling remote devices |
ES16751000T ES2832485T3 (en) | 2015-06-30 | 2016-06-27 | User input processing to control remote devices |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15306051.2A EP3113140A1 (en) | 2015-06-30 | 2015-06-30 | User input processing for controlling remote devices |
EP15306051.2 | 2015-06-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017001931A1 true WO2017001931A1 (en) | 2017-01-05 |
Family
ID=53496605
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2016/001070 WO2017001931A1 (en) | 2015-06-30 | 2016-06-27 | User input processing for controlling remote devices |
Country Status (4)
Country | Link |
---|---|
US (1) | US10466870B2 (en) |
EP (2) | EP3113140A1 (en) |
ES (1) | ES2832485T3 (en) |
WO (1) | WO2017001931A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110313775A1 (en) | 2010-05-20 | 2011-12-22 | Google Inc. | Television Remote Control Data Transfer |
US20120030712A1 (en) | 2010-08-02 | 2012-02-02 | At&T Intellectual Property I, L.P. | Network-integrated remote control with voice activation |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7155213B1 (en) | 2005-09-16 | 2006-12-26 | James R. Almeda | Remote control system |
US8106742B2 (en) | 2006-08-04 | 2012-01-31 | Tegic Communications, Inc. | Remotely controlling one or more client devices detected over a wireless network using a mobile device |
US8564543B2 (en) * | 2006-09-11 | 2013-10-22 | Apple Inc. | Media player with imaged based browsing |
US10241752B2 (en) * | 2011-09-30 | 2019-03-26 | Apple Inc. | Interface for a virtual digital assistant |
WO2014098477A1 (en) * | 2012-12-18 | 2014-06-26 | 삼성전자 주식회사 | Method and device for controlling home device remotely in home network system |
US9575720B2 (en) * | 2013-07-31 | 2017-02-21 | Google Inc. | Visual confirmation for a recognized voice-initiated action |
US9430186B2 (en) * | 2014-03-17 | 2016-08-30 | Google Inc | Visual indication of a recognized voice-initiated action |
US9531601B2 (en) * | 2014-04-16 | 2016-12-27 | Belkin International Inc. | Modular interface framework for network devices |
US10761704B2 (en) * | 2014-06-16 | 2020-09-01 | Braeburn Systems Llc | Graphical highlight for programming a control |
EP2958010A1 (en) * | 2014-06-20 | 2015-12-23 | Thomson Licensing | Apparatus and method for controlling the apparatus by a user |
CN104902070A (en) * | 2015-04-13 | 2015-09-09 | 青岛海信移动通信技术股份有限公司 | Mobile terminal voice control method and mobile terminal |
KR20160147459A (en) * | 2015-06-15 | 2016-12-23 | 삼성전자주식회사 | Application Operating Method and electronic device using the same |
- 2015
  - 2015-06-30 EP EP15306051.2A patent/EP3113140A1/en not_active Withdrawn
- 2016
  - 2016-06-27 WO PCT/IB2016/001070 patent/WO2017001931A1/en active Application Filing
  - 2016-06-27 US US15/740,212 patent/US10466870B2/en active Active
  - 2016-06-27 EP EP16751000.7A patent/EP3317866B1/en active Active
  - 2016-06-27 ES ES16751000T patent/ES2832485T3/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110313775A1 (en) | 2010-05-20 | 2011-12-22 | Google Inc. | Television Remote Control Data Transfer |
US20120030712A1 (en) | 2010-08-02 | 2012-02-02 | At&T Intellectual Property I, L.P. | Network-integrated remote control with voice activation |
Also Published As
Publication number | Publication date |
---|---|
US10466870B2 (en) | 2019-11-05 |
EP3317866B1 (en) | 2020-09-02 |
ES2832485T3 (en) | 2021-06-10 |
EP3317866A1 (en) | 2018-05-09 |
US20180267675A1 (en) | 2018-09-20 |
EP3113140A1 (en) | 2017-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108509119B (en) | Method for operating electronic device for function execution and electronic device supporting the same | |
US9922260B2 (en) | Scrapped information providing method and apparatus | |
US10387510B2 (en) | Content search method and electronic device implementing same | |
KR102246559B1 (en) | IoT management device capable of executing condition modification mode and its control method | |
CN104113774A (en) | Television device control method and system and television device | |
CN105554588B (en) | Closed caption-supporting content receiving apparatus and display apparatus | |
US20170242832A1 (en) | Character editing method and device for screen display device | |
JP6283749B2 (en) | Method and apparatus for prompting device connection | |
US20160261818A1 (en) | Cursor control method and cursor control device of smart tv | |
US20120119998A1 (en) | Server device, display operation terminal, and remote control system | |
US10908787B2 (en) | Method for sharing content information and electronic device thereof | |
US10645211B2 (en) | Text input method and electronic device supporting the same | |
KR102340251B1 (en) | Method for managing data and an electronic device thereof | |
US10802684B2 (en) | Remote control of an electronic device with a selectable element | |
US20160070425A1 (en) | Method and apparatus for index processing | |
US9756451B2 (en) | Terminal apparatus and information processing method | |
EP3317866B1 (en) | User input processing for controlling remote devices | |
EP3113139A1 (en) | Updating of former instructions to control a remote electronic device | |
JP5687552B2 (en) | Operation support system, display system, electronic device, script, operation support method and program | |
EP3113141A1 (en) | Enriched instructions for remotely controlling electronic devices | |
KR20160076394A (en) | Method for providing user interface, electronic apparatus and storage medium | |
KR102398453B1 (en) | Method for Outputting an Image and Electronic Device supporting the same | |
US20240073795A1 (en) | Information processing device, control method of information processing device, and program | |
KR101875485B1 (en) | Electronic apparatus and Method for providing service thereof | |
US10757242B2 (en) | Computer system, and method and program for setting |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16751000 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 15740212 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 2016751000 Country of ref document: EP |