US20170289239A1 - User interface delegation to a delegated device - Google Patents

User interface delegation to a delegated device

Info

Publication number
US20170289239A1
US20170289239A1 (also published as US 2017/0289239 A1); application US 15/626,231 (US 201715626231 A)
Authority
US
United States
Prior art keywords
delegated
user interface
delegation
delegating
related task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/626,231
Inventor
David Hirshberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
VIBRANT LICENSING LLC
Original Assignee
Empire Technology Development LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Empire Technology Development LLC filed Critical Empire Technology Development LLC
Priority to US15/626,231
Assigned to EMPIRE TECHNOLOGY DEVELOPMENT LLC reassignment EMPIRE TECHNOLOGY DEVELOPMENT LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HIRSHBERG, DAVID
Publication of US20170289239A1
Assigned to CRESTLINE DIRECT FINANCE, L.P. reassignment CRESTLINE DIRECT FINANCE, L.P. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EMPIRE TECHNOLOGY DEVELOPMENT LLC
Assigned to EMPIRE TECHNOLOGY DEVELOPMENT LLC reassignment EMPIRE TECHNOLOGY DEVELOPMENT LLC RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: CRESTLINE DIRECT FINANCE, L.P.
Assigned to VIBRANT LICENSING LLC reassignment VIBRANT LICENSING LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EMPIRE TECHNOLOGY DEVELOPMENT LLC

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/06 - Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 - Multiprogramming arrangements
    • G06F9/54 - Interprogram communication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 - Selection of displayed objects or displayed text elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 - Protecting data
    • G06F21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 - Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2209/00 - Indexing scheme relating to G06F9/00
    • G06F2209/54 - Indexing scheme relating to G06F9/54
    • G06F2209/549 - Remote execution
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/10 - Office automation; Time management
    • G06Q10/107 - Computer-aided management of electronic mailing [e-mailing]
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B7/00 - Radio transmission systems, i.e. using radiation field
    • H04B7/02 - Diversity systems; Multi-antenna system, i.e. transmission or reception using multiple antennas
    • H04B7/04 - Diversity systems; Multi-antenna system, i.e. transmission or reception using multiple antennas using two or more spaced independent antennas
    • H04B7/06 - Diversity systems; Multi-antenna system, i.e. transmission or reception using multiple antennas using two or more spaced independent antennas at the transmitting station
    • H04B7/0686 - Hybrid systems, i.e. switching and simultaneous transmission
    • H04B7/0689 - Hybrid systems, i.e. switching and simultaneous transmission using different transmission schemes, at least one of them being a diversity transmission scheme
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L47/00 - Traffic control in data switching networks
    • H04L47/70 - Admission control; Resource allocation
    • H04L47/76 - Admission control; Resource allocation using dynamic resource allocation, e.g. in-call renegotiation requested by the user or requested by the network in response to changing network conditions
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/10 - Architectures or entities
    • H04L65/1045 - Proxies, e.g. for session initiation protocol [SIP]

Definitions

  • the form factors of the devices make some devices more suitable than others for carrying out certain tasks. For example, a user may read a document on a mobile device but prefer to edit the document on a larger screen.
  • Existing methods for transitioning from a first device to a second device may be complex. For example, a user may be required to make a document available on the second device. In one example, if a user wishes to access an email on a second device, the user may be required to log into the email account associated with the email on the second device before accessing the email on the second device.
  • the present technology provides an illustrative delegating device.
  • the delegating device includes a user interface configured to receive an input and a transceiver configured to send and receive data.
  • the delegating device further includes a memory configured to store computer-executable instructions and a processor configured to execute the computer-executable instructions to perform various operations. Such operations include receiving a request to delegate a user interface related task to a delegated computing device, and causing the transceiver to send a delegation command to the delegated computing device to establish a delegation session on the delegated computing device.
  • the delegation command requests establishment of a delegated user interface to facilitate performance of the user interface related task on the delegated computing device, and the delegated user interface is substantially similar to the user interface of the delegating device.
  • the operations further include exchanging data associated with the user interface-related task with the delegated computing device.
  • the present technology also provides an illustrative delegated device.
  • the delegated device includes a user interface configured to receive a user input and a transceiver configured to send and receive data.
  • the delegated device also includes a memory configured to store computer-executable instructions and a processor configured to execute the computer-executable instructions. Execution of the computer-executable instructions causes the delegated device to perform operations including receiving a delegation command from a delegating computing device to establish a delegation session, wherein the delegation command requests establishment of a user interface related task on the delegated device, and wherein the user interface related task was initiated at the delegating computing device. Execution of the computer-executable instructions also causes the delegated device to exchange data associated with the user interface-related task with the delegating computing device.
  • the present technology provides an illustrative method for delegating user interface related tasks.
  • the method includes receiving a request to delegate a user interface related task to a delegated computing device.
  • the method also includes causing the transceiver of the delegating device to send a delegation command to the delegated computing device to establish a delegation session on the delegated computing device.
  • the delegation command requests establishment of a delegated user interface to facilitate performance of the user interface related task on the delegated computing device, and the delegated user interface is substantially similar to the user interface of the delegating device.
  • the method further includes exchanging data associated with the user interface-related task with the delegated computing device.
  • the present technology further provides another illustrative method for delegating user interface related tasks.
  • the method includes receiving, at a delegated computing device, a delegation command from a delegating computing device to establish a delegation session.
  • the delegation command requests establishment of a user interface related task on the delegated computing device, and the user interface related task was initiated at the delegating computing device.
  • the method further includes exchanging, by the delegated computing device, data associated with the user interface-related task with the delegating computing device, and automatically updating, by the delegated computing device, the delegating computing device with actions performed at the delegated computing device corresponding to the user interface related task.
  • FIG. 1 depicts a device having a delegating manager in accordance with an illustrative embodiment.
  • FIG. 2 depicts a system where user interface-related task delegation may be implemented in accordance with an illustrative embodiment.
  • FIG. 3 depicts a flow diagram of a process for delegating a user interface-related task to a delegated device in accordance with an illustrative embodiment.
  • FIG. 4 depicts a flow diagram of a process for establishing and closing a session between a delegated device and a delegating device in accordance with an illustrative embodiment.
  • FIG. 5 depicts a block diagram of a computer system, which may be used to delegate a user interface-related task in accordance with an illustrative embodiment.
  • user interface-related tasks include tasks that require user attendance or attention to be performed successfully. Examples of such user attendance or attention may include viewing a display, viewing a web page, or listening to streamed audio. Additional examples of user attendance or attention include manipulation of a user interface, which by way of example may include scrolling a screen, pausing or playing an audio or video stream, editing a file, selecting objects from a menu, selecting buttons on an interface, speaking to a device, or any other suitable user interface manipulation actions.
  • delegation of the user interface-related task involves delegation of a user interface from the delegating device to the delegated device.
  • a delegated device may use a pre-existing user interface to enable performance of the user interface-related task.
  • a user interface-related task may be delegated to the delegated device such that no further actions are required by the delegating device for performance of the user interface-related task.
  • An example of such a user interface-related task may include transferring the display/viewing of a video stream from the delegating device to the delegated device.
  • a delegated device may manage a user interface associated with the user interface-related task and after completion of the user interface-related task may send appropriate documents and/or instructions back to the delegating device.
  • the user interface-related task may involve editing a file, and the delegated device may control such editing. After completion of the editing, the delegated device may transfer the edited file back to the delegating device.
  • delegation of the user interface-related task may involve delegation of at least a portion of a user interface associated with the user interface-related task such that a portion of the user interface-related task is performed on the delegated device while another portion of the user interface-related task continues to be performed by the delegating device.
  • a smartphone (i.e., the delegating device) may delegate text entry to a personal computer (i.e., the delegated device) having a full keyboard while retaining the display functions and/or the actual file modification functions on the smartphone.
  • the text entry selections from the personal computer may be transferred to the smartphone, which processes the text entry selections and displays the modifications to the file to the user.
  • the delegating and delegated devices may communicate with each other using existing communication technologies including, but not limited to, Bluetooth, Infrared Data Association (“IRDA”), Near Field Communication (“NFC”), Local Area Networks (“LAN”), Wi-Fi, Wide Area Networks (“WAN”), etc. Additionally, the delegating and delegated devices may communicate using a web-based delegation connection server.
  • User interfaces and/or user interface-related tasks may be sent over established delegation sessions between the delegating device and the delegated device.
  • Delegation of a user interface-related task may include delegation of a specific task (e.g., rendering of a video, editing a document, editing or presenting data objects, editing email, opening a new file or webpage, etc.).
  • delegation of the user interface-related task may include communication to the delegated device of data objects, a customized toolbar or interface for a user interface-related task, and configuration settings for a user interface-related task.
  • Delegating a user interface-related task allows a user to initiate the user interface-related task on the delegating device and continue execution of the user interface-related task on the delegated device.
  • delegation of a user interface-related task may include delegation of a user interface associated with the user interface-related task. For example, a user may initially access an email on a smart phone, delegate the email user interface to a tablet, and continue reading or responding to the email on the tablet.
  • a streaming video may initially be accessed using a desktop computer. A video playback user interface and a task of rendering the video may be delegated to a tablet, such that the user may continue to view the video on the tablet.
  • the user may delegate only a user interface associated with controls for the video playback (i.e., a video playback control user interface) to the tablet. In this way, the user may control playback via the video playback control user interface on the tablet while continuing to watch the video on the desktop computer.
  • a user may receive a text message on a smart phone. The user may delegate a user interface-related task associated with responding to the text message to a laptop, such that the user may respond to the text message using the keyboard of the laptop. This delegation may include the delegation of a user interface usable for responding to the text message.
  • FIG. 1 depicts a device having a delegating manager in accordance with an illustrative embodiment.
  • a device 102 includes a software stack having an application layer 104 , a device drivers and services layer 106 , an operating system (“OS”) kernel layer 108 , and a hardware layer 110 .
  • the application layer 104 may include a number of applications that may be run on the device. For example, a user may access applications 112 and 114 on the device 102 .
  • the applications may include, but are not limited to, an email application, a video playback application, a music player application, etc.
  • the device drivers and services layer 106 allows access to hardware and service components by the application layer 104 .
  • one such service may be the delegation manager software (“DLGM”) 116 .
  • the delegation manager software 116 may receive user interface-related task delegation requests from the application layer 104 .
  • the DLGM 116 may be configured to initiate a point-to-point session between the delegating device and the delegated device.
  • the DLGM 116 includes a delegation server 118 and a delegation client 120 .
  • the DLGM 116 may be configured to create a delegation client 120 .
  • the DLGM 116 may be configured to create a delegation server 118 .
  • the delegation client 120 of the delegating device communicates with the delegation server 118 of the delegated device.
  • the device 102 may serve as a delegating device and a delegated device at different times or at the same time.
  • the DLGM 116 may create a multiple number of delegation clients 120 and a multiple number of delegation servers 118 .
  • a device 102 may delegate a user interface-related task associated with application 112 to a first delegated device and a user interface-related task associated with application 114 to a second delegated device.
  • the device 102 may create two delegation clients 120 , such that one is connected to the delegation server 118 of the first delegated device (e.g., the device that is delegated the user interface-related task associated with application 112 ) and the second delegation client is connected to the delegation server 118 of the second delegated device (e.g., the device that is delegated the user interface-related task associated with application 114 ). Additionally, the device 102 may also be delegated a user interface-related task from a second delegating device. Therefore, the device 102 may also create a delegation server 118 to receive requests from the second delegating device.
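  • As a rough sketch of this arrangement, the following Python fragment models a DLGM that holds one delegation client per outgoing delegation and one delegation server per incoming delegation; the class and method names (DelegationManager, create_client, create_server) and the addressing scheme are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: a DLGM that acts as delegating and delegated device at
# the same time by holding multiple delegation clients and servers.

class DelegationClient:
    """Sends requests for one delegated user interface-related task."""
    def __init__(self, server_address, task_id):
        self.server_address = server_address
        self.task_id = task_id

class DelegationServer:
    """Receives requests for one incoming delegation session."""
    def __init__(self, session_id):
        self.session_id = session_id

class DelegationManager:
    """Per-device DLGM: one client per outgoing delegation, one server per incoming one."""
    def __init__(self):
        self.clients = {}   # task_id -> DelegationClient
        self.servers = {}   # session_id -> DelegationServer

    def create_client(self, server_address, task_id):
        client = DelegationClient(server_address, task_id)
        self.clients[task_id] = client
        return client

    def create_server(self, session_id):
        server = DelegationServer(session_id)
        self.servers[session_id] = server
        return server

# Device 102 delegates tasks from applications 112 and 114 to two different
# devices while also serving a delegation received from a third device.
dlgm = DelegationManager()
dlgm.create_client("192.0.2.10:9000", task_id="app-112-edit")
dlgm.create_client("192.0.2.11:9000", task_id="app-114-video")
dlgm.create_server(session_id="incoming-from-second-delegating-device")
```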
  • the OS kernel layer 108 may serve as an abstraction layer allowing access to hardware components.
  • the OS kernel layer 108 may be configured to allow the application layer 104 and the device drivers and services layer 106 to run on device 102 with different components without the need to be re-coded.
  • an application may be programmed once to run on an operating system and deployed on multiple devices 102 that run the same OS kernel 108 but have different hardware components.
  • the hardware layer 110 may include the hardware components of the device 102 . These components may include network communication devices, output and input devices such as, but not limited to, monitors, keyboards, touch screens, etc.
  • the application layer 104 and the device drivers and services layer 106 may access hardware components by interacting with the OS kernel layer 108 .
  • the DLGM 116 may establish a connection with the DLGM of a second device over a network communications device that is part of the hardware layer 110 of the device 102 .
  • FIG. 2 depicts a system 200 where user interface-related task delegation may be implemented in accordance with an illustrative embodiment.
  • Delegating device 202 A and delegated device 202 B may include a delegation manager 206 A and a delegation manager 206 B, respectively.
  • an application 204 A may request that a user interface-related task be delegated to another device.
  • a delegated device 202 B may be selected in one of several ways, including, but not limited to, searching for the delegated device 202 B on a web-based delegation connection server, searching previously selected devices, and broadcasting a request over a personal area network.
  • the application 204 A may search a web-based delegation connection server for a device to delegate the user interface-related task.
  • the devices may authenticate each other, as described below.
  • the devices 202 A and 202 B may be required to pair each time a delegation request is initiated.
  • a number of security and authentication frameworks may be employed when pairing the devices.
  • the user interface-related task delegation request and the delegated device 202 B selection may be sent to the DLGM 206 A.
  • the DLGM 206 A may receive a request from an application 204 A to delegate the user interface-related task to a delegated device 202 B.
  • the DLGM 206 A may send a delegation command to the DLGM 206 B of the delegated device 202 B.
  • the delegated device 202 B may create a delegation server 208 B and send an acknowledgment to the delegating device 202 A.
  • the DLGM 206 A may create a delegation client 210 A and initiate a session with the delegation server 208 B of delegation manager 206 B of delegated device 202 B. Once the session between the devices has been created, the user interface-related task may be sent to the delegated device 202 B over the session. The session may be closed by either device.
  • Each device may be set up as a delegating device, a delegated device, or both.
  • the system may be symmetric in that both the delegating device 202 A and the delegated device 202 B may be capable of acting as both a delegating and a delegated device.
  • one or both of the delegating device 202 A and the delegated device 202 B may be both a delegating device and a delegated device simultaneously.
  • the delegated device 202 B may send a user interface-related task delegation request to delegating device 202 A. Because the devices are already paired, delegating device 202 A may start a delegation server 208 A and the delegated device 202 B may start a delegation client 210 B. Thus, devices 202 A and 202 B may send and receive user interface-related task delegation requests 212 , 214 at the same time.
  • FIG. 3 depicts a flow diagram of a process for delegating a user interface-related task to a delegated device in accordance with an illustrative embodiment. Additional, fewer, or different operations may be performed, depending on the embodiment.
  • a method 300 may be implemented on a computing device. In one embodiment, the method 300 is encoded on a computer readable medium that contains instructions that, when executed by a computing device, cause the computing device to perform operations of the method 300 .
  • the delegating device receives a user interface-related task delegation request from an application 204 A on a delegating device 202 A.
  • the delegating device sends a delegation command to the delegated device 202 B.
  • the delegated device 202 B receives the delegation command, extracts user interface-related task information from the delegation command, and establishes the delegation session based on the delegation command and the extracted user interface-related task information.
  • the delegated device sends a confirmation to the delegating device.
  • the delegating device and the delegated device may exchange data related to the user interface-related task using the connection established between the respective DLGMs on each device.
  • the delegation manager 206 A of the delegating device 202 A receives a request to delegate a user interface-related task to a delegated device 202 B.
  • a user may select a user interface component to indicate that an application 204 A on the delegating device 202 A may send a request to the DLGM 206 A.
  • the application may present a user interface element to a user via a graphical display.
  • the user interface element may include, but is not limited to, a button, a link, a key combination, or a gesture (e.g., swiping diagonally on a touch screen or right-clicking on an icon on a desktop), etc.
  • the application may cause a list of devices to be displayed that may act as delegated devices.
  • the user may select a device by, for example, clicking on an item in the list, where the item represents a device that may act as a delegated device.
  • the application 204 A may send a request to the DLGM 206 A to create a user interface-related task delegation command with an indication of the device selected to act as the delegated device 202 B.
  • the indication of the device may include, but is not limited to, an IP address, a MAC address, a unique identifier, etc.
  • the delegating device may delegate multiple user interface-related tasks at the same time.
  • multiple user interface-related tasks may be delegated to a single device, such as a personal computer.
  • a user interface-related task may be delegated to multiple delegated devices at the same time.
  • display/viewing of streamed audio and/or video may be delegated to multiple display devices at the same time.
  • the application may allow the user to select a specific delegated device 202 B using a mouse to click on an object including, but not limited to, a text document, an image, an audio file, etc.
  • a contextual menu may be displayed in response to the click.
  • the contextual menu may list devices that may act as delegated devices for the selected object. The user may select a delegated device by clicking on an option in the contextual menu. In response to the user selection, the selection and the object may be sent to the DLGM 206 A.
  • the application may allow the user to select a specific delegated device 202 B by swiping in a particular direction. For example, if a user is viewing a video, the user may swipe to the right over the video. The application may determine based on, for example, the direction of the swipe, that the user wishes to delegate the video to a television that has been previously paired with the delegating device 202 A. If, however, the user swipes to the left over the video, the application may send a request to the DLGM 206 A to delegate the video to a desktop computer that has been previously paired with the delegating device. The application may indicate the available devices by, for example, displaying an icon for the available delegated devices. The icons may be laid out so that a user may swipe towards the icon to indicate a selection of the delegated device represented by the icon.
  • the selection of delegated devices may be limited based on the type of delegation or the permissions granted to an application or user. Accordingly, only a subset of all otherwise available delegated devices may be presented to the user for selection based on the type of delegation or the permissions granted to the application or user. For example, if a user is viewing a video on a mobile device, the application may have permission to delegate the video to a television screen. However, the application may be prevented from sending the same video to a desktop computer, based on, for example, security permissions associated with the video content.
  • the DLGM 206 A may receive the request from the application and determine if the delegating device 202 A and the selected delegated device 202 B are already paired. For example, the DLGM 206 A may maintain a list of devices with which the delegating device 202 A has been authenticated and paired (i.e., devices with which the delegating device 202 A has a previously established relationship). If the DLGM 206 A determines that the devices are not paired by, for example, not finding the delegated device in a relationship table, the delegating device 202 A and the delegated device 202 B may be paired and authenticated in the manner described below. Once the delegating device 202 A and the delegated device 202 B have established a relationship by authenticating and pairing with each other, the DLGM 206 A may send a delegation command to the DLGM 206 B on delegated device 202 B.
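  • A minimal sketch of the pairing check described above, assuming the DLGM keeps an in-memory relationship table keyed by a device identifier; the table layout and the function names are assumptions made for illustration:

```python
# Illustrative relationship-table lookup a DLGM might perform before sending
# a delegation command; the structure shown here is an assumption.

paired_devices = {
    # device identifier (e.g., MAC address) -> pairing record
    "AA:BB:CC:DD:EE:FF": {"name": "living-room-tv", "authenticated": True},
}

def ensure_paired(device_id, pair_and_authenticate):
    """Return True once the selected delegated device has an established relationship.

    `pair_and_authenticate` stands in for the pairing flow described below
    (e.g., a passphrase exchange) and returns a pairing record, or None on failure.
    """
    record = paired_devices.get(device_id)
    if record and record.get("authenticated"):
        return True                       # already paired; send the delegation command
    record = pair_and_authenticate(device_id)
    if record is None:
        return False                      # pairing failed; do not delegate
    paired_devices[device_id] = record    # remember the new relationship
    return True
```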
  • the DLGM 206 A sends a delegation command to the DLGM 206 B.
  • the DLGM 206 A may create a delegation command based on the delegation request received in operation 302 .
  • the delegation command may include information related to the user interface-related task to be delegated, such as, but not limited to, an application identifier, the type of user interface-related task requested to be delegated, data associated with the delegation request, etc.
  • the delegation command may be sent using network communication components such as wired or wireless transceivers.
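  • One way such a delegation command might be represented and serialized before being handed to a transceiver is sketched below; the field names and the JSON encoding are assumptions for illustration, not the patent's wire format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class DelegationCommand:
    # Field names mirror the kinds of information listed above; they are illustrative only.
    application_id: str   # identifies the delegating application
    task_type: str        # e.g. "edit", "show", "get_keyboard"
    payload: dict         # data associated with the delegation request

def encode_command(cmd: DelegationCommand) -> bytes:
    """Serialize the command so it can be handed to a wired or wireless transceiver."""
    return json.dumps(asdict(cmd)).encode("utf-8")

command = DelegationCommand(
    application_id="com.example.editor",
    task_type="edit",
    payload={"document": "notes.txt"},
)
wire_bytes = encode_command(command)
```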
  • the delegated device 202 B receives the delegation command.
  • the DLGM 206 B may determine that the delegation command has been received from a delegating device 202 A with which the delegated device 202 B has a relationship. If a relationship exists, the DLGM 206 B may establish a delegation session. If, however, no relationship exists, the DLGM 206 B can, for example, reject the delegation command or initiate a pairing and authentication request, as described below.
  • the DLGM 206 B extracts user interface-related task information from the delegation command.
  • the user interface-related task information may include information such as, but not limited to, the type of delegation task (e.g., open an independent application, display content on the delegated device, edit content, or receive user input, etc.), configuration settings for the user interface-related task or for a user interface associated with the user interface-related task (e.g., a customized toolbar or other options/settings), etc.
  • the user interface-related task information may include an XML file with layout information for a user interface associated with the user interface-related task.
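  • As a hedged example of such a layout description, the following snippet parses a small, made-up XML definition using Python's standard library; the element and attribute names are invented for illustration and are not specified by the patent:

```python
import xml.etree.ElementTree as ET

# Hypothetical layout definition a delegating device could include in the
# user interface-related task information.
layout_xml = """
<delegated_ui task="edit">
  <toolbar>
    <button action="save" label="Save"/>
    <button action="close" label="Close"/>
  </toolbar>
  <editor field="document_body"/>
</delegated_ui>
"""

root = ET.fromstring(layout_xml)
task_type = root.get("task")
buttons = [(b.get("action"), b.get("label")) for b in root.iter("button")]
print(task_type, buttons)   # edit [('save', 'Save'), ('close', 'Close')]
```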
  • the delegated device 202 B may determine whether it is able to fulfill the delegation command.
  • For example, if the delegation command requests text entry via a keyboard, the request may be rejected by the delegated device 202 B if no keyboard is connected.
  • If the delegated device 202 B determines that it is able to perform the user interface-related task, then the DLGM 206 B may initiate a delegation session.
  • the delegating device 202 A may transfer data to the delegated device 202 B associated with the user interface-related task.
  • the transferred data may include, but is not limited to, data to re-create a user interface associated with the user interface-related task on the delegated device 202 B and content for display on the delegated device 202 B.
  • the DLGM 206 B establishes a delegation session based on the delegation command and the extracted user interface-related task information.
  • the DLGM 206 B may create an instance of the delegation server 208 B. Once the delegation server 208 B has been created, the DLGM 206 B may send an acknowledgement to the DLGM 206 A on the delegating device 202 A. The delegation server 208 B may receive requests from the delegating device 202 A that sent the delegation command to the DLGM 206 B. Thus, if more than one application delegates a user interface-related task to the delegated device 202 B, then multiple respective delegation servers 208 B may be created to handle respective requests 214 related to each delegation session.
  • the DLGM 206 B may contain a single instance of a delegation server 208 B that may handle requests 214 from multiple delegation sessions.
  • the delegation server 208 B may receive a request from a delegation client 210 A. The delegation server 208 B may determine if the DLGM 206 B accepted a delegation command from the DLGM 206 A that created the delegation client 210 A. If the delegation command was accepted, then the delegation server 208 B may perform tasks associated with requests sent by the delegation client 210 A. If, however, the request was received from a delegation client associated with a refused delegation command or a delegation command that has been closed or canceled, then the request may be ignored or the delegation client may be informed that the request failed.
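  • The request-handling logic described above might look roughly like the following, assuming the delegation server tracks which delegation commands its DLGM has accepted; the session bookkeeping shown here is an assumption:

```python
class DelegationServerDispatch:
    """Illustrative request handling on the delegated device."""

    def __init__(self):
        self.accepted_sessions = set()   # sessions whose delegation commands were accepted
        self.closed_sessions = set()     # sessions that were closed or canceled

    def accept(self, session_id):
        self.accepted_sessions.add(session_id)

    def close(self, session_id):
        self.accepted_sessions.discard(session_id)
        self.closed_sessions.add(session_id)

    def handle_request(self, session_id, request, perform_task):
        # Serve only requests tied to a delegation command that was accepted
        # and has not since been closed or canceled.
        if session_id in self.accepted_sessions:
            return perform_task(request)
        if session_id in self.closed_sessions:
            return {"status": "failed", "reason": "session closed"}
        return None   # ignore requests from unknown or refused sessions
```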
  • the DLGM 206 B sends a confirmation to the DLGM 206 A indicating that the delegation session has been established.
  • the DLGM 206 B may determine that the delegated device 202 B may handle the delegated user interface-related task. Additionally, the DLGM 206 B may indicate that the delegated device 202 B is ready to receive requests 214 associated with the delegated user interface-related task.
  • the confirmation may also include configuration information such as, but not limited to, the address of the delegation server 208 B, commands available on the delegation server, the version of the delegation server 208 B, etc.
  • the DLGM 206 A receives the confirmation from the DLGM 206 B.
  • the DLGM 206 A may create an instance of a delegation client 210 A based on the confirmation received from the DLGM 206 B.
  • the delegation client 210 A may be created and configured to send all requests 214 to a particular address or to more than one address.
  • the delegation client 210 A may be created and configured to send requests 214 in a particular format, based on, for example, the version of the delegation server 208 B or the available commands on the server 208 B.
  • the created delegation client 210 A may send and receive data associated with the delegated user interface to and from the delegation server 208 B on the delegated device 202 B.
  • Each delegated user interface-related task may have a delegation client 210 A associated with it.
  • each delegation client 210 A may be responsible for only one delegated user interface-related task.
  • a single delegation client 210 A may be created by the DLGM 206 A. Data associated with each delegated user interface-related task may be handled by a single delegation client 210 A. Thus, the delegation client 210 A may determine which delegation server 208 B to send a request 214 to based on, for example, the application with which the request 214 is associated. Once the delegation client 210 A at the delegating device 202 A and the delegation server 208 B on the delegated device 202 B are created, all subsequent transfer of data associated with the user interface-related task may be between the delegation client 210 A and the delegation server 208 B.
  • data associated with the delegated user interface-related task is exchanged between the delegation client 210 A and the delegation server 208 B.
  • the data associated with the delegated user interface-related task may include, but is not limited to, commands to open an application, data to be viewed, user interface elements, user inputs, etc.
  • data associated with content may be sent to the delegation server 208 B and received by the delegation client 210 A.
  • content may be sent to a delegated device, the delegated device may modify the content, and the modified content may be sent back to the delegating device 202 A for storage or further processing.
  • Data received by the delegation server 208 B may be sent to an application on the delegated device 202 B and data received by the delegation client 210 A may be sent to an application on the delegating device 202 A.
  • the type of data exchanged may depend on the type of user interface-related task that has been delegated to the delegated device 202 B. Example types of delegated user interface-related tasks are described in additional detail below.
  • Operation of the user interface-related task during the delegation session depends on the type of user interface-related task that is delegated. For example, in one embodiment, for a user interface-related task that involves editing a document or object, the original document or object is transferred to the delegated device at the beginning of the delegation session, edited at the delegated device, and the edited document or object is returned to the delegating device at the end of the delegation session. In another embodiment, where a user interface is also delegated, key codes associated with user interface inputs received at the user interface on the delegated device are transferred in real-time back to the delegating device which may effectuate the inputs. For example, if a user interface on the delegated device receives a command to add a character to a word document, this command is transferred to the delegating device which performs the addition.
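  • The two modes of operation described above can be sketched as follows; both functions are simplified stand-ins for the client/server exchange, and their names (edit_task_round_trip, relay_key_codes) are not from the patent:

```python
def edit_task_round_trip(document, send, receive_edited):
    """Whole-object mode: transfer the document at the start of the session and
    receive the edited copy back at the end; `send` and `receive_edited` stand in
    for the delegation client/server exchange."""
    send(document)
    return receive_edited()

def relay_key_codes(key_code_stream, apply_input):
    """Delegated-UI mode: each key code received at the delegated device is
    forwarded in real time and `apply_input` effectuates it on the delegating
    device (e.g., inserts a character into the document)."""
    for key_code in key_code_stream:
        apply_input(key_code)
```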
  • a user interface associated with the user interface-related task on the application 204 A may include a user interface element which, when triggered, may send a command to the DLGM 206 A, which may close the session.
  • the user interface may include an element such as, but not limited to, a button, a link, a key combination, a gesture, etc. The user can, for example, trigger the user interface element by clicking on a button or carrying out the pre-defined gesture.
  • the DLGM 206 A upon receiving the close command, may send a request to the DLGM 206 B to close the session.
  • the DLGM 206 B may close the delegation server 208 B and send an acknowledgement to the DLGM 206 A.
  • DLGM 206 A may close the delegation client 210 A
  • the DLGM 206 B may initiate the close session request by sending a request to the DLGM 206 A.
  • the DLGM 206 A may close the delegation client 210 A and respond with an acknowledgement.
  • updated documents and/or objects may also be communicated between the DLGM 206 A and the DLGM 206 B.
  • the DLGM 206 B may receive the acknowledgement and close the delegation server 208 B.
  • FIG. 4 depicts a flow diagram of a process for establishing and closing a session between the delegating device 202 A and the delegated device 202 B in accordance with an illustrative embodiment. Additional, fewer, or different operations may be performed, depending on the embodiment.
  • a method 400 may be implemented on a computing device.
  • the method 400 is encoded on a computer readable medium that contains instructions that, when executed by a computing device, cause the computing device to perform operations of the method 400 .
  • the application 204 A operating on the delegating device 202 A sends a delegation request 402 to the DLGM 206 A requesting delegation of a user interface-related task.
  • the DLGM 206 A transfers the request 404 to the DLGM 206 B of the delegated device 202 B.
  • the DLGM 206 B of the delegated device 202 B sends an acknowledgement 406 to the DLGM 206 A of the delegating device 202 A.
  • the DLGM 206 B creates 408 a delegation server 208 B.
  • the delegation server 208 B opens an application 410 to perform the delegated user interface-related task.
  • the delegating DLGM 206 A creates 412 a delegation client 210 A.
  • the delegation client 210 A sends an acknowledgement 414 to the delegation server 208 B acknowledging establishment of a delegation session for the user interface-related task.
  • delegated session data is transferred 416 , 418 between the application 204 A and the application 204 B via communication between the delegation client 210 A and the delegation server 208 B.
  • data output from a delegated application 204 B is sent to the delegation server 208 B, the delegation server 208 B sends the output to the delegation client 210 A, and the delegation client 210 A sends the output to the delegating application 204 A.
  • the delegation session thereby allows the delegated user interface related-task to be performed on the delegated device 202 B.
  • Upon receiving an instruction to close the delegation session, the delegating application 204 A sends a close command 420 to the DLGM 206 A.
  • the DLGM 206 A sends a close command 422 to the DLGM 206 B.
  • the DLGM 206 B sends a close command 424 to the delegation server 208 B.
  • the delegation server 208 B sends a close command 428 to the delegated application 204 B.
  • the delegation server 208 B sends an acknowledgement 430 to the DLGM 206 B and closes.
  • the DLGM 206 B sends an acknowledgement 432 to the DLGM 206 A.
  • the DLGM 206 A sends a close command 434 to the delegation client 210 A.
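  • To make the sequence easier to follow, the snippet below replays the FIG. 4 message flow as simple (sender, receiver, message) tuples; this is an editorial restatement of the steps already listed, not an interface defined by the patent:

```python
# Editorial replay of the FIG. 4 message flow as (sender, receiver, message) tuples.
SESSION_FLOW = [
    ("application 204A",        "DLGM 206A",              "delegation request 402"),
    ("DLGM 206A",               "DLGM 206B",              "delegation request 404"),
    ("DLGM 206B",               "DLGM 206A",              "acknowledgement 406"),
    ("DLGM 206B",               "delegation server 208B", "create 408"),
    ("delegation server 208B",  "application 204B",       "open application 410"),
    ("DLGM 206A",               "delegation client 210A", "create 412"),
    ("delegation client 210A",  "delegation server 208B", "acknowledgement 414"),
    ("application 204A",        "application 204B",       "session data 416/418"),
    ("application 204A",        "DLGM 206A",              "close command 420"),
    ("DLGM 206A",               "DLGM 206B",              "close command 422"),
    ("DLGM 206B",               "delegation server 208B", "close command 424"),
    ("delegation server 208B",  "application 204B",       "close command 428"),
    ("delegation server 208B",  "DLGM 206B",              "acknowledgement 430"),
    ("DLGM 206B",               "DLGM 206A",              "acknowledgement 432"),
    ("DLGM 206A",               "delegation client 210A", "close command 434"),
]

for sender, receiver, message in SESSION_FLOW:
    print(f"{sender} -> {receiver}: {message}")
```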
  • the user interface-related task delegation method establishes a connection between a delegation client 210 A on the delegating device 202 A and a delegation server 208 B on the delegated device 202 B.
  • the established connection is used to exchange data associated with the delegated task. Based on the delegated task, different data may be sent. Furthermore, the life of the connection may be based on the type of the delegated user interface-related task.
  • the type of delegated user interface-related task may include, but is not limited to, an asynchronous open task, a show task, an edit task, a get keyboard task, a get pointer task, a get user interface task, etc. Each of these tasks is detailed below.
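  • As an editorial summary drawn from the descriptions that follow, the task types and the connection lifetime each implies might be enumerated as shown below; the enum name and lifetime notes are not part of the patent:

```python
from enum import Enum

class DelegatedTaskType(Enum):
    # Connection-lifetime notes summarize the task descriptions that follow.
    ASYNC_OPEN   = "asynchronous open"    # connection closes after the initial request
    SHOW         = "show"                 # stays open until the view is closed
    EDIT         = "edit"                 # stays open until the edited document is closed
    GET_KEYBOARD = "get keyboard"         # open until no more input is needed / field not editable
    GET_POINTER  = "get pointer"          # open until no more input is needed
    GET_UI       = "get user interface"   # open until the delegated UI or application is closed
```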
  • the asynchronous task may be used to open an independent asynchronous application on the delegated device 202 B to handle an object received from the delegating device 202 A.
  • the object may be, but is not limited to, a uniform resource locator (URL), a document, an image, etc.
  • the delegation client 210 A may send a request 214 to the delegation server 208 B, the request including the object to be opened on the delegated device 202 B.
  • the delegation server 208 B may receive the request 214 and open an independent asynchronous application to handle the object on the delegated device 202 B.
  • the delegation client 210 A may send a URL to the delegation server 208 B.
  • the delegation server 208 B may open an application 204 B on the delegated device 202 B and send the object to the opened application 204 B.
  • the application 204 B may be a web browser.
  • the browser may retrieve the contents of the URL and display the contents on the delegated device 202 B.
  • the asynchronous task may close the established connection after receiving the initial request.
  • the delegation server 208 B may send information to the delegation client 210 A related to the asynchronous task such as, but not limited to, the application process identifier, a browser window identifier, etc., before closing the established session.
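  • A toy sketch of the delegated-device side of an asynchronous open task, assuming the received object is a URL and using Python's standard webbrowser module as a stand-in for opening an independent application; the response fields are assumptions:

```python
import webbrowser

def handle_async_open(request):
    """Delegated-device handler: open the received object (assumed here to be a
    URL) in an independent application, report back, then the session closes."""
    url = request["object"]
    opened = webbrowser.open(url)   # hands the URL to the default browser
    return {
        "status": "opened" if opened else "failed",
        # Identifiers such as a process or window id could be returned before
        # the established session is closed; these fields are assumptions.
        "handler": "default-browser",
    }

response = handle_async_open({"object": "https://example.com/article"})
```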
  • the show task may be used to display content from the delegating device 202 A on the delegated device 202 B.
  • a user of the application 204 A may select an object, such as, but not limited to, a picture, a document, an audio file, etc., on the delegating device 202 A.
  • a picture may be selected and sent to the delegation client 210 A, which sends the picture to the delegation server 208 B.
  • the delegation server 208 B may open a view of the object on the delegated device 202 B.
  • the delegation client 210 A may send a new picture and the delegation server 208 B may display the new object in the view that was opened to display the first picture.
  • the show task keeps the established connection alive until the view is closed.
  • the edit task may be used to open received content for editing on the delegated device 202 B.
  • a user of the application 204 A may select an object, such as, but not limited to, a picture, a document, an audio file, etc., on the delegating device 202 A.
  • a document may be selected and sent to the delegation client 210 A, which sends the document to the delegation server 208 B.
  • the delegation server 208 B may open an application that may be used to edit the document on the delegated device 202 B.
  • the edited document may be sent back to the delegating device 202 A for storage.
  • the updated document may be automatically sent (without the need for a specific command from the user to perform such a transmission) back to the delegating device 202 A over the established connection and the delegating device 202 A may update the local copy of the document.
  • the delegation server 208 B may send the document to the delegation client 210 A.
  • the edit task keeps the established connection alive until the document that is being edited is closed.
  • the get keyboard task may be used to enter content using input devices at the delegated device 202 B.
  • a user may select an object to be edited on the delegating device 202 A.
  • the object may include, but is not limited to, a text-field, a word document, a picture, etc.
  • a user may select a text field to enter content.
  • the user may select a user interface element, indicating that the input devices at the delegated device 202 B should be used.
  • An identifier for the selected object may be sent to the delegation client 210 A, which sends the identifier to the delegation server 208 B.
  • the delegation server 208 B may begin listening for inputs of the type selected by the application 204 A.
  • the application may select an input type of a keyboard.
  • the delegation server 208 B may send each input at the delegated device 202 B to the delegation client 210 A.
  • the delegation client 210 A may send the received input to the application 204 A to update the selected object.
  • the established connection may remain open until the user indicates that no more input is needed. In another example, the connection may be terminated when the input field is no longer editable.
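  • The delegated-device side of a get keyboard task could be sketched as a simple forwarding loop like the one below; the event shape and the helper callables are assumptions for illustration:

```python
def forward_keyboard_input(input_events, send_to_client, field_editable):
    """Delegated-device side of a get keyboard task: forward each captured input
    to the delegation client until the target field is no longer editable."""
    for event in input_events:          # e.g., {"object_id": "text-field-1", "key": "a"}
        if not field_editable():
            break                       # stop relaying once input is no longer needed
        send_to_client(event)
```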
  • the get pointer task may be used to receive a pointer input from the delegated device 202 B.
  • the delegated device 202 B may include a pointer device such as, but not limited to, a touch pad, a mouse, directional keys, etc.
  • a user may select a user interface element, indicating that the pointer devices at the delegated device 202 B should be used.
  • the delegation server 208 B may begin listening for inputs of the type selected by the application 204 A. For example, the application may select an input type of a touchpad.
  • the delegation server 208 B may send each touchpad input at the delegated device 202 B to the delegation client 210 A.
  • the delegation client 210 A may send the received input to the application 204 A to update the location of the pointer in the application 204 A.
  • the established connection may remain open until the user indicates that no more input is needed.
  • the get user interface task may be used to display portions of a user interface associated with the application 204 A on the delegated device 202 B.
  • the application 204 A may send a definition file to the delegation client 210 A.
  • the delegation client 210 A may send a request with the definition file to the delegation server 208 B.
  • the delegation server 208 B may open the application 204 B to display a user interface based on the received definition file.
  • a user may interact with the user interface displayed on the delegated device 202 B using the delegated device 202 B.
  • the delegation server 208 B may send the input to the delegation client 210 A.
  • the delegation client 210 A may send the received input to the application 204 A.
  • the established connection between the delegation client 210 A and the delegation server 208 B may remain open until the application 204 B is closed or until the delegated user interface is closed.
  • the delegation of user interface-related tasks occurs between paired devices.
  • In order for devices to be paired, the delegated device 202 B should be authenticated and trusted by the user of the delegating device 202 A.
  • the delegated device 202 B may be authenticated at various times. For example, the delegated device may be authenticated by adding the delegated device 202 B to the list of paired devices.
  • the delegated device 202 B may also be authenticated the first time a delegation request is made or every time a delegation request is made.
  • the delegating device 202 A may send a request for a passphrase to the delegated device 202 B.
  • a passphrase may be entered on the delegated device 202 B and sent to the delegating device 202 A.
  • the delegating device 202 A may validate the passphrase and, if the passphrase is valid, the two devices may establish a connection to send and receive user interface tasks.
  • Additional authentication mechanisms may also be used as known to those of skill in the art. For example, a public key/private key pair may be used to authenticate the devices or a central server may be used to verify the identity of the other device.
  • a Bluetooth pairing mechanism may also be used. It will be understood by one skilled in the art that multiple authentication and pairing mechanisms may be employed to establish trust between the delegating device 202 A and the delegated device 202 B.
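  • A highly simplified sketch of the passphrase exchange described above; a real deployment would rely on an established framework (Bluetooth pairing, public/private keys, or a central server), and the comparison shown here is only illustrative:

```python
import hmac

def validate_passphrase(expected, received):
    """Delegating device 202A validates the passphrase returned by the delegated
    device 202B before allowing user interface tasks to flow between the devices."""
    return hmac.compare_digest(expected, received)

# Delegated device 202B: the passphrase entered by the user and sent back.
received = "correct horse battery staple"

# Delegating device 202A: validate and, if valid, establish the connection.
connection_established = validate_passphrase("correct horse battery staple", received)
```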
  • FIG. 5 is a block diagram of a computer system, which may be used to delegate a user interface-related task, in accordance with an illustrative implementation.
  • the computer system or computing device 500 may be used to implement the device 102 , the delegating device 202 A, and the delegated device 202 B.
  • the computing system 500 includes a bus 505 or other communication component for communicating information and a processor 510 or processing circuit coupled to the bus 505 for processing information.
  • the computing system 500 may also include one or more processors 510 or processing circuits coupled to the bus for processing information.
  • the computing system 500 also includes main memory 515 , such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 505 for storing information and instructions to be executed by the processor 510 .
  • Main memory 515 may also be used for storing position information, temporary variables, or other intermediate information during execution of instructions by the processor 510 .
  • the computing system 500 may further include a read-only memory (ROM) 520 or other static storage device coupled to the bus 505 and configured to store static information and instructions for the processor 510 .
  • a storage device 525 such as a solid state device, magnetic disk or optical disk, is coupled to the bus 505 for persistently storing information and instructions.
  • the computing system 500 may be coupled via the bus 505 to a display 535 , such as a liquid crystal display, or active matrix display, for displaying information to a user.
  • An input device 530 , such as a keyboard including alphanumeric and other keys, may be coupled to the bus 505 and configured to communicate information and command selections to the processor 510 .
  • the input device 530 has a touch screen display 535 .
  • the input device 530 may include a cursor control, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 510 and for controlling cursor movement on the display 535 .
  • the processes described herein may be implemented by the computing system 500 in response to the processor 510 executing an arrangement of instructions contained in main memory 515 .
  • Such instructions may be read into main memory 515 from another computer-readable medium, such as the storage device 525 .
  • Execution of the arrangement of instructions contained in main memory 515 causes the computing system 500 to perform the illustrative processes described herein.
  • One or more processors in a multi-processing arrangement may also be employed to execute the instructions contained in main memory 515 .
  • hard-wired circuitry may be used in place of or in combination with software instructions to effect illustrative implementations. Thus, implementations are not limited to any specific combination of hardware circuitry and software.
  • implementations described in this specification may be implemented in other types of digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • An example scenario of delegating a user interface-related task is as follows: A user receives an email on her smartphone. She opens a mail application on the smartphone and reads the email. She sees that there is a document attached to the email. Because the user is near her desktop computer, she delegates editing the document to her desktop computer. The document is opened using a compatible application on the desktop computer, where the user may read, edit, and save the document. The user may then determine that she wants to respond to the email. She may click on a button on her smartphone, thereby delegating the reply screen to her desktop. In this way, she may compose an email, attach the edited document, and send the email via a user interface on the desktop.
  • a second example scenario of delegating a user interface-related task is as follows: A user receives a text message on her smartphone.
  • the text message contains a link to a web page.
  • the user taps the link to the web page.
  • the web page opens on the smartphone browser. Because the web page is not easy to read, the user clicks on a button to send the link to a delegated device, such as her tablet.
  • the user accesses her tablet and sees that the web page indicated by the URL is displayed in a web browser on the tablet.
  • the user decides that she wants to respond to the text message. She returns to her smartphone and swipes on the text field to delegate the tablet to act as a keyboard for entry of content into the text field.
  • the tablet opens a comfortable touch keyboard that covers the entire screen, and the user may type the reply.
  • the message may be viewed on the smartphone screen while the user is typing.
  • she may press send on the tablet, the smartphone sends the message, and the keyboard on the tablet is closed.
  • a third example scenario of delegating a user interface-related task is as follows: A user is working on her desktop when she comes across a video clip on a website. She wants to show the video clip to her family members, so she right-clicks on an icon for the video clip and is prompted by the desktop to select one or a plurality of possible delegation devices. The user selects a previously-paired television, which prompts the delegation of playback of the video clip to her television in the living room, and calls her kids to see the video on the television.
  • the video clip is quite long, so one of her children decides to delegate the open video clip to his music player, which may also play videos, and watch the remaining portion. of the video on the music player. In the meantime, the television may continue to play the video as well.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present technology provides an illustrative method for delegating user interface-related tasks. In some examples, the method includes receiving a request to delegate a user interface-related task to a delegated computing device, and causing the transceiver of the delegating device to send a delegation command to the delegated computing device to establish a delegation session on the delegated computing device. The delegation command requests establishment of a delegated user interface to facilitate performance of the user interface-related task on the delegated computing device, and the delegated user interface is substantially similar to the user interface of the delegating device. The method also includes exchanging data associated with the user interface-related task with the delegated computing device.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation application under 35 U.S.C. §120 of U.S. application Ser. No. 14/342,377 filed on Mar. 1, 2014, which is a U.S. National Stage filing under 35 U.S.C. §371 of International Application No. PCT/US2013/047260, filed on Jun. 24, 2013, entitled “USER INTERFACE DELEGATION TO DELEGATED DEVICE.” International Application No. PCT/US2013/047260 and U.S. application Ser. No. 14/342,377, including any appendices or attachments thereof, are hereby incorporated by reference in their entirety.
  • BACKGROUND
  • The following description is provided to assist the understanding of the reader. None of the information provided or references cited is admitted to be prior art.
  • Today, users consume and create content using multiple devices. For example, users may read mails, watch videos, and edit documents on mobile phones, tablets, laptops, and desktop computers. The form factors of the devices, however, make some devices more suitable than others for carrying out certain tasks. For example, a user may read a document on a mobile device but prefer to edit the document on a larger screen.
  • Existing methods for transitioning from a first device to a second device may be complex. For example, a user may be required to make a document available on the second device. In one example, if a user wishes to access an email on a second device, the user may be required to log into the email account associated with the email on the second device before accessing the email on the second device.
  • SUMMARY
  • According to some examples, the present technology provides an illustrative delegating device. The delegating device includes a user interface configured to receive an input and a transceiver configured to send and receive data. The delegating device further includes a memory configured to store computer-executable instructions and a processor configured to execute the computer-executable instructions to perform various operations. Such operations include receiving a request to delegate a user interface related task to a delegated computing device, and causing the transceiver to send a delegation command to the delegated computing device to establish a delegation session on the delegated computing device. The delegation command requests establishment of a delegated user interface to facilitate performance of the user interface related task on the delegated computing device, and the delegated user interface is substantially similar to the user interface of the delegating device. The operations further include exchanging data associated with the user interface-related task with the delegated computing device.
  • According to some examples, the present technology also provides an illustrative delegated device. The delegated device includes a user interface configured to receive a user input and a transceiver configured to send and receive data. The delegated device also includes a memory configured to store computer-executable instructions and a processor configured to execute the computer-executable instructions. Execution of the computer-executable instructions causes the delegated device to perform operations including receiving a delegation command from a delegating computing device to establish a delegation session, wherein the delegation command requests establishment of a user interface related task on the delegated device, and wherein the user interface related task was initiated at the delegating computing device. Execution of the computer-executable instructions also causes the delegated device to exchange data associated with the user interface-related task with the delegating computing device.
  • According to some examples, the present technology provides an illustrative method for delegating user interface related tasks. The method includes receiving a request to delegate a user interface related task to a delegated computing device. The method also includes causing the transceiver of the delegating device to send a delegation command to the delegated computing device to establish a delegation session on the delegated computing device. The delegation command requests establishment of a delegated user interface to facilitate performance of the user interface related task on the delegated computing device, and the delegated user interface is substantially similar to the user interface of the delegating device. The method further includes exchanging data associated with the user interface-related task with the delegated computing device.
  • According to some examples, the present technology further provides another illustrative method for delegating user interface related tasks. The method includes receiving, at a delegated computing device, a delegation command from a delegating computing device to establish a delegation session. The delegation command requests establishment of a user interface related task on the delegated computing device, and the user interface related task was initiated at the delegating computing device. The method further includes exchanging, by the delegated computing device, data associated with the user interface-related task with the delegating computing device, and automatically updating, by the delegated computing device, the delegating computing device with actions performed at the delegated computing device corresponding to the user interface related task.
  • The preceding summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict only several embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings.
  • FIG. 1 depicts a device having a delegating manager in accordance with an illustrative embodiment.
  • FIG. 2 depicts a system where user interface-related task delegation may be implemented in accordance with an illustrative embodiment.
  • FIG. 3 depicts a flow diagram of a process for delegating a user interface-related task to a delegated device in accordance with an illustrative embodiment.
  • FIG. 4 depicts a flow diagram of a process for establishing and closing a session between a delegated device and a delegating device in accordance with an illustrative embodiment.
  • FIG. 5 depicts a block diagram of a computer system, which may be used to delegate a user interface-related task in accordance with an illustrative embodiment.
  • DETAILED DESCRIPTION
  • In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, may be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and made part of this disclosure.
  • Described herein are technologies, including illustrative systems and methods for implementing a delegation of a user interface-related task from an initiating device (“the delegating device”) to another device (“the delegated device”). In one embodiment, user interface-related tasks include tasks that require user attendance or attention to be performed successfully. Examples of such user attendance or attention may include viewing a display, viewing a web page, or listening to streamed audio. Additional examples of user attendance or attention include manipulation of a user interface, which by way of example may include scrolling a screen, pausing or playing an audio or video stream, editing a file, selecting objects from a menu, selecting buttons on an interface, speaking to a device, or any other suitable user interface manipulation actions. In one embodiment, delegation of the user interface-related task involves delegation of a user interface from the delegating device to the delegated device. In another embodiment, a delegated device may use a pre-existing user interface to enable performance of the user interface-related task.
  • In a further embodiment, a user interface-related task may be delegated to the delegated device such that no further actions are required by the delegating device for performance of the user interface-related task. An example of such a user interface-related task may include transferring the display/viewing of a video stream from the delegating device to the delegated device. In still a further embodiment, a delegated device may manage a user interface associated with the user interface-related task and after completion of the user interface-related task may send appropriate documents and/or instructions back to the delegating device. For example, the user interface-related task may involve editing a file, and the delegated device may control such editing. After completion of the editing, the delegated device may transfer the edited file back to the delegating device.
  • In still another embodiment, delegation of the user interface-related task may involve delegation of at least a portion of a user interface associated with the user interface-related task such that a portion of the user interface-related task is performed on the delegated device while another portion of the user interface-related task continues to be performed by the delegating device. For example, a smartphone (i.e., the delegating device) may delegate text entry to a personal computer (i.e., the delegated device) having a full keyboard while retaining the display functions and/or the actual file modification functions on the smartphone. According to such an embodiment, the text entry selections from the personal computer may be transferred to the smartphone, which processes the text entry selections and displays the modifications to the file to the user.
  • The delegating and delegated devices may communicate with each other using existing communication technologies including, but not limited to, Bluetooth, Infrared Data Association (“IRDA”), Near Field Communication (“NFC”), Local Area Networks (“LAN”), Wi-Fi, Wide Area Networks (“WAN”), etc. Additionally, the delegating and delegated devices may communicate using a web-based delegation connection server. User interfaces and/or user interface-related tasks may be sent over established delegation sessions between the delegating device and the delegated device. Delegation of a user interface-related task may include delegation of a specific task (e.g., rendering of a video, editing a document, editing or presenting data objects, editing email, opening a new file or webpage, etc.). In addition, delegation of the user interface-related task may include communication to the delegated device of data objects, a customized toolbar or interface for a user interface-related task, and configuration settings for a user interface-related task.
  • Delegating a user interface-related task allows a user to initiate the user interface-related task on the delegating device and continue execution of the user interface-related task on the delegated device. In one embodiment, delegation of a user interface-related task may include delegation of a user interface associated with the user interface-related task. For example, a user may initially access an email on a smart phone, delegate the email user interface to a tablet, and continue reading or responding to the email on the tablet. In another example, a streaming video may initially be accessed using a desktop computer. A video playback user interface and a task of rendering the video may be delegated to a tablet, such that the user may continue to view the video on the tablet. In another example, the user may delegate only a user interface associated with controls for the video playback (i.e., a video playback control user interface) to the tablet. In this way, the user may control playback via the video playback control user interface on the tablet while continuing to watch the video on the desktop computer. In yet another example, a user may receive a text message on a smart phone. The user may delegate a user interface-related task associated with responding to the text message to a laptop, such that the user may respond to the text message using the keyboard of the laptop. This delegation may include the delegation of a user interface usable for responding to the text message.
  • FIG. 1 depicts a device having a delegating manager in accordance with an illustrative embodiment. As depicted in FIG. 1, a device 102 includes a software stack having an application layer 104, a device drivers and services layer 106, an operating system (“OS”) kernel layer 108, and a hardware layer 110.
  • The application layer 104 may include a number of applications that may be run on the device. For example, a user may access applications 112 and 114 on the device 102. The applications may include, but are not limited to, an email application, a video playback application, a music player application, etc.
  • The device drivers and services layer 106 allows access to hardware and service components by the application layer 104. In one embodiment, one such service may be the delegation manager software (“DLGM”) 116. The delegation manager software 116 may receive user interface-related task delegation requests from the application layer 104. The DLGM 116 may be configured to initiate a point-to-point session between the delegating device and the delegated device. The DLGM 116 includes a delegation server 118 and a delegation client 120. When the device 102 is the delegating device in a session, the DLGM 116 may be configured to create a delegation client 120. When the device 102 is the delegated device in a session, the DLGM 116 may be configured to create a delegation server 118. As discussed in greater detail below, the delegation client 120 of the delegating device communicates with the delegation server 118 of the delegated device.
  • In a further embodiment, the device 102 may serve as a delegating device and a delegated device at different times or at the same time. Thus, the DLGM 116 may create a multiple number of delegation clients 120 and a multiple number of delegation servers 118. For example, a device 102 may delegate a user interface-related task associated with application 112 to a first delegated device and a user interface-related task associated with application 114 to a second delegated device. Thus, the device 102 may create two delegation clients 120, such that one is connected to the delegation server 118 of the first delegated device (e.g., the device that is delegated the user interface-related task associated with application 112) and the second delegation client is connected to the delegation server 118 of the second delegated device (e.g., the device that is delegated the user interface-related task associated with application 114). Additionally, the device 102 may also be delegated a user interface-related task from a second delegating device. Therefore, the device 102 may also create a delegation server 118 to receive requests from the second delegating device.
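  • By way of a non-limiting illustration only, the delegation manager arrangement described above might be organized along the lines of the following Python sketch, in which a DLGM tracks any number of delegation clients (for outgoing delegations) and delegation servers (for incoming delegations) at the same time. The class and method names (DelegationManager, open_client, open_server) are hypothetical and are not part of the disclosure.

        # Minimal sketch (not the disclosed implementation) of a delegation
        # manager (DLGM) that can hold several delegation clients and several
        # delegation servers at once, mirroring the description above.
        class DelegationManager:
            def __init__(self, device_id):
                self.device_id = device_id
                self.clients = {}   # session_id -> state for delegations this device initiated
                self.servers = {}   # session_id -> state for delegations this device received

            def open_client(self, session_id, remote_address):
                # Created when this device acts as the delegating device.
                self.clients[session_id] = {"remote": remote_address, "state": "open"}
                return self.clients[session_id]

            def open_server(self, session_id, remote_address):
                # Created when this device acts as the delegated device.
                self.servers[session_id] = {"remote": remote_address, "state": "open"}
                return self.servers[session_id]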
  • The OS kernel layer 108 may serve as an abstraction layer allowing access to hardware components. In one embodiment, the OS kernel layer 108 may be configured to allow the application layer 104 and the device drivers and services layer 106 to run on device 102 with different components without the need to be re-coded. Thus, an application may be programmed once to run on an operating system and deployed on multiple devices 102 that run the same OS kernel 108 but have different hardware components.
  • The hardware layer 110 may include the hardware components of the device 102. These components may include network communication devices, output and input devices such as, but not limited to, monitors, keyboards, touch screens, etc. In one embodiment, the application layer 104 and the device drivers and services layer 106 may access hardware components by interacting with the OS kernel layer 108. Thus, for example, the DLGM 116 may establish a connection with the DLGM of a second device over a network communications device that is part of the hardware layer 110 of the device 102.
  • FIG. 2 depicts a system 200 where user interface-related task delegation may be implemented in accordance with an illustrative embodiment. Delegating device 202A and delegated device 202B may include a delegation manager 206A and a delegation manager 206B, respectively. In one embodiment, an application 204A may request that a user interface-related task be delegated to another device. A delegated device 202B may be selected in one of several ways, including, but not limited to, searching for the delegated device 202B on a web-based delegation connection server, searching previously selected devices, and broadcasting a request over a personal area network. For example, the application 204A may search a web-based delegation connection server for a device to delegate the user interface-related task. If the delegating device 202A and the delegated device 202B have not been previously paired, the devices may authenticate each other, as described below. In another embodiment, the devices 202A and 202B may be required to pair each time a delegation request is initiated. One skilled in the art will understand that a number of security and authentication frameworks may be employed when pairing the devices. The user interface-related task delegation request and the delegated device 202B selection may be sent to the DLGM 206A.
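  • As a minimal sketch of the device-selection options just described (previously paired devices, a web-based delegation connection server, or a broadcast over a personal area network), the following Python fragment shows one hypothetical ordering of those lookups; the helper names and the ordering are illustrative assumptions, not part of the disclosure.

        # Hypothetical ordering of the three delegated-device lookup strategies
        # described above; callers pass in whatever discovery helpers they have.
        def select_delegated_device(paired_devices,
                                    connection_server_lookup=None,
                                    broadcast_discovery=None):
            if paired_devices:                        # 1. reuse a previously paired device
                return paired_devices[0]
            if connection_server_lookup is not None:  # 2. ask a web-based connection server
                found = connection_server_lookup()
                if found is not None:
                    return found
            if broadcast_discovery is not None:       # 3. broadcast over a personal area network
                return broadcast_discovery()
            return None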
  • The DLGM 206A may receive a request from an application 204A to delegate the user interface-related task to a delegated device 202B. In one embodiment, the DLGM 206A may send a delegation command to the DLGM 206B of the delegated device 202B. The delegated device 202B may create a delegation server 208B and send an acknowledgment to the delegating device 202A. The DLGM 206A may create a delegation client 210A and initiate a session with the delegation server 208B of delegation manager 206B of delegated device 202B. Once the session between the devices has been created, the user interface-related task may be sent to the delegated device 202B over the session. The session may be closed by either device.
  • Each device may be set up as a delegating device, a delegated device, or both. In one embodiment, the system may be symmetric in that both the delegating device 202A and the delegated device 202B may be capable of acting as both a delegating and a delegated device. Optionally, one or both of the delegating device 202A and the delegated device 202B may be both a delegating device and a delegated device simultaneously. Continuing the example above, the delegated device 202B may send a user interface-related task delegation request to delegating device 202A. Because the devices are already paired, delegating device 202A may start a delegation server 208A and the delegated device 202B may start a delegation client 210B. Thus, devices 202A and 202B may send and receive user interface-related task delegation requests 212, 214 at the same time.
  • Establishing a Delegated User Interface
  • FIG. 3 depicts a flow diagram of a process for delegating a user interface-related task to a delegated device in accordance with an illustrative embodiment. Additional, fewer, or different operations may be performed, depending on the embodiment. A method 300 may be implemented on a computing device. In one embodiment, the method 300 is encoded on a computer readable medium that contains instructions that, when executed by a computing device, cause the computing device to perform operations of the method 300.
  • In one embodiment of the method 300, as described below, the delegating device receives a user interface-related task delegation request from an application 204A on a delegating device 202A. The delegating device sends a delegation command to the delegated device 202B. The delegated device 202B receives the delegation command, extracts user interface-related task information from the delegation command, and establishes the delegation session based on the delegation command and the extracted user interface-related task information. The delegated device sends a confirmation to the delegating device. Once the connection is set up, the delegating device and the delegated device may exchange data related to the user interface-related task using the connection established between the respective DLGMs on each device.
  • In an operation 302, the delegation manager 206A of the delegating device 202A receives a request to delegate a user interface-related task to a delegated device 202B. In one embodiment, a user may select a user interface component to indicate that an application 204A on the delegating device 202A may send a request to the DLGM 206A. For example, the application may present a user interface element to a user via a graphical display. The user interface element may include, but is not limited to, a button, a link, a key combination, or a gesture (e.g., swiping diagonally on a touch screen or right-clicking on an icon on a desktop), etc. If the user triggers the user interface element by, for example, clicking on a button, pressing the key combination, or gesturing in a pre-defined manner, the application may cause a list of devices to be displayed that may act as delegated devices. The user may select a device by, for example, clicking on an item in the list, where the item represents a device that may act as a delegated device. Upon receiving the selection by the user, the application 204A may send a request to the DLGM 206A to create a user interface-related task delegation command with an indication of the device selected to act as the delegated device 202B. The indication of the device may include, but is not limited to, an IP address, a MAC address, a unique identifier, etc.
  • In one embodiment, the delegating device may delegate multiple user interface-related tasks at the same time. For example, multiple user interface-related tasks may be delegated to a single device, such as a personal computer. In another embodiment, a user interface-related task may be delegated to multiple delegated devices at the same time. For example, display viewing of streamed audio and/or video may be delegated to multiple display devices at the same time.
  • In another embodiment, the application may allow the user to select a specific delegated device 202B using a mouse to click on an object including, but not limited to, a text document, an image, an audio file, etc. A contextual menu may be displayed in response to the click. In one embodiment, the contextual menu may list devices that may act as delegated devices for the selected object. The user may select a delegated device by clicking on an option in the contextual menu. In response to the user selection, the selection and the object may be sent to the DLGM 206A.
  • In yet another example, the application may allow the user to select a specific delegated device 202B by swiping in a particular direction. For example, if a user is viewing a video, the user may swipe to the right over the video. The application may determine based on, for example, the direction of the swipe, that the user wishes to delegate the video to a television that has been previously paired with the delegating device 202A. If, however, the user swipes to the left over the video, the application may send a request to the DLGM 206A to delegate the video to a desktop computer that has been previously paired with the delegating device. The application may indicate the available devices by, for example, displaying an icon for the available delegated devices. The icons may be laid out so that a user may swipe towards the icon to indicate a selection of the delegated device represented by the icon.
  • In a further embodiment, the selection of delegated devices may be limited based on the type of delegation or the permissions granted to an application or user. Accordingly, only a subset of all otherwise available delegated devices may be presented to the user for selection based on the type of delegation or the permissions granted to the application or user. For example, if a user is viewing a video on a mobile device, the application may have permission to delegate the video to a television screen. However, the application may be prevented from sending the same video to a desktop computer, based on, for example, security permissions associated with the video content.
  • The DLGM 206A may receive the request from the application and determine if the delegating device 202A and the selected delegated device 202B are already paired. For example, the DLGM 206A may maintain a list of devices with which the delegating device 202A has been authenticated and paired (i.e., devices with which the delegating device 202A has a previously established relationship). If the DLGM 206A determines that the devices are not paired by, for example, not finding the delegated device in a relationship table, the delegating device 202A and the delegated device 202B may be paired and authenticated in the manner described below. Once the delegating device 202A and the delegated device 202B have established a relationship by authenticating and pairing with each other, the DLGM 206A may send a delegation command to the DLGM 206B on delegated device 202B.
  • In an operation 304, the DLGM 206A sends a delegation command to the DLGM 206B. In one embodiment, the DLGM 206A may create a delegation command based on the delegation request received in operation 302. For example, the delegation command may include information related to the user interface-related task to be delegated, such as, but not limited to, an application identifier, the type of user interface-related task requested to be delegated, data associated with the delegation request, etc. The delegation command may be sent using network communication components such as wired or wireless transceivers.
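  • The delegation command of operation 304 might, for illustration, be serialized as a small structured message carrying the application identifier, the requested task type, and any associated data. The following sketch assumes a JSON encoding and invented field names; the actual encoding is not limited by this example.

        import json

        # Hypothetical delegation-command payload for operation 304; field names
        # are illustrative only.
        def build_delegation_command(app_id, task_type, payload, target_device_id):
            return json.dumps({
                "app_id": app_id,            # identifies the delegating application
                "task_type": task_type,      # e.g., "edit", "show", or "get_keyboard"
                "payload": payload,          # data associated with the delegation request
                "target": target_device_id,  # IP address, MAC address, or other identifier
            })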
  • In an operation 306, the delegated device 202B receives the delegation command. In one embodiment, the DLGM 206B may determine that the delegation command has been received from a delegating device 202A with which the delegated device 202B has a relationship. If a relationship exists, the DLGM 206B may establish a delegation session. If, however, no relationship exists, the DLGM 206B can, for example, reject the delegation command or initiate a pairing and authentication request, as described below.
  • In an operation 308, the DLGM 206B extracts user interface-related task information from the delegation command. In one embodiment, the user interface-related task information may include information such as, but not limited to, the type of delegation task (e.g., open an independent application, display content on the delegated device, edit content, or receive user input, etc.), configuration settings for the user interface-related task or for a user interface associated with the user interface-related task (e.g., a customized toolbar or other options/settings), etc. In another embodiment, the user interface-related task information may include an XML file with layout information for a user interface associated with the user interface-related task. The delegated device 202B may determine whether it is able to fulfill the delegation command. For example, if the user interface-related task to be delegated requires that a keyboard be connected to the delegated device 202B, the request may be rejected by the delegated device 202B if no keyboard is connected. If the delegated device 202B may perform the user interface-related task, then the DLGM 206B may initiate a delegation session. The delegating device 202A may transfer data to the delegated device 202B associated with the user interface-related task. The transferred data may include, but is not limited to, data to re-create a user interface associated with the user interface-related task on the delegated device 202B and content for display on the delegated device 202B.
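  • A minimal sketch of the capability check mentioned above (for example, rejecting a delegation that requires a keyboard when none is connected) could look like the following; the "requires" field and the capability strings are assumptions made for illustration.

        # Hypothetical capability check performed by the delegated device's DLGM
        # before it accepts a delegation command in operation 308.
        def can_fulfill(command, local_capabilities):
            required = set(command.get("requires", []))   # e.g., {"keyboard"}
            return required.issubset(local_capabilities)

        # Example: a command that needs a keyboard is rejected on a device that
        # only reports a display and a touch screen.
        assert can_fulfill({"requires": ["keyboard"]}, {"display", "touch"}) is False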
  • In an operation 310, the DLGM 206B establishes a delegation session based on the delegation command and the extracted user interface-related task information. In one embodiment, the DLGM 206B may create an instance of the delegation server 208B. Once the delegation server 208B has been created, the DLGM 206B may send an acknowledgement to the DLGM 206A on the delegating device 202A. The delegation server 208B may receive requests from the delegating device 202A that sent the delegation command to the DLGM 206B. Thus, if more than one application delegates a user interface-related task to the delegated device 202B, then multiple respective delegation servers 208B may be created to handle respective requests 214 related to each delegation session.
  • In another embodiment, the DLGM 206B may contain a single instance of a delegation server 208B that may handle requests 214 from multiple delegation sessions. For example, the delegation server 208B may receive a request from a delegation client 210A. The delegation server 208B may determine if the DLGM 206B accepted a delegation command from the DLGM 206A that created the delegation client 210A. If the delegation command was accepted, then the delegation server 208B may perform tasks associated with requests sent by the delegation client 210A. If, however, the request was received from a delegation client associated with a refused delegation command or a delegation command that has been closed or canceled, then the request may be ignored or the delegation client may be informed that the request failed.
  • In an operation 312, the DLGM 206B sends a confirmation to the DLGM 206A indicating that the delegation session has been established. In one embodiment, the DLGM 206B may determine that the delegated device 202B may handle the delegated user interface-related task. Additionally, the DLGM 206B may indicate that the delegated device 202B is ready to receive requests 214 associated with the delegated user interface-related task. The confirmation may also include configuration information such as, but not limited to, the address of the delegation server 208B, commands available on the delegation server, the version of the delegation server 208B, etc.
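  • The confirmation of operation 312 might carry the kinds of configuration information listed above. The sketch below shows one hypothetical shape for that message; none of the field names are mandated by the disclosure.

        # Hypothetical confirmation returned to the delegating device once the
        # delegation server is ready to receive requests 214.
        def build_confirmation(server_address, supported_commands, server_version):
            return {
                "status": "ready",
                "server_address": server_address,      # address of the delegation server 208B
                "commands": list(supported_commands),  # commands available on the server
                "version": server_version,             # lets the client choose a request format
            }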
  • In an operation 314, the DLGM 206A receives the confirmation from the DLGM 206B. In one embodiment, the DLGM 206A may create an instance of a delegation client 210A based on the confirmation received from the DLGM 206B. For example, the delegation client 210A may be created and configured to send all requests 214 to a particular address or to more than one address. Additionally, the delegation client 210A may be created and configured to send requests 214 in a particular format, based on, for example, the version of the delegation server 208B or the available commands on the server 208B. The created delegation client 210A may send and receive data associated with the delegated user interface to and from the delegation server 208B on the delegated device 202B. Each delegated user interface-related task may have a delegation client 210A associated with it. Thus, each delegation client 210A may be responsible for only one delegated user interface-related task.
  • In another embodiment, a single delegation client 210A may be created by the DLGM 206A. Data associated with each delegated user interface-related task may be handled by a single delegation client 210A. Thus, the delegation client 210A may determine which delegation server 208B to send a request 214 to based on, for example, the application with which the request 214 is associated. Once the delegation client 210A at the delegating device 202A and the delegation server 208B on the delegated device 202B are created, all subsequent transfer of data associated with the user interface-related task may be between the delegation client 210A and the delegation server 208B.
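  • Where a single delegation client 210A serves several delegation sessions, it may route each request 214 to the delegation server associated with the originating application, as described above. The following sketch shows one hypothetical routing table; it is illustrative only.

        # Hypothetical single delegation client that maps each application to the
        # delegation server handling its delegated task.
        class DelegationClient:
            def __init__(self):
                self.routes = {}   # app_id -> delegation server address

            def register(self, app_id, server_address):
                self.routes[app_id] = server_address

            def route(self, app_id):
                # Select the delegation server for the application that issued the
                # request; None means no session exists for that application.
                return self.routes.get(app_id)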
  • In operations 316A and 316B, data associated with the delegated user interface-related task is exchanged between the delegation client 210A and the delegation server 208B. In one embodiment, the data associated with the delegated user interface-related task may include, but is not limited to, commands to open an application, data to be viewed, user interface elements, user inputs, etc. Additionally, data associated with content may be sent to the delegation server 208B and received by the delegation client 210A. Thus, content may be sent to a delegated device, the delegated device may modify the content, and the modified content may be sent back to the delegating device 202A for storage or further processing. Data received by the delegation server 208B may be sent to an application on the delegated device 202B and data received by the delegation client 210A may be sent to an application on the delegating device 202A. The type of data exchanged may depend on the type of user interface-related task that has been delegated to the delegated device 202B. Example types of delegated user interface-related tasks are described in additional detail below.
  • Operation of the user interface-related task during the delegation session depends on the type of user interface-related task that is delegated. For example, in one embodiment, for a user interface-related task that involves editing a document or object, the original document or object is transferred to the delegated device at the beginning of the delegation session, edited at the delegated device, and the edited document or object is returned to the delegating device at the end of the delegation session. In another embodiment, where a user interface is also delegated, key codes associated with user interface inputs received at the user interface on the delegated device are transferred in real-time back to the delegating device which may effectuate the inputs. For example, if a user interface on the delegated device receives a command to add a character to a word document, this command is transferred to the delegating device which performs the addition.
  • Once established, the session between the delegation client 210A and the delegation server 208B may be later closed by either the delegating device 202A or the delegated device 202B. In one embodiment, a user interface associated with the user interface-related task on the application 204A (either at the delegating device 202A or the delegated device 202B) may include a user interface element which, when triggered, may send a command to the DLGM 206A, which may close the session. For example, the user interface may include an element such as, but not limited to, a button, a link, a key combination, a gesture, etc. The user can, for example, trigger the user interface element by clicking on a button or carrying out the pre-defined gesture. In one embodiment, the DLGM 206A, upon receiving the close command, may send a request to the DLGM 206B to close the session. The DLGM 206B may close the delegation server 208B and send an acknowledgement to the DLGM 206A. Upon receipt of the acknowledgement, DLGM 206A may close the delegation client 210A. In another example, the DLGM 206B may initiate the close session request by sending a request to the DLGM 206A. The DLGM 206A may close the delegation client 210A and respond with an acknowledgement. In one embodiment, updated documents and/or objects may also be communicated between the DLGM 206A and the DLGM 206B. The DLGM 206B may receive the acknowledgement and close the delegation server 208B.
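  • The close handshake described above (close request, acknowledgement, then teardown of the delegation server and the delegation client) could be expressed as simple message handlers, as in the sketch below, which reuses the hypothetical DelegationManager from the earlier sketch; the message field names are assumptions.

        # Hypothetical message handlers for closing a delegation session. The
        # dlgm arguments are instances of the DelegationManager sketched earlier.
        def request_close(session_id):
            return {"type": "close_session", "session": session_id}

        def handle_close(delegated_dlgm, message):
            sid = message["session"]
            delegated_dlgm.servers.pop(sid, None)     # delegated side closes its delegation server
            return {"type": "close_ack", "session": sid}

        def handle_close_ack(delegating_dlgm, message):
            # Delegating side closes its delegation client once acknowledged.
            delegating_dlgm.clients.pop(message["session"], None)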
  • FIG. 4 depicts a flow diagram of a process for establishing and closing a session between the delegating device 202A and the delegated device 202B in accordance with an illustrative embodiment. Additional, fewer, or different operations may be performed, depending on the embodiment. A method 400 may be implemented on a computing device. In one embodiment, the method 400 is encoded on a computer readable medium that contains instructions that, when executed by a computing device, cause the computing device to perform operations of the method 400.
  • In one embodiment of the method 400, as described below, the application 204A operating on the delegating device 202A sends a delegation request 402 to the DLGM 206A requesting delegation of a user interface-related task. The DLGM 206A transfers the request 404 to the DLGM 206B of the delegated device 202B. The DLGM 206B of the delegated device 202B sends an acknowledgement 406 to the DLGM 206A of the delegating device 202A. The DLGM 206B creates 408 a delegation server 208B. The delegation server 208B opens an application 410 to perform the delegated user interface-related task. The delegating DLGM 206A creates 412 a delegation client 210A. The delegation client 210A sends an acknowledgement 414 to the delegation server 208B acknowledging establishment of a delegation session for the user interface-related task.
  • During the delegation session, delegated session data is transferred 416, 418 between the application 204A and the application 204B via communication between the delegation client 210A and the delegation server 208B. For example, data output from a delegated application 204B is sent to the delegation server 208B, the delegation server 208B sends the output to the delegation client 210A, and the delegation client 210A sends the output to the delegating application 204A. The delegation session thereby allows the delegated user interface-related task to be performed on the delegated device 202B.
  • Upon receiving an instruction to close the delegation session, the delegating application 204A sends a close command 420 to the DLGM 206A. The DLGM 206A sends a close command 422 to the DLGM 206B. The DLGM 206B sends a close command 424 to the delegation server 208B. The delegation server 208B sends a close command 428 to the delegated application 204B. The delegation server 208B sends an acknowledgement 430 to the DLGM 206B and closes. The DLGM 206B sends an acknowledgement 432 to the DLGM 206A. The DLGM 206A sends a close command 434 to the delegation client 210A.
  • Types of User Interface-Related Tasks for Delegation
  • The user interface-related task delegation method, discussed above, establishes a connection between a delegation client 210A on the delegating device 202A and a delegation server 208B on the delegated device 202B. The established connection is used to exchange data associated with the delegated task. Based on the delegated task, different data may be sent. Furthermore, the life of the connection may be based on the type of the delegated user interface-related task. The type of delegated user interface-related task may include, but is not limited to, an asynchronous open task, a show task, an edit task, a get keyboard task, a get pointer task, a get user interface task, etc. Each of these tasks is detailed below.
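  • Purely for illustration, the task types listed above may be captured as an enumeration such as the one below; the member names are hypothetical and not prescribed by the disclosure.

        from enum import Enum

        # Hypothetical enumeration of the delegated user interface-related task
        # types described in this section.
        class DelegatedTaskType(Enum):
            ASYNC_OPEN = "async_open"      # open an independent application, then close the session
            SHOW = "show"                  # display content until the view is closed
            EDIT = "edit"                  # edit content and return it to the delegating device
            GET_KEYBOARD = "get_keyboard"  # stream keyboard input back to the delegating device
            GET_POINTER = "get_pointer"    # stream pointer input back to the delegating device
            GET_UI = "get_ui"              # render a delegated user interface and relay interactions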
  • The asynchronous task may be used to open an independent asynchronous application on the delegated device 202B to handle an object received from the delegating device 202A. In one embodiment, the object may be, but is not limited to, a uniform resource locator (URL), a document, an image, etc. The delegation client 210A may send a request 214 to the delegation server 208B, the request including the object to be opened on the delegated device 202B. The delegation server 208B may receive the request 214 and open an independent asynchronous application to handle the object on the delegated device 202B. For example, the delegation client 210A may send a URL to the delegation server 208B. Upon receipt of the URL, the delegation server 208B may open an application 204B on the delegated device 202B and send the object to the opened application 204B. For example, the application 204B may be a web browser. Upon receiving the URL from the delegation server 208B, the browser may retrieve the contents of the URL and display the contents on the delegated device 202B. The asynchronous task may close the established connection after receiving the initial request. In another example, the delegation server 208B may send information to the delegation client 210A related to the asynchronous task such as, but not limited to, the application process identifier, a browser window identifier, etc., before closing the established session.
  • The show task may be used to display content from the delegating device 202A on the delegated device 202B. In one embodiment, a user of the application 204A may select an object, such as, but not limited to, a picture, a document, an audio file, etc., on the delegating device 202A. For example, a picture may be selected and sent to the delegation client 210A, which sends the picture to the delegation server 208B. The delegation server 208B may open a view of the object on the delegated device 202B. The delegation client 210A may send a new picture and the delegation server 208B may display the new object in the view that was opened to display the first picture. The show task keeps the established connection alive until the view is closed.
  • The edit task may be used to open received content for editing on the delegated device 202B. In one embodiment, a user of the application 204A may select an object, such as, but not limited to, a picture, a document, an audio file, etc., on the delegating device 202A. For example, a document may be selected and sent to the delegation client 210A, which sends the document to the delegation server 208B. The delegation server 208B may open an application that may be used to edit the document on the delegated device 202B. The edited document may be sent back to the delegating device 202A for storage. For example, each time a user indicates that the document should be saved, the updated document may be automatically sent (without the need for a specific command from the user to perform such a transmission) back to the delegating device 202A over the established connection and the delegating device 202A may update the local copy of the document. When the document is closed, the delegation server 208B may send the document to the delegation client 210A. The edit task keeps the established connection alive until the document that is being edited is closed.
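  • A minimal sketch of the automatic return of an edited document on each save, as described above, is shown below; session_send stands in for whatever transport the established delegation session provides and is an assumed callback, not a disclosed API.

        # Hypothetical save hook on the delegated device: every save pushes the
        # updated document back over the delegation session without any extra
        # user action being required.
        def on_save(session_send, document_id, document_bytes):
            session_send({
                "type": "document_update",
                "document_id": document_id,
                "content": document_bytes,
            })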
  • The get keyboard task may be used to enter content using input devices at the delegated device 202B. In one embodiment, a user may select an object to be edited on the delegating device 202A. The object may include, but is not limited to, a text-field, a word document, a picture, etc. For example, a user may select a text field to enter content. The user may select a user interface element, indicating that the input devices at the delegated device 202B should be used. An identifier for the selected object may be sent to the delegation client 210A, which sends the identifier to the delegation server 208B. The delegation server 208B may begin listening for inputs of the type selected by the application 204A. For example, the application may select an input type of a keyboard. The delegation server 208B may send each input at the delegated device 202B to the delegation client 210A. The delegation client 210A may send the received input to the application 204A to update the selected object. The established connection may remain open until the user indicates that no more input is needed. In another example, the connection may be terminated when the input field is no longer editable.
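  • The forwarding of keyboard input from the delegated device back to the delegating device may be sketched as follows; key_events and session_send are assumed placeholders for the local input source and the session transport.

        # Hypothetical get-keyboard relay: each captured input on the delegated
        # device is forwarded to the delegation client, which applies it to the
        # selected object on the delegating device.
        def forward_key_events(key_events, session_send, target_object_id):
            for event in key_events:
                session_send({
                    "type": "key_input",
                    "object_id": target_object_id,
                    "key": event,
                })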
  • The get pointer task may be used to receive a pointer input from the delegated device 202B. The delegated device 202B may include a pointer device such as, but not limited to, a touch pad, a mouse, directional keys, etc. In one embodiment, a user may select a user interface element, indicating that the pointer devices at the delegated device 202B should be used. The delegation server 208B may begin listening for inputs of the type selected by the application 204A. For example, the application may select an input type of a touchpad. The delegation server 208B may send each touchpad input at the delegated device 202B to the delegation client 210A. The delegation client 210A may send the received input to the application 204A to update the location of the pointer in the application 204A. The established connection may remain open until the user indicates that no more input is needed.
  • The get user interface task may be used to display portions of a user interface associated with the application 204A on the delegated device 202B. The application 204A may send a definition file to the delegation client 210A. The delegation client 210A may send a request with the definition file to the delegation server 208B. The delegation server 208B may open the application 204B to display a user interface based on the received definition file. A user may interact with the user interface displayed on the delegated device 202B using the delegated device 202B. Each time the user interacts with the displayed user interface, the delegation server 208B may send the input to the delegation client 210A. The delegation client 210A may send the received input to the application 204A. The established connection between the delegation client 210A and the delegation server 208B may remain open until the application 204B is closed or until the delegated user interface is closed.
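  • Where the definition file is an XML layout (as mentioned earlier for user interface-related task information), handling a get user interface task might resemble the sketch below. The XML schema, widget attributes, and session_send callback are invented for illustration and are not prescribed by the disclosure.

        import xml.etree.ElementTree as ET

        # Hypothetical handling of a get user interface task on the delegated
        # device: parse a small XML definition, then report each interaction
        # back over the delegation session.
        def render_delegated_ui(definition_xml, session_send):
            root = ET.fromstring(definition_xml)
            widgets = [(w.get("id"), w.get("type")) for w in root.findall("widget")]

            def on_interaction(widget_id, value):
                session_send({"type": "ui_input", "widget": widget_id, "value": value})

            return widgets, on_interaction

        # Example definition: '<ui><widget id="reply" type="button"/></ui>'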
  • Pairing and Authentication of Devices
  • The delegation of user interface-related tasks occurs between paired devices. In one embodiment, in order for devices to be paired, the delegated device 202B should be authenticated and trusted by the user of the delegating device 202A. The delegated device 202B may be authenticated at various times. For example, the delegated device may be authenticated by adding the delegated device 202B to the list of paired devices. The delegated device 202B may also be authenticated the first time a delegation request is made or every time a delegation request is made. In one embodiment, the delegating device 202A may send a request for a passphrase to the delegated device 202B. A passphrase may be entered on the delegated device 202B and sent to the delegating device 202A. The delegating device 202A may validate the passphrase and, if the passphrase is valid, the two devices may establish a connection to send and receive user interface tasks. Additional authentication mechanisms may also be used as known to those of skill in the art. For example, a public key/private key pair may be used to authenticate the devices or a central server may be used to verify the identity of the other device. In addition, a Bluetooth pairing mechanism may also be used. It will be understood by one skilled in the art that multiple authentication and pairing mechanisms may be employed to establish trust between the delegating device 202A and the delegated device 202B.
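  • A minimal sketch of the passphrase check described above is given below; a constant-time comparison is used as a reasonable precaution, and the stronger mechanisms mentioned (key pairs, a central server, Bluetooth pairing) may be substituted. The function and argument names are assumptions.

        import hmac

        # Hypothetical passphrase validation on the delegating device; expected
        # and received must both be str (or both bytes) for hmac.compare_digest.
        def validate_passphrase(expected, received, paired_devices, device_id):
            if hmac.compare_digest(expected, received):
                paired_devices.add(device_id)   # remember the pairing for later delegations
                return True
            return False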
  • FIG. 5 is a block diagram of a computer system, which may be used to delegate a user interface-related task, in accordance with an illustrative implementation. The computer system or computing device 500 may be used to implement the device 102, the delegating device 202A, and the delegated device 202B. The computing system 500 includes a bus 505 or other communication component for communicating information and a processor 510 or processing circuit coupled to the bus 505 for processing information. The computing system 500 may also include one or more processors 510 or processing circuits coupled to the bus for processing information. The computing system 500 also includes main memory 515, such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 505 for storing information, and instructions to be executed by the processor 510. Main memory 515 may also be used for storing position information, temporary variables, or other intermediate information during execution of instructions by the processor 510. The computing system 500 may further include a read-only memory (ROM) 520 or other static storage device coupled to the bus 505 and configured to store static information and instructions for the processor 510. A storage device 525, such as a solid state device, magnetic disk or optical disk, is coupled to the bus 505 for persistently storing information and instructions.
  • The computing system 500 may be coupled via the bus 505 to a display 535, such as a liquid crystal display, or active matrix display, for displaying information to a user. An input device 530, such as a keyboard including alphanumeric and other keys, may be coupled to the bus 505 and configured to communicate information and command selections to the processor 510. In another implementation, the input device 530 has a touch screen display 535. The input device 530 may include a cursor control, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 510 and for controlling cursor movement on the display 535.
• According to various implementations, the processes described herein may be implemented by the computing system 500 in response to the processor 510 executing an arrangement of instructions contained in main memory 515. Such instructions may be read into main memory 515 from another computer-readable medium, such as the storage device 525. Execution of the arrangement of instructions contained in main memory 515 causes the computing system 500 to perform the illustrative processes described herein. One or more processors in a multi-processing arrangement may also be employed to execute the instructions contained in main memory 515. In alternative implementations, hard-wired circuitry may be used in place of or in combination with software instructions to effect illustrative implementations. Thus, implementations are not limited to any specific combination of hardware circuitry and software.
  • Although an example computing system has been described in FIG. 5, implementations described in this specification may be implemented in other types of digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • User Interface-Related Task Delegation Examples
  • An example scenario of delegating a user interface-related task is as follows: A user receives an email on her smartphone. She opens a mail application on the smartphone and reads the email. She sees that there is a document attached to the email. Because the user is near her desktop computer, she delegates editing the document to her desktop computer. The document is opened using a compatible application on the desktop computer, where the user may read, edit, and save the document. The user may then determine that she wants to respond to the email. She may click on a button on her smartphone, thereby delegating the reply screen to her desktop. In this way, she may compose an email, attach the edited document, and send the email via a user interface on the desktop.
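• The first scenario can be sketched as a delegation command, an acknowledgement, and a data exchange for an "edit" task. The following Python fragment is a minimal illustration under assumptions: the field names, the JSON encoding, and the in-process hand-off standing in for a network transport are hypothetical and are not taken from the disclosure.

```python
# Hypothetical message shapes for delegating editing of an email attachment
# from a smartphone (delegating device) to a desktop (delegated device).
import json
from dataclasses import asdict, dataclass


@dataclass
class DelegationCommand:
    task: str            # e.g. "edit", "show", "get_keyboard" (assumed values)
    application: str     # application the request originates from
    ui_portion: str      # portion of the delegating UI to replicate
    payload_name: str    # name of the content that will follow


def delegate_edit(attachment_name: str, attachment_bytes: bytes) -> bytes:
    command = DelegationCommand(
        task="edit",
        application="mail",
        ui_portion="document_editor",
        payload_name=attachment_name,
    )
    # 1. Send the delegation command (serialized here as JSON).
    wire_command = json.dumps(asdict(command)).encode()

    # 2. The delegated device inspects the command and acknowledges readiness.
    received = json.loads(wire_command)
    ack = {"status": "ready", "task": received["task"]}

    # 3. On acknowledgement, transfer the document, let the user edit it on
    #    the delegated device, and return the modified content.
    if ack["status"] == "ready":
        return attachment_bytes + b"\n-- edited on desktop --"
    return attachment_bytes


if __name__ == "__main__":
    print(delegate_edit("report.txt", b"quarterly numbers"))
```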
• A second example scenario of delegating a user interface-related task is as follows: A user receives a text message on her smartphone. The text message contains a link to a web page. The user taps the link to the web page. The web page opens in the smartphone browser. Because the web page is not easy to read, the user clicks on a button to send the link to a delegated device, such as her tablet. The user then accesses her tablet and sees that the web page indicated by the URL is displayed in a web browser on the tablet. The user decides that she wants to respond to the text message. She returns to her smartphone and swipes on the text field to delegate the tablet to act as a keyboard for entry of content into the text field. The tablet opens a comfortable touch keyboard across its entire screen, and the user may type the reply. The message may be viewed on the smartphone screen while the user is typing. When the user is finished, she may press send on the tablet, the smartphone sends the message, and the keyboard on the tablet is closed.
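• The keyboard delegation in the second scenario amounts to streaming input events from the delegated device back to the delegating device, which retains ownership of the text field and sends the message itself. The sketch below is illustrative only; the event names and the queue standing in for the wireless link are assumptions.

```python
# Minimal sketch of a "get keyboard" delegation: the tablet produces key
# events, the smartphone assembles and sends the message.
from queue import Queue


def tablet_keyboard(events: Queue, reply_text: str) -> None:
    """Simulate the delegated keyboard: one event per character, then 'send'."""
    for ch in reply_text:
        events.put({"type": "key", "char": ch})
    events.put({"type": "send"})


def smartphone_text_field(events: Queue) -> str:
    """The delegating device mirrors the text and performs the actual send."""
    buffer = []
    while True:
        event = events.get()
        if event["type"] == "key":
            buffer.append(event["char"])   # text is visible on the phone screen
        elif event["type"] == "send":
            break                          # phone sends the message; tablet keyboard closes
    return "".join(buffer)


if __name__ == "__main__":
    q: Queue = Queue()
    tablet_keyboard(q, "On my way!")
    print(smartphone_text_field(q))  # -> On my way!
```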
• A third example scenario of delegating a user interface-related task is as follows: A user is working on her desktop when she comes across a video clip on a website. She wants to show the video clip to her family members, so she right-clicks on an icon for the video clip and is prompted by the desktop to select one of a plurality of possible delegation devices. The user selects a previously-paired television, which prompts delegation of playback of the video clip to the television in the living room, and she calls her kids to see the video on the television. The video clip is quite long, so one of her children decides to delegate the open video clip to his music player, which may also play videos, and watch the remaining portion of the video on the music player. In the meantime, the television may continue to play the video as well.
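• A minimal sketch of the third scenario follows: the delegating device chooses one of its previously paired devices and issues a playback delegation that carries the clip's URL and a start position, so the open clip can later be re-delegated part-way through. The registry, device names, and command fields below are assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical selection of a paired device and delegation of video playback,
# including re-delegation of an already-open clip at its current position.
from dataclasses import dataclass, field


@dataclass
class PlaybackDelegation:
    video_url: str
    start_seconds: float = 0.0


@dataclass
class DeviceRegistry:
    paired: list[str] = field(default_factory=list)

    def choose(self, name: str) -> str:
        if name not in self.paired:
            raise ValueError(f"{name!r} is not a paired device")
        return name


def delegate_playback(registry: DeviceRegistry, target: str,
                      delegation: PlaybackDelegation) -> str:
    device = registry.choose(target)
    return (f"{device}: playing {delegation.video_url} "
            f"from {delegation.start_seconds:.0f}s")


if __name__ == "__main__":
    registry = DeviceRegistry(paired=["living-room-tv", "music-player"])
    clip = PlaybackDelegation("https://example.com/clip")
    print(delegate_playback(registry, "living-room-tv", clip))
    # Later, the open clip is re-delegated at its current position while the
    # television may keep playing as well.
    print(delegate_playback(registry, "music-player",
                            PlaybackDelegation(clip.video_url, start_seconds=95)))
```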
• Implementations described in this specification may be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. The implementations described in this specification may be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on one or more computer storage media for execution by, or to control the operation of, data processing apparatus. Alternatively or in addition, the program instructions may be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. A computer storage medium may be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium may be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium may also be, or be included in, one or more separate components or media (e.g., multiple CDs, disks, or other storage devices). Accordingly, the computer storage medium is both tangible and non-transitory.
• The term “data processing apparatus” or “computing device” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations of the foregoing. The apparatus may include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). The apparatus may also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them. The apparatus and execution environment may realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
• A computer program (also known as a program, software, software application, script, or code) may be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program may be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program may be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
• Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer may be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
  • While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular implementations of particular inventions. Certain features described in this specification in the context of separate implementations may also be implemented in combination in a single implementation. Conversely, various features described in the context of a single implementation may also be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may generally be integrated in a single software product or packaged into multiple software products.
  • Thus, particular implementations of the invention have been described. Other implementations are within the scope of the following claims. In some cases, the actions recited in the claims may be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.
• One or more flow diagrams may have been used herein. The use of flow diagrams is not meant to be limiting with respect to the order of operations performed. The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely illustrative, and that in fact many other architectures may be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality may be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated may also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated may also be viewed as being “operably couplable”, to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
• With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art may translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
• It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation, no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.,” is used, in general, such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general, such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”
• The foregoing description of illustrative embodiments has been presented for purposes of illustration and of description. It is not intended to be exhaustive or limiting with respect to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the disclosed embodiments. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.

Claims (22)

What is claimed is:
1. A delegating device, comprising:
a user interface;
a transceiver; and
a processor communicatively coupled to the user interface and the transceiver, wherein the processor is configured to:
obtain a request, via the transceiver from an application, to delegate a user interface related task to a delegated device;
responsive to the obtained request, send, via the transceiver, a delegation command to perform the user interface related task to the delegated device, wherein the delegation command identifies at least a portion of the user interface of the delegating device to be established, as a delegated user interface, on a user interface of the delegated device, and wherein the at least the portion of the user interface is determined based on the application; and
responsive to an acknowledgement, from the delegated device, that the delegated device is ready to perform the user interface related task, exchange, via the transceiver, data, associated with the user interface related task, with the delegated device, wherein the exchange of the data includes at least transfer of the data to the delegated device to create the delegated user interface on the delegated device.
2. The delegating device of claim 1, wherein the exchange of the data enables the processor to automatically recognize actions, associated with the user interface related task, that are performed at the delegated device.
3. The delegating device of claim 1, wherein the delegating device is connected to the delegated device by a personal area network or a local area network.
4. The delegating device of claim 1, wherein the processor is further configured to:
prior to the delegation command being sent to the delegated device, determine whether the delegating device and the delegated device are paired; and
responsive to a determination that the delegating device and the delegated device are unpaired, authenticate the delegated device.
5. The delegating device of claim 4, wherein the processor is configured to authenticate the delegated device by addition of the delegated device in a list of paired devices.
6. The delegating device of claim 1, wherein to exchange the data, the processor is configured to send at least one of commands to open the application, data to be viewed, and user inputs.
7. The delegating device of claim 1, wherein the processor is further configured to:
responsive to creation of the delegated user interface on the delegated device, send, via the transceiver, content to the delegated device, and obtain, via the transceiver, modified content from the delegated device.
8. The delegating device of claim 1, wherein the at least the portion of the user interface of the delegating device is a first portion, wherein the delegated device is a first delegated device, and wherein the processor is further configured to:
send, via the transceiver, another delegation command to perform another user interface related task to a second delegated device, wherein the other delegation command identifies a second portion of the user interface of the delegating device to be established, as another delegated user interface, on a user interface of the second delegated device, and wherein the second portion of the user interface is determined based on the application; and
responsive to an acknowledgement, from the second delegated device, that the second delegated device is ready to perform the other user interface related task, exchange, via the transceiver, another data, associated with the other user interface related task, with the second delegated device, wherein the exchange of the other data includes at least transfer of the other data to the second delegated device to create the other delegated user interface on the second delegated device.
9. The delegating device of claim 1, wherein the processor is further configured to:
obtain, via the transceiver, another delegation command from another device; and
responsive to the other delegation command being obtained, establish another delegated user interface at the delegated device, wherein the other delegated user interface is a replica of a user interface of the other device.
10. A delegation method, comprising:
receiving, by a delegating device, a request to delegate a user interface related task;
responsive to receipt of the request, selecting, by the delegating device, a delegated device to which the user interface related task is to be delegated;
sending, by the delegating device, a delegation command to perform the user interface related task to the selected delegated device, wherein the delegation command identifies at least a portion of a user interface of the delegating device to be established, as a delegated user interface, on a user interface of the delegated device, and wherein the at least the portion of the user interface is determined based on an application that runs on the delegating device; and
responsive to an acknowledgement, from the delegated device, that the delegated device is ready to perform the user interface related task, exchanging, by the delegating device, data, associated with the user interface related task, with the delegated device, wherein the exchange of the data includes at least transfer of the data to the delegated device to create the delegated user interface on the delegated device.
11. The delegation method of claim 10, wherein selecting the delegated device comprises selecting the delegated device by one of searching a web-based delegation connection server, searching previously delegated devices, and broadcasting a request over a personal area network.
12. The delegation method of claim 10, wherein selecting the delegated device comprises selecting the delegated device based on at least one of a type of the user interface related task, and security permissions associated with the application.
13. The delegation method of claim 10, wherein selecting the delegated device comprises selecting the delegated device based on an input on the user interface of the delegated device, and wherein the input includes a swipe in a particular direction on the user interface of the delegated device.
14. The delegation method of claim 10, wherein receiving the request to delegate the user interface related task comprises receiving the request to delegate one of: an asynchronous open task, a show task, an edit task, a get keyboard task, a get pointer task, and a get user interface task.
15. The delegation method of claim 10, further comprising:
responsive to creation of the delegated user interface on the delegated device, sending content to the delegated device, and receiving modified content from the delegated device.
16. A delegation method, comprising:
receiving, by a delegated device from a delegating device, a delegation command to perform a user interface related task, wherein the delegation command identifies at least a portion of a user interface of the delegating device to be established, as a delegated user interface, on a user interface of the delegated device, and wherein the at least the portion of the user interface is associated with an application that runs on the delegating device;
extracting, by the delegated device, information, associated with the user interface related task, from the delegation command, to determine whether the delegated device is able to perform the user interface related task;
responsive to a determination that the delegated device is able to perform the user interface related task, sending, by the delegated device, an acknowledgement to the delegating device, wherein the acknowledgement indicates that the delegated device is ready to receive requests associated with the user interface related task; and
responsive to the acknowledgement being sent, receiving, by the delegated device from the delegating device, data to create the delegated user interface on the delegated device, and content to be displayed on the delegated user interface, wherein the delegated user interface is a replica of the at least the portion of the user interface of the delegating device.
17. The method of claim 16, further comprising:
sending, by the delegated device to the delegating device, a summary of actions, associated with the user interface related task, that are performed at the delegated device.
18. The method of claim 16, further comprising:
establishing, by the delegated device, another delegated user interface on the delegated device, responsive to receipt of another delegation command from another delegating device.
19. The method of claim 16, further comprising:
modifying, by the delegated device, content received from the delegating device, wherein the content is associated with the user interface related task; and
sending, by the delegated device, the modified content to the delegating device.
20. The method of claim 16, further comprising:
responsive to receipt of the delegation command from the delegating device, determining, by the delegated device, whether the delegating device and the delegated device are related to each other; and
responsive to a determination that the delegating device and the delegated device are unrelated to each other, initiating, by the delegated device, pairing with the delegating device.
21. A non-transitory computer-readable medium having instructions stored thereon that, in response to execution by a first device, cause the first device to perform or control performance of operations to:
obtain a request, from an application, to delegate a user interface related task to a second device;
responsive to the obtained request, send a delegation command to perform the user interface related task to the second device, wherein the delegation command identifies at least a portion of the user interface of the first device to be established, as a delegated user interface, on a user interface of the second device, and wherein the at least the portion of the user interface is determined based on the application; and
responsive to an acknowledgement, from the second device, that the second device is ready to perform the user interface related task, exchange data, associated with the user interface related task, with the second device, wherein the exchange of the data includes at least transfer of the data to the second device to create the delegated user interface on the second device.
22. A non-transitory computer-readable medium having instructions stored thereon that, in response to execution by a first device, cause the first device to perform or control performance of operations to:
receive, from a second device, a delegation command to perform a user interface related task, wherein the delegation command identifies at least a portion of a user interface of the second device to be established, as a delegated user interface, on a user interface of the first device, and wherein the at least the portion of the user interface is associated with an application that runs on the second device;
extract information, associated with the user interface related task, from the delegation command, to determine whether the first device is able to perform the user interface related task;
responsive to a determination that the first device is able to perform the user interface related task, send an acknowledgement to the second device, wherein the acknowledgement indicates that the first device is ready to receive requests associated with the user interface related task; and
responsive to the acknowledgement being sent, receive, from the second device, data to create the delegated user interface on the first device, and content to be displayed on the delegated user interface, wherein the delegated user interface is a replica of the at least the portion of the user interface of the second device.
US15/626,231 2013-06-24 2017-06-19 User interface delegation to a delegated device Abandoned US20170289239A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/626,231 US20170289239A1 (en) 2013-06-24 2017-06-19 User interface delegation to a delegated device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/US2013/047260 WO2014209255A1 (en) 2013-06-24 2013-06-24 User interface delegation to a delegated device
US201414342377A 2014-03-01 2014-03-01
US15/626,231 US20170289239A1 (en) 2013-06-24 2017-06-19 User interface delegation to a delegated device

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US14/342,377 Continuation US9699243B2 (en) 2013-06-24 2013-06-24 User interface delegation to a delegated device
PCT/US2013/047260 Continuation WO2014209255A1 (en) 2013-06-24 2013-06-24 User interface delegation to a delegated device

Publications (1)

Publication Number Publication Date
US20170289239A1 true US20170289239A1 (en) 2017-10-05

Family

ID=52142409

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/342,377 Expired - Fee Related US9699243B2 (en) 2013-06-24 2013-06-24 User interface delegation to a delegated device
US15/626,231 Abandoned US20170289239A1 (en) 2013-06-24 2017-06-19 User interface delegation to a delegated device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/342,377 Expired - Fee Related US9699243B2 (en) 2013-06-24 2013-06-24 User interface delegation to a delegated device

Country Status (2)

Country Link
US (2) US9699243B2 (en)
WO (1) WO2014209255A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3013093C (en) * 2016-04-08 2019-12-17 Husqvarna Ab Intelligent watering system
US10834231B2 (en) * 2016-10-11 2020-11-10 Synergex Group Methods, systems, and media for pairing devices to complete a task using an application request
US20180284704A1 (en) * 2017-03-31 2018-10-04 Otis Elevator Company Multi-target dynamic ui action element

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2410658B (en) 2002-10-14 2006-03-01 Toshiba Res Europ Ltd Methods and systems for flexible delegation
US7181698B2 (en) 2002-12-16 2007-02-20 Sap Ag Detachable tabs presenting accessed objects in a multi-tab interface
US8698751B2 (en) 2010-10-01 2014-04-15 Z124 Gravity drop rules and keyboard display on a multiple screen device
US7487454B2 (en) 2004-04-26 2009-02-03 Microsoft Corporation Managing arbitrary window regions for more effective use of screen space
GB0501115D0 (en) 2005-01-19 2005-02-23 Innovision Res & Tech Plc Combined power coupling and rf communication apparatus
US8495244B2 (en) * 2005-06-29 2013-07-23 Jumpstart Wireless Corporation System and method for dynamic automatic communication path selection, distributed device synchronization and task delegation
ES2368366T3 (en) * 2007-05-07 2011-11-16 Vorne Industries, Inc. METHOD AND SYSTEM TO EXPAND THE CAPACITIES OF INTEGRATED DEVICES THROUGH NETWORK CUSTOMERS.
CN101321136B (en) * 2007-06-05 2012-08-08 华为技术有限公司 Transmission-receiving proxy method for conversation initial protocol message and corresponding processor
US8613044B2 (en) 2007-06-22 2013-12-17 4Dk Technologies, Inc. Delegating or transferring of access to resources between multiple devices
US8230024B2 (en) * 2007-06-28 2012-07-24 Microsoft Corporation Delegating instant messaging sessions
US8495213B2 (en) 2008-04-10 2013-07-23 Lg Electronics Inc. Terminal and method for managing secure devices
US8144232B2 (en) 2008-07-03 2012-03-27 Sony Ericsson Mobile Communications Ab Camera system and method for picture sharing using geotagged pictures
EP2309369B1 (en) 2008-07-25 2016-09-28 NEC Corporation Information processing device, information processing program, and display control method
US20100060547A1 (en) 2008-09-11 2010-03-11 Sony Ericsson Mobile Communications Ab Display Device and Method for Displaying Images in a Variable Size Display Area
US8375328B2 (en) 2009-11-11 2013-02-12 Google Inc. Implementing customized control interfaces
US8447820B1 (en) 2011-01-28 2013-05-21 Decision Lens, Inc. Data and event synchronization across distributed user interface modules
US9130899B1 (en) 2011-04-27 2015-09-08 Cisco Technology, Inc. Integrated user interface for unified communications applications
US20130132885A1 (en) 2011-11-17 2013-05-23 Lenovo (Singapore) Pte. Ltd. Systems and methods for using touch input to move objects to an external display and interact with objects on an external display
US8786517B2 (en) 2012-02-21 2014-07-22 Blackberry Limited System and method for displaying a user interface across multiple electronic devices
EP2847686B1 (en) 2012-05-07 2019-10-30 Digital Guardian, Inc. Enhanced document and event mirroring for accessing content
US9083658B2 (en) * 2012-09-24 2015-07-14 Steelseries Aps Method and apparatus for delegating resources between devices
US9392077B2 (en) * 2012-10-12 2016-07-12 Citrix Systems, Inc. Coordinating a computing activity across applications and devices having multiple operation modes in an orchestration framework for connected devices
US20140280962A1 (en) * 2013-03-15 2014-09-18 Openpeak Inc. Method and system for delegating functionality based on availability
WO2016036769A2 (en) * 2014-09-02 2016-03-10 Apple Inc. Communicating mapping application data between electronic devices
US20160105528A1 (en) * 2014-10-08 2016-04-14 Microsoft Corporation Client-assisted fulfillment of a resource request

Also Published As

Publication number Publication date
WO2014209255A1 (en) 2014-12-31
US20150237111A1 (en) 2015-08-20
US9699243B2 (en) 2017-07-04

Similar Documents

Publication Publication Date Title
US11159626B2 (en) Session transfer between resources
CN109564531B (en) Clipboard repository interaction
US10572124B2 (en) Bound based contextual zoom
US9553953B2 (en) Method and apparatus for extending capabilities of a virtualization domain to support features available in a normal desktop application
US9117087B2 (en) System and method for creating a secure channel for inter-application communication based on intents
US7624192B2 (en) Framework for user interaction with multiple network devices
JP6527535B2 (en) Device authentication and pairing using machine readable code
KR102249197B1 (en) User terminal apparatus, communication system and control method thereof
WO2017024842A1 (en) Internet access authentication method, client, computer storage medium
US9578113B2 (en) Method and apparatus for transferring remote session data
US20140201377A1 (en) Portal multi-device session context preservation
JP2014531650A (en) Group opt-in link
US8755771B2 (en) System, method, and program for generating screen
GB2502739A (en) Secure transfer of files between applications on a mobile device using keys supplied by a server
US11647086B2 (en) System and method for maintaining user session continuity across multiple devices and/or multiple platforms
TW201621706A (en) Sharing content with permission control using near field communication
US10372512B2 (en) Method and apparatus for automatic processing of service requests on an electronic device
US20170289239A1 (en) User interface delegation to a delegated device
US9300787B1 (en) Augmented device interaction through services
US20190274030A1 (en) System and Method for Contact Information Exchange Using Ultrasonic Waves
Shekhar et al. Remote Access to PC Using Android Phone.
CN116931778A (en) Information processing method, device, equipment, medium and program product

Legal Events

Date Code Title Description
AS Assignment

Owner name: EMPIRE TECHNOLOGY DEVELOPMENT LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HIRSHBERG, DAVID;REEL/FRAME:042880/0579

Effective date: 20130505

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: CRESTLINE DIRECT FINANCE, L.P., TEXAS

Free format text: SECURITY INTEREST;ASSIGNOR:EMPIRE TECHNOLOGY DEVELOPMENT LLC;REEL/FRAME:048373/0217

Effective date: 20181228

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: EMPIRE TECHNOLOGY DEVELOPMENT LLC, WASHINGTON

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CRESTLINE DIRECT FINANCE, L.P.;REEL/FRAME:056430/0786

Effective date: 20210526

AS Assignment

Owner name: VIBRANT LICENSING LLC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EMPIRE TECHNOLOGY DEVELOPMENT LLC;REEL/FRAME:056665/0401

Effective date: 20210525