WO2013011197A2 - Method and apparatus for triggering a remote data entry interface - Google Patents

Method and apparatus for triggering a remote data entry interface

Info

Publication number
WO2013011197A2
Authority
WO
WIPO (PCT)
Prior art keywords
remote
data entry
interface
input interface
entry input
Prior art date
Application number
PCT/FI2012/050733
Other languages
French (fr)
Other versions
WO2013011197A3 (en)
Inventor
Jörg BRAKENSIEK
Raja Bose
Original Assignee
Nokia Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to JP2014520694A priority Critical patent/JP6289368B2/en
Priority to EP12814612.3A priority patent/EP2735132B1/en
Priority to KR1020147004415A priority patent/KR20140049000A/en
Priority to CN201280045698.8A priority patent/CN103828336B/en
Publication of WO2013011197A2 publication Critical patent/WO2013011197A2/en
Publication of WO2013011197A3 publication Critical patent/WO2013011197A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C19/00 Electric signal transmission systems
    • G08C19/16 Electric signal transmission systems in which transmission is by pulses
    • G08C19/28 Electric signal transmission systems in which transmission is by pulses using pulse code
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00 Transmission systems of control signals via wireless link
    • G08C2201/20 Binding and programming of remote control devices
    • G08C2201/21 Programming remote control devices via third means
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00 Transmission systems of control signals via wireless link
    • G08C2201/30 User interface

Definitions

  • Embodiments of the present invention relate generally to the implementation of a remote user interface, and, more particularly, relate to a method and apparatus for triggering a remote data entry interface.
  • a handheld mobile device may include sufficient processing power, network connectivity, and memory storage to perform a given application, but the small form factor of a handheld mobile device may limit the usability of the application, for example, due to a small user interface and screen.
  • a remote device having a larger user interface (e.g., a computer terminal, an in-vehicle head unit, a tablet or pad device)
  • the user may wish to use the user interface of the remote device, rather than the user interface of the handheld device.
  • a user may wish to have a handheld mobile computing device connect with a device having a larger display in a vehicle for displaying maps and other location information on the remote device.
  • a mobile computing device operating as a media player may also connect with another device to provide the user with an interface to the mobile computing device via a display located in the traditional location for a radio in a vehicle.
  • the handheld mobile computing device may provide video and audio information to permit the reproduction of the user interface of the handheld device on the remote device.
  • the remote device should be capable of fully interfacing with the handheld device to receive user input and provide output to the user, and the handheld device should support the remote device's ability to do so.
  • Example methods and example apparatuses are described that facilitate triggering a remote data entry interface.
  • One example method embodiment includes receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
  • An additional example embodiment is an apparatus comprising at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured to, with the at least one processor, direct the example apparatus to perform various functionality.
  • the example apparatus may be directed to perform receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
  • Another example embodiment is an example non-transitory computer readable medium having computer program code stored thereon.
  • the computer program may direct an apparatus to perform receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
  • Another example embodiment is an apparatus comprising means for receiving, at a device, a data entry field selection message notifying that data entry is desired, means for inhibiting a presentation of a data entry input interface on a display of the device, and means for causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
  • FIG. 1 illustrates a system for implementing a remote user interface according to various example embodiments
  • FIG. 2 illustrates a user equipment displaying content and a data entry field according to various example embodiments
  • FIG. 3 illustrates a remote user interface device projecting the user interface of a user equipment according to various example embodiments
  • FIG. 4 illustrates a remote user interface device displaying a virtual keyboard for entering data into a data entry field according to various example embodiments
  • FIG. 5 is a signaling and operational flow diagram for triggering a remote data entry interface according to various example embodiments
  • FIG. 6 illustrates a block diagram of an apparatus of a user equipment configured according to various example embodiments
  • FIG. 7 illustrates a block diagram of a mobile terminal configured according to various example embodiments.
  • circuitry refers to all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) to combinations of circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions); and (c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
  • circuitry would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware.
  • circuitry would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.
  • FIG. 1 illustrates an example system in accordance with various example embodiments of the present invention.
  • the example system includes a remote user interface (UI) device 100, User Equipment (UE) 101, and a communications link 102.
  • the remote UI device 100 may be any type of computing device configured to project or replicate the user interface of the UE 101.
  • the remote UI device 100 may include user interface components and functionality.
  • the user interface components may be controlled by one or more processors and one or more memories storing program code included in the remote UI device 100 for performing the functionality of the remote UI device 100 as described herein.
  • the remote UI device 100 may include a touch screen display that is configured to receive input from a user via touch events with the display.
  • the remote UI device 100 may alternatively or additionally include other user interface hardware, such as a physical keyboard or keypad, a mouse, a trackball, or other pointing device, speakers, a microphone, and the like.
  • the remote UI device 100 may support various techniques of receiving user input including but not limited to voice recognition, handwriting recognition, and the like.
  • the remote UI device 100 may be installed in a vehicle and the user interface that is provided by the remote UI device 100 may be a modified variation of the user interface of the UE 101 that complies with safety requirements for use in a vehicular environment.
  • the remote UI device 100 may include speakers, a microphone, and the like.
  • the remote UI device 100 may also include a wireless communications interface for communicating with the UE 101 via the communications link 102.
  • the remote UI device 100 and the UE 101 may communicate via a wired link.
  • the communications link 102 may be any type of communications link capable of supporting communications between the remote UI device 100 and the UE 101.
  • the communications link 102 may be a WLAN, Bluetooth, or other type of wireless link.
  • the UE 101 may be any type of mobile computing and communications device.
  • the UE 101 may be a smart phone, tablet, or pad device.
  • the UE 101 may also be configured to execute and implement applications via at least one processor and at least one memory included within the UE 101.
  • the UE 101 may be configured to, via the communications connection 102, direct the remote UI device 100 to output a user interface and receive user input provided via the remote UI device 100.
  • the projected user interface provided by the remote UI device 100 may be the same interface that is being presented on a display of the UE 101 or that would have been presented had the display of the UE 101 been active.
  • framebuffer scanning or similar techniques may be used to reproduce at least a portion of a user interface on the display of the remote UI device 100 via the communications link 102.
  • the remote UI device 100 may provide a modified user interface that is derived from the user interface of the UE 101.
  • the remote UI device 100 is installed in a vehicle as a vehicle head unit.
  • the driver of the vehicle may wish to use the remote UI device 100 as an interface to the UE 101 due, for example, to the convenient and safe location of the remote UI device 100 within the vehicle and/or the larger size of the screen.
  • the UE 101 may be configured to link with the remote UI device 100, and direct the remote UI device 100 to present a user interface for engaging the user via the remote UI device 100.
  • the display of the remote UI device 100 may include various controls that may or may not be associated with controls on the user interface of the UE 101, such as controls that are affixed to a steering wheel of a vehicle, touch controls, rotary knobs, and/or other configurable or dedicated buttons.
  • the user interface provided by the remote UI device 100 may be a modified variation of the user interface of the UE 101 that is adapted for ease of use by a user that is also operating a moving vehicle.
  • the interaction between the UE 101 and the remote UI device 100 provides an example of mobile device interoperability, which may also be referred to as smart space, remote environment, and remote client.
  • the UE 101 may be described as being in the "terminal mode" when the remote UI device 100 is accessed and controlled by the UE 101.
  • the features and capabilities of the UE 101 may be projected onto an external environment (e.g., the remote UI device 100), and the external environment may appear as if the features and capabilities are inherent to the external environment such that the dependency on the UE 101 is not apparent to a user. Projecting the UE 101's features and capabilities may involve exporting the user interface screen of the UE 101, as well as command and control, to the external environment, whereby the user may comfortably interact with the external environment in lieu of the UE 101.
  • the touch event may be detected and located on the touch screen. Information about the touch event may then be sent to the UE 101 (e.g., using a Virtual Networking Computing (VNC) protocol or any other remote UI protocol).
  • the UE 101, upon receiving a remote touch event, may emulate the touch event locally on the UE 101.
  • a touch event within a data entry field may then trigger the presentation of a virtual keyboard on the display of the UE 101, which may then also be replicated remotely on the display of the remote UI device 100, possibly using VNC or any other remote UI protocol.
  • the remote UI device 100 may present a specialized (driving-safe) virtual data entry user interface, such as a virtual keyboard or rotary speller implemented via controls on a steering wheel for use while driving.
  • the UE 101 may send a remote interface trigger message to the remote UI device 100.
  • a remote interface trigger message may be a VNC Virtual Keyboard Trigger message that is part of some Terminal Mode specifications.
  • FIGs. 2 and 3 illustrate an example scenario where the user interface of the UE 101 is being projected or replicated onto the display of the remote UI device 100 because the UE is in the terminal mode.
  • FIG. 2 illustrates the UE 101 having navigated to a particular website.
  • the content of the web site 104 is displayed together with a data entry field 103 that has the current uniform resource locator (URL) as the current data content (data value) in the data entry field 103.
  • FIG. 3 provides an illustration of an example remote UI device 100 that is providing a projected user interface of the UE 101 of FIG. 2.
  • the content 104 is projected to the remote UI device 100 as content 104a and the data entry field 103 is projected as the data entry field 103a.
  • the user interface of the remote UI device 100 also presents additional controls 105 that may facilitate safe use of the remote UI device 100 in, for example, a vehicular environment.
  • the remote UI device 100 may also need to facilitate a user's ability to input data via the remote UI device 100, to be provided to the UE 101.
  • a user may wish to enter data (e.g., text) into a data entry field (e.g., a text field) that has been projected to the remote UI device 100, such as the data entry field 103a of FIG. 3.
  • the user may select the data entry field 103a (e.g., via a touch of the field on the screen) of the remote UI device 100.
  • a virtual keyboard may be provided, that may cover a portion of the screen to permit user input of, for example, text characters.
  • the remote UI device 100 may be configured to similarly supply the user with a virtual keyboard or some other type of data entry input interface.
  • the remote data entry interface of the remote UI device 100 may be modified to, for example, be displayed as a larger keyboard to facilitate ease of use during driving.
  • the virtual keyboard that is used on the remote UI device 100 may be an over-sized or full-screen keyboard with a data entry field/box combination.
  • FIG. 4 illustrates an example of an oversized virtual keyboard 106 with a data entry field 107 being displayed on the remote UI device 100.
  • the particular keyboard to be presented on the remote UI device 100 may be triggered for presentation by a remote interface trigger message such as a VNC Virtual Keyboard Trigger message, which may be sent from the UE 101.
  • an interface trigger event (e.g., a virtual key entry event) may be intercepted, thereby inhibiting the presentation of a data entry input interface on the display of the UE 101.
  • the UE 101 may then send a remote interface trigger message, such as a Virtual Keyboard Trigger message, to the remote UI device 100 to cause presentation of a remote data entry input interface on the remote UI device's display.
  • a data entry interface disabled event (e.g., a virtual key entry disabled event) may be sent to the remote UI device 100 to cause removal of the remote data entry input interface from the display of the remote UI device 100.
  • a data entry interface disabled event can occur if the UE 101 presents, in addition to the remote data entry input interface, a list of predetermined values (such as search history) for a data entry field, and the user selects one of the values from the list instead of entering data via the remote data entry input interface.
  • the UE 101, in addition to intercepting the interface trigger event, may be configured to intercept the type of data entry input interface that is being triggered for use on the UE 101.
  • the various types of data entry input interfaces may include a QWERTY keyboard, numeric keypad, dialer, or the like.
  • the UE 101 may therefore be configured to send a remote interface trigger message and the type of remote data entry input interface to the remote UI device 100, and the remote UI device 100 may responsively present the appropriate remote data entry input interface based on the received type.
  • the UE 101, in addition to intercepting the interface trigger event, may be configured to optionally intercept the relative position and the relative size of the data entry input interface that is being triggered for use on the UE 101.
  • the data entry input interface may be positioned at a specific location relative to the entire display and may also be of a specific size relative to the total display area. For example, in case a list of predetermined values (such as search history) is presented, the data entry input interface may occupy only 50% of the total screen area as opposed to occupying the entire screen.
  • the UE 101 may therefore be optionally configured to send a remote interface trigger message and the desired relative position (x, y coordinate offsets) and relative size of the remote data entry input interface to the remote UI device 100, and the remote UI device 100 may responsively present the appropriate remote data entry input interface based on the received position and size information.
  • the remote UI device 100 may be connected to one or more vehicle control and/or monitoring systems to receive vehicle context information.
  • the context information may include parameters such as speed, visibility conditions, cruise control state, and the like.
  • the remote UI device 100 may be configured to consider the context information when determining the type of remote data entry input interface to present on the screen of the remote UI device 100. For example, if the vehicle speed is high based on defined thresholds, and a QWERTY keyboard is needed, then a rotary non-touch speller controlled from the steering wheel can be displayed rather than the touch-based QWERTY virtual keyboard (a sketch of this selection logic appears after this list).
  • Intercepting the interface trigger event may be one manner in which a data entry input interface of the UE 101 can be inhibited.
  • a terminal mode application may be used that temporarily replaces a local user interface application (e.g., a local virtual keyboard application) on the UE 101.
  • This terminal mode application may be configured to only send remote interface trigger messages to the remote UI device 100 and does not generate interface trigger events local to the UE 101 while the application is being implemented. In this manner, implementation of the terminal mode application may operate to inhibit the presentation of the data entry input interface on the UE 101.
  • another option for inhibiting the presentation of a data entry input interface on a UE 101 may be to detect and prevent the intended start of the local user interface application which is configured to present the data entry input interface (e.g., the local virtual keyboard application) and send a remote interface trigger message instead of starting the local user interface application.
  • inhibiting the presentation of a data entry input interface on a UE 101 may involve monitoring and intercepting the inter-process communication that would otherwise invoke the local user interface application which presents the data entry input interface.
  • This option may be implemented in situations where, for example, the local user interface application that causes the data entry input interface to be presented is continuously running, possibly in the background.
  • FIG. 5 illustrates a signaling and flow diagram of example methods of the present invention from a system perspective, as well as from the perspectives of each of the remote UI device 100 and the UE 101.
  • the remote UI device 100 and the UE 101 share a communications connection that permits the user interface of the UE 101 (or a subset thereof) to be projected or replicated onto the user interface of the remote UI device 100.
  • the remote UI device 100 may be connected to the UE 101 using a remote framebuffer/desktop protocol while implementing a terminal mode that projects the UE 101's screen, or a portion of the screen, on the display of the remote UI device 100.
  • the remote UI device 100 receives a data entry field selection.
  • a user may touch a data entry field (e.g., a text entry field) on the display of the remote UI device 100 to perform the selection that is received by the remote UI device 100.
  • the remote UI device 100 may be configured to transmit an indication of the selection at 121 in the form of a data entry field selection message.
  • the UE 101 may receive the data entry field selection message at 122 as a notification that selection of a data entry field has occurred.
  • the data entry field selection message may include a description of a touch event at particular coordinates of the display and the UE 101 may determine that the event is a selection of a data entry field upon analyzing the coordinates relative to the currently presented content.
  • the data entry field selection message may be sent via a remote protocol, for example as a VNC Pointer Event message, to the UE 101.
  • the UE 101 may be configured to locally emulate a local field selection based on the parameters of the data entry field selection message.
  • the UE 101 may generate an indication, possibly locally, that a data entry input interface is needed to permit a user to input data.
  • the UE 101 may be notified that data entry by a user is desired.
  • this generated indication need not originate from the remote UI device 100.
  • the UE 101 may be implementing an application that may require data entry at, for example, a particular time, or based on some other criteria that is not dependent on the remote UI device 100.
  • the UE 101 may be configured to inhibit the presentation of the data entry input interface as described above.
  • the UE 101 may be configured to generate an interface trigger event, local to the UE 101, to present the data entry input interface on a display of the UE 101 in response to receiving the data entry field selection. Inhibiting the presentation of the data entry input interface may include intercepting and suppressing the trigger event. In some example embodiments, inhibiting the presentation of the data entry input interface may include implementing a terminal mode application on the UE 101, where the terminal mode application is configured to inhibit generation of a trigger event that would cause the presentation of the data entry input interface on the display of the UE 101.
  • the UE 101 may be configured to transmit a remote interface trigger message at 125 and the remote UI device 100 may receive the remote interface trigger message at 126. In this manner, the UE 101 may trigger the presentation of a remote data entry interface (e.g., a virtual keyboard) on the remote UI device 100, which may enable the entry of data in an associated data entry field.
  • the remote interface trigger message may be a VNC Terminal Mode (TM) Virtual Keyboard Trigger message, and the message may notify the remote UI device 100 of the need for data (e.g., text) input support.
  • the remote UI device 100 may be configured to present a remote data entry input interface to permit user entry of data via the interface.
  • the type of remote data entry input interface may be determined based on information provided in the remote interface trigger message and/or based on context information of a vehicle that the remote UI device 100 is installed within.
  • inhibiting the presentation of the data entry input interface on the UE 101 and responsively triggering the presentation of a remote data entry input interface on the remote UI device 100 provides a number of advantages.
  • a stable and reliable trigger can be implemented for presenting the remote data entry input interface, where false positives and misses are reduced or eliminated, and the triggering can be based on user action.
  • context information of a vehicle associated with the remote UI device 100 may be used to determine which type of remote data entry input interface is to be presented.
  • FIGs. 6 and 7 illustrate example apparatus embodiments of the present invention configured to perform the various functionalities described herein.
  • FIG. 6 depicts an example apparatus that is configured to perform various functionalities from the perspective of a UE (e.g., UE 101) as described with respect to FIGs. 1-5 and as generally described herein.
  • FIG. 7 depicts an example UE apparatus in the form of a more specific mobile terminal configured to perform various functionalities from the perspective of a UE 101 depicted in FIGs. 1-5 and as generally described herein.
  • the example apparatuses depicted in FIGs. 6 and 7 may also be configured to perform example methods of the present invention, such as those described with respect to FIG. 5.
  • the apparatus 200 may be embodied as, or included as a component of, a communications device with wired and/or wireless communications capabilities.
  • the apparatus 200 may be configured to operate in accordance with the functionality of a UE as described herein.
  • the apparatus 200 may be part of a communications device (e.g., UE 101), such as a stationary or a mobile terminal.
  • the apparatus 200 may be a mobile computer, mobile telephone, a portable digital assistant (PDA), a pager, a mobile television, a gaming device, a laptop computer, a camera, a video recorder, an audio/video player, a radio, smart phone, tablet or pad device and/or a global positioning system (GPS) device, any combination of the aforementioned, or the like.
  • apparatus 200 may also include computing capabilities.
  • the example apparatus 200 includes or is otherwise in communication with a processor 205, a memory device 210, an I/O interface 206, a communications interface 220, a user interface 215, and a remote UI server module 230.
  • the processor 205 may be embodied as various means for implementing the various functionalities of example embodiments of the present invention including, for example, a microprocessor, a coprocessor, a controller, a special-purpose integrated circuit such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or a hardware accelerator, processing circuitry or the like.
  • processor 205 may be representative of a plurality of processors, or one or more multiple core processors, operating in concert.
  • the processor 205 may be comprised of a plurality of transistors, logic gates, a clock (e.g., oscillator), other circuitry, and the like to facilitate performance of the functionality described herein.
  • the processor 205 may, but need not, include one or more accompanying digital signal processors.
  • the processor 205 is configured to execute instructions stored in the memory device 210 or instructions otherwise accessible to the processor 205.
  • the processor 205 may be configured to operate such that the processor causes the apparatus 200 to perform various functionalities described herein.
  • the processor 205 may be an entity capable of performing operations according to embodiments of the present invention while configured accordingly.
  • the processor 205 is specifically configured hardware for conducting the operations described herein.
  • the instructions specifically configure the processor 205 to perform the algorithms and operations described herein (e.g., those described with respect to FIG. 5).
  • the processor 205 is a processor of a specific device (e.g., a mobile terminal) configured for employing example embodiments of the present invention by further configuration of the processor 205 via executed instructions for performing the algorithms, methods, and operations described herein.
  • the memory device 210 may be one or more non-transitory computer-readable storage media that may include volatile and/or non-volatile memory.
  • the memory device 210 includes Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like.
  • memory device 210 may include non-volatile memory, which may be embedded and/or removable, and may include, for example, read-only memory, flash memory, magnetic storage devices (e.g., hard disks, floppy disk drives, magnetic tape, etc.), optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like.
  • Memory device 210 may include a cache area for temporary storage of data. In this regard, some or all of memory device 210 may be included within the processor 205.
  • the memory device 210 which may be one or more memory devices, may be configured to store information, data, applications, computer-readable program code instructions, and/or the like for enabling the processor 205 and the example apparatus 200 to carry out various functions in accordance with example embodiments of the present invention described herein.
  • the memory device 210 could be configured to buffer input data for processing by the processor 205.
  • the memory device 210 may be configured to store instructions for execution by the processor 205.
  • the I/O interface 206 may be any device, circuitry, or means embodied in hardware, software, or a combination of hardware and software that is configured to interface the processor 205 with other circuitry or devices, such as the communications interface 220 and the user interface 215.
  • the processor 205 may interface with the memory 210 via the I/O interface 206.
  • the I/O interface 206 may be configured to convert signals and data into a form that may be interpreted by the processor 205.
  • the I/O interface 206 may also perform buffering of inputs and outputs to support the operation of the processor 205.
  • the processor 205 and the I/O interface 206 may be combined onto a single chip or integrated circuit configured to perform, or cause the apparatus 200 to perform, various functionalities of the present invention.
  • the communication interface 220 may be any device or means (e.g., circuitry) embodied in hardware, a computer program product, or a combination of hardware and a computer program product that is configured to receive and/or transmit data from/to a network 225 and/or any other device or module in communication with the example apparatus 200 (e.g., remote UI device 100).
  • the communications interface may be configured to communicate information via any type of wired or wireless connection, and via any type of communications protocol, such as a communications protocol that supports cellular communications or near field communications.
  • the communication interface 220 may be configured to support the transmission and reception of communications in a variety of networks including, but not limited to Internet Protocol-based networks (e.g., the Internet), cellular networks, or the like. Further, the communications interface 220 may be configured to support device-to-device communications, such as in a mobile ad hoc network (MANET). Processor 205 may also be configured to facilitate communications via the communications interface 220 by, for example, controlling hardware comprised within the communications interface 220.
  • the communication interface 220 may comprise, for example, communications driver circuitry (e.g., circuitry that supports wired communications via, for example, fiber optic connections), one or more antennas, a transmitter, a receiver, a transceiver and/or supporting hardware, including, for example, a processor for enabling communications.
  • the example apparatus 200 may communicate with various other network entities in a device-to-device fashion and/or via indirect communications via a base station, access point, server, gateway, router, or the like.
  • the user interface 215 may be in communication with the processor 205 to receive user input via the user interface 215 and/or to present output to a user as, for example, audible, visual, mechanical or other output indications.
  • the user interface 215 may include, for example, a keyboard, a mouse, a joystick, a display (e.g., a touch screen display), a microphone, a speaker, or other input/output mechanisms.
  • the processor 205 may comprise, or be in communication with, user interface circuitry configured to control at least some functions of one or more elements of the user interface.
  • the processor 205 and/or user interface circuitry may be configured to control one or more functions of one or more elements of the user interface through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor 205 (e.g., volatile memory, non-volatile memory, and/or the like).
  • the user interface circuitry is configured to facilitate user control of at least some functions of the apparatus 200 through the use of a display and configured to respond to user inputs.
  • the processor 205 may also comprise, or be in communication with, display circuitry configured to display at least a portion of a user interface, the display and the display circuitry configured to facilitate user control of at least some functions of the apparatus 200.
  • the remote UI server module 230 of example apparatus 200 may be any means or device embodied, partially or wholly, in hardware, a non-transitory computer readable medium having a computer program stored thereon, or a combination of hardware and a non-transitory computer readable medium having a computer program stored thereon, such as processor 205 implementing stored instructions to configure the example apparatus 200, or a hardware configured processor 205, that is configured to carry out the functions of the remote UI server module 230 as described herein.
  • the processor 205 includes, or controls, the remote UI server module 230.
  • the remote UI server module 230 may be, partially or wholly, embodied as processors similar to, but separate from processor 205.
  • the remote UI server module 230 may be in communication with the processor 205.
  • the remote UI server module 230 may, partially or wholly, reside on differing apparatuses such that some or all of the functionality of the remote UI server module 230 may be performed by a first apparatus, and the remainder of the functionality of the remote UI server module 230 may be performed by one or more other apparatuses.
  • the apparatus 200 and the processor 205 may be configured to perform the following functionality via the remote UI server module 230.
  • the remote UI server module 230 may be configured to receive, at the processor 205 and/or the apparatus 200, a data entry field selection message or other type of notification that data entry is desired.
  • the remote UI server module 230 may also be configured to inhibit a presentation of a data entry input interface on a display of the device, and cause a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
  • the remote UI server module 230 may be further configured to generate an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection message.
  • the remote UI server module 230 being configured to inhibit the presentation of the data entry input interface includes being configured to intercept the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device.
  • the remote UI server module 230 may be further configured to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event.
  • the remote UI server module 230 may additionally or alternatively be configured to implement a terminal mode application on the device, the terminal mode application being configured to inhibit generation of an interface trigger event that would cause the presentation of the data entry input interface on the display of the device.
  • the remote UI server module 230 may be configured to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event, the remote interface trigger message including the type of data entry input interface to be presented. Additionally or alternatively, the remote UI server module 230 may be configured to cause the remote interface trigger message to be sent to the remote device to direct the remote device to present a remote data entry input interface on a display of the remote device, the remote data entry input interface being presented based on context information of an environment in which the remote device is installed.
  • the example apparatus of FIG. 7 is a mobile terminal 10 configured to communicate within a wireless network, such as a cellular communications network.
  • the mobile terminal 10 may be configured to perform the functionality of UE 101 and/or apparatus 200 as described herein.
  • the mobile terminal 10 may be caused to perform the functionality of the remote UI server module 230 via the processor 20.
  • processor 20 may be an integrated circuit or chip configured similar to the processor 205 together with, for example, the I/O interface 206.
  • volatile memory 40 and non-volatile memory 42 may be configured to support the operation of the processor 20 as computer readable storage media.
  • the mobile terminal 10 may further include an antenna 12, a transmitter 14, and a receiver 16, which may be included as parts of a communications interface of the mobile terminal 10.
  • the speaker 24, the microphone 26, the display 28, and the keypad 30 may be included as parts of a user interface.
  • FIG. 5 illustrates flowcharts of example systems, methods, and/or computer programs stored on a non-transitory computer readable medium (e.g., computer program product) according to example embodiments of the invention.
  • Means for implementing the blocks or operations of the flowcharts, combinations of the blocks or operations in the flowchart, or other functionality of example embodiments of the present invention described herein may include hardware, and/or a non-transitory computer-readable storage medium having one or more computer program code instructions, program instructions, or executable computer-readable program code instructions stored therein.
  • program code instructions may be stored on a memory device, such as memory device 210, of an example apparatus, such as example apparatus 200, and executed by a processor, such as processor 205.
  • any such program code instructions may be loaded onto a computer or other programmable apparatus (e.g., processor 205, memory device 210, or the like) from a computer-readable storage medium to produce a particular machine, such that the particular machine becomes a means for implementing the functions specified in the flowcharts' block(s) or operation(s).
  • These program code instructions may also be stored in a computer-readable storage medium that can direct a computer, a processor, or other programmable apparatus to function in a particular manner to thereby generate a particular machine or particular article of manufacture.
  • the instructions stored in the computer-readable storage medium may produce an article of manufacture, where the article of manufacture becomes a means for implementing the functions specified in the flowcharts' block(s) or operation(s).
  • the program code instructions may be retrieved from a computer-readable storage medium and loaded into a computer, processor, or other programmable apparatus to configure the computer, processor, or other programmable apparatus to execute operations to be performed on or by the computer, processor, or other programmable apparatus.
  • Retrieval, loading, and execution of the program code instructions may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time. In some example embodiments, retrieval, loading and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together.
  • Execution of the program code instructions may produce a computer-implemented process such that the instructions executed by the computer, processor, or other programmable apparatus provide operations for implementing the functions specified in the flowcharts' block(s) or operation(s).
  • execution of instructions associated with the blocks or operations of the flowchart by a processor, or storage of instructions associated with the blocks or operations of the flowcharts in a computer-readable storage medium support combinations of operations for performing the specified functions. It will also be understood that one or more blocks or operations of the flowcharts, and combinations of blocks or operations in the flowcharts, may be implemented by special purpose hardware-based computer systems and/or processors which perform the specified functions, or combinations of special purpose hardware and program code instructions.
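
The bullets above describe a trigger message that may carry an interface type and a relative position and size, and a remote UI device that may override the requested type based on vehicle context (for example, substituting a steering-wheel rotary speller for a QWERTY keyboard above a speed threshold). The following Python sketch illustrates only that decision logic; the field names, the 100 km/h threshold, and the present()/dismiss() callables are assumptions made for this sketch and are not taken from this disclosure.

    MAX_SPEED_FOR_TOUCH_KEYBOARD_KPH = 100.0   # assumed threshold, not from the disclosure

    def choose_remote_interface(trigger, vehicle_context):
        """trigger: dict that may carry 'interface_type', 'rel_pos', and 'rel_size'.
        vehicle_context: dict with values such as 'speed_kph' from the vehicle systems."""
        requested = trigger.get("interface_type", "qwerty")
        if requested == "qwerty" and vehicle_context.get("speed_kph", 0.0) > MAX_SPEED_FOR_TOUCH_KEYBOARD_KPH:
            # Too fast for a touch keyboard: fall back to a non-touch rotary speller
            # driven from steering-wheel controls.
            return {"kind": "rotary_speller"}
        return {"kind": requested,
                "rel_pos": trigger.get("rel_pos", (0.0, 0.5)),    # x, y offsets relative to the display
                "rel_size": trigger.get("rel_size", (1.0, 0.5))}  # e.g., the bottom half of the screen

    def on_remote_interface_trigger(trigger, vehicle_context, present, dismiss):
        if trigger.get("show", True):
            present(choose_remote_interface(trigger, vehicle_context))
        else:
            dismiss()   # e.g., a data entry interface disabled event removes the remote keyboard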

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Telephone Function (AREA)

Abstract

Various methods for triggering a remote data entry interface are provided. One example method includes receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device. Similar and related example methods and example apparatuses are also provided.

Description

METHOD AND APPARATUS FOR TRIGGERING A REMOTE DATA ENTRY INTERFACE
TECHNICAL FIELD
[0001] Embodiments of the present invention relate generally to the implementation of a remote user interface, and, more particularly, relate to a method and apparatus for triggering a remote data entry interface.
BACKGROUND
[0002] Mobile computing devices continue to evolve such that the devices are capable of supporting new and powerful applications. In some instances a handheld mobile device may include sufficient processing power, network connectivity, and memory storage to perform a given application, but the small form factor of a handheld mobile device may limit the usability of the application, for example, due to a small user interface and screen.
[0003] As such, in situations where the user may be stationary, relative to a remote device having a larger user interface (e.g., a computer terminal, an in-vehicle head unit, a tablet or pad device), the user may wish to use the user interface of the remote device, rather than the user interface of the handheld device. For example, considering a global positioning application, a user may wish to have a handheld mobile computing device connect with a device having a larger display in a vehicle for displaying maps and other location information on the remote device. Similarly, a mobile computing device operating as a media player may also connect with another device to provide the user with an interface to the mobile computing device via a display located in the traditional location for a radio in a vehicle.
[0004] To interface with and support a remote user interface environment, the handheld mobile computing device may provide video and audio information to permit the reproduction of the user interface of the handheld device on the remote device. The remote device should be capable of fully interfacing with the handheld device to receive user input and provide output to the user, and the handheld device should support the remote device's ability to do so.
BRIEF SUMMARY
[0005] Example methods and example apparatuses are described that facilitate triggering a remote data entry interface. One example method embodiment includes receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device. [0006] An additional example embodiment is an apparatus comprising at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured to, with the at least one processor, direct the example apparatus to perform various functionality. In this regard, the example apparatus may be directed to perform receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
[0007] Another example embodiment is an example non-transitory computer readable medium having computer program code stored thereon. When executed, the computer program may direct an apparatus to perform receiving, at a device, a data entry field selection message notifying that data entry is desired, inhibiting a presentation of a data entry input interface on a display of the device, and causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
[0008] Another example embodiment is an apparatus comprising means for receiving, at a device, a data entry field selection message notifying that data entry is desired, means for inhibiting a presentation of a data entry input interface on a display of the device, and means for causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
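To make the summarized flow concrete, the following is a minimal Python sketch of the claimed sequence: receive a data entry field selection, inhibit the local data entry input interface, and send a remote interface trigger message. All names here (RemoteUiServer, send_to_remote, suppress_next_presentation) are illustrative assumptions, and the message classes are simplified stand-ins for whatever remote UI protocol is actually used; nothing below is prescribed by this disclosure.

    from dataclasses import dataclass

    @dataclass
    class DataEntryFieldSelection:
        """Notification that data entry is desired (e.g., a forwarded touch on a field)."""
        x: int
        y: int

    @dataclass
    class RemoteInterfaceTrigger:
        """Simplified stand-in for a remote interface trigger message."""
        show: bool = True
        interface_type: str = "qwerty"

    class RemoteUiServer:
        """Hypothetical UE-side handler for the example method."""
        def __init__(self, send_to_remote, local_input_interface):
            self.send_to_remote = send_to_remote                  # delivers a message to the remote device
            self.local_input_interface = local_input_interface    # controls the device's own virtual keyboard

        def on_data_entry_field_selection(self, selection: DataEntryFieldSelection) -> None:
            # Inhibit the presentation of the data entry input interface on this device.
            self.local_input_interface.suppress_next_presentation()
            # Direct the remote device to present a remote data entry input interface instead.
            self.send_to_remote(RemoteInterfaceTrigger(show=True))

A real implementation would hook this logic into the platform's input method framework rather than a standalone class; the sketch only fixes the ordering of the three steps.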
BRIEF DESCRIPTION OF THE DRAWING(S)
[0009] Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[0010] FIG. 1 illustrates a system for implementing a remote user interface according to various example embodiments;
[0011] FIG. 2 illustrates a user equipment displaying content and a data entry field according to various example embodiments;
[0012] FIG. 3 illustrates a remote user interface device projecting the user interface of a user equipment according to various example embodiments;
[0013] FIG. 4 illustrates a remote user interface device displaying a virtual keyboard for entering data into a data entry field according to various example embodiments;
[0014] FIG. 5 is a signaling and operational flow diagram for triggering a remote data entry interface according to various example embodiments;
[0015] FIG. 6 illustrates a block diagram of an apparatus of a user equipment configured according to various example embodiments; and [0016] FIG. 7 illustrates a block diagram of a mobile terminal configured according to various example embodiments.
DETAILED DESCRIPTION
[0017] Example embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. The terms "data," "content," "information," and similar terms may be used
interchangeably, according to some example embodiments of the present invention, to refer to data capable of being transmitted, received, operated on, and/or stored.
[0018] As used herein, the term 'circuitry' refers to all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) to combinations of circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions); and (c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
[0019] This definition of 'circuitry' applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term "circuitry" would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term "circuitry" would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.
[0020] FIG. 1 illustrates an example system in accordance with various example embodiments of the present invention. The example system includes a remote user interface (UI) device 100, User Equipment (UE) 101, and a communications link 102.
[0021] The remote UI device 100 may be any type of computing device configured to project or replicate the user interface of the UE 101. As such, the remote UI device 100 may include user interface components and functionality. The user interface components may be controlled by one or more processors and one or more memories storing program code included in the remote UI device 100 for performing the functionality of the remote UI device 100 as described herein. In some example embodiments, the remote UI device 100 may include a touch screen display that is configured to receive input from a user via touch events with the display. The remote UI device 100 may alternatively or additionally include other user interface hardware, such as a physical keyboard or keypad, a mouse, a trackball, or other pointing device, speakers, a microphone, and the like. The remote UI device 100 may support various techniques of receiving user input including but not limited to voice recognition, handwriting recognition, and the like. In some example embodiments, the remote UI device 100 may be installed in a vehicle and the user interface that is provided by the remote UI device 100 may be a modified variation of the user interface of the UE 101 that complies with safety requirements for use in a vehicular environment. Further, the remote UI device 100 may include speakers, a microphone, and the like.
[0022] The remote UI device 100 may also include a wireless communications interface for communicating with the UE 101 via the communications link 102. According to some example embodiments, the remote UI device 100 and the UE 101 may communicate via a wired link. The communications link 102 may be any type of communications link capable of supporting communications between the remote UI device 100 and the UE 101. According to some example embodiments, the communications link 102 may be a WLAN, Bluetooth, or other type of wireless link. The UE 101 may be any type of mobile computing and communications device. According to some example embodiments, the UE 101 may be a smart phone, tablet, or pad device. The UE 101 may also be configured to execute and implement applications via at least one processor and at least one memory included within the UE 101.
[0023] According to some example embodiments, the UE 101 may be configured to, via the communications connection 102, direct the remote UI device 100 to output a user interface and receive user input provided via the remote UI device 100. The projected user interface provided by the remote UI device 100 may be the same interface that is being presented on a display of the UE 101 or that would have been presented had the display of the UE 101 been active. In some example embodiments, framebuffer scanning or similar techniques may be used to reproduce at least a portion of a user interface on the display of the remote UI device 100 via the communications link 102. In some example embodiments, the remote UI device 100 may provide a modified user interface that is derived from the user interface of the UE 101. For example, consider an example scenario where the remote UI device 100 is installed in a vehicle as a vehicle head unit. The driver of the vehicle may wish to use the remote UI device 100 as an interface to the UE 101 due, for example, to the convenient and safe location of the remote UI device 100 within the vehicle and/or the larger size of the screen. The UE 101 may be configured to link with the remote UI device 100, and direct the remote UI device 100 to present a user interface for engaging the user via the remote UI device 100. The display of the remote UI device 100 may include various controls that may or may not be associated with controls on the user interface of the UE 101, such as controls that are affixed to a steering wheel of a vehicle, touch controls, rotary knobs, and/or other configurable or dedicated buttons. In some instances the user interface provided by the remote UI device 100 may be a modified variation of the user interface of the UE 101 that is adapted for ease of use by a user that is also operating a moving vehicle.
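As a rough illustration of the framebuffer scanning mentioned above, the sketch below diffs successive framebuffer snapshots and forwards only the scanlines that changed. The get_framebuffer() and send_region() callables are hypothetical placeholders for platform- and protocol-specific calls (for example, a remote framebuffer update); they are assumptions made for this sketch, not part of the disclosure.

    import time

    def scan_and_forward(get_framebuffer, send_region, interval_s=0.05):
        """Naive framebuffer scanning loop: compare successive snapshots row by row
        and forward only the rows that changed to the remote UI device."""
        previous = None
        while True:
            frame = get_framebuffer()          # e.g., a list of per-scanline byte strings
            if previous is None:
                send_region(0, frame)          # first pass: send the full frame
            else:
                for y, row in enumerate(frame):
                    if y >= len(previous) or row != previous[y]:
                        send_region(y, [row])  # send only the changed scanline
            previous = frame
            time.sleep(interval_s)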
[0024] The interaction between the UE 101 and the remote UI device 100 provides an example of mobile device interoperability, which may also be referred to as smart space, remote environment, and remote client. In some instances, the UE 101 may be described as being in the "terminal mode" when the remote UI device 100 is accessed and controlled by the UE 101. The features and capabilities of the UE 101 may be projected onto an external environment (e.g., the remote UI device 100), and the external environment may appear as if the features and capabilities are inherent to the external environment, such that the dependency on the UE 101 is not apparent to a user. Projecting the UE 101's features and capabilities may involve exporting the user interface screen of the UE 101, as well as command and control, to the external environment, whereby the user may comfortably interact with the external environment in lieu of the UE 101.
[0025] When the UE 101 is operating in the terminal mode, if a user touches a data entry field (e.g., a text entry field) on the display of the remote UI device 100, the touch event may be detected and located on the touch screen. Information about the touch event may then be sent to the UE 101 (e.g., using a Virtual Networking Computing (VNC) protocol or any other remote UI protocol). The UE 101, upon receiving a remote touch event, may emulate the touch event locally on the UE 101.
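By way of a non-limiting illustration, the following Python sketch shows one way a head unit might package such a touch event for transmission to the UE. The PointerEvent structure, its field names, and the JSON encoding are assumptions made for readability; an actual deployment would use the binary message format of VNC or whichever remote UI protocol is in use.

```python
from dataclasses import dataclass
import json


@dataclass
class PointerEvent:
    """Touch event captured on the remote UI device's touch screen."""
    x: int          # horizontal coordinate of the touch, in display pixels
    y: int          # vertical coordinate of the touch, in display pixels
    pressed: bool   # True for touch-down, False for touch-up


def encode_pointer_event(event: PointerEvent) -> bytes:
    """Serialize the touch event for transmission to the UE over the
    communications link. A JSON payload stands in for the binary pointer
    event message an actual remote UI protocol would define."""
    return json.dumps(
        {"type": "pointer_event", "x": event.x, "y": event.y,
         "pressed": event.pressed}
    ).encode("utf-8")


# Example: the remote UI device reports a touch-down at (320, 48),
# which the UE would then emulate locally on its own user interface.
payload = encode_pointer_event(PointerEvent(x=320, y=48, pressed=True))
print(payload)
```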
[0026] A touch event within a data entry field may then trigger the presentation of a virtual keyboard on the display of the UE 101, which may then also be replicated remotely on the display of the remote UI device 100, possibly using VNC or any other remote UI protocol. However, in implementations in a vehicle, where the driver is expected to interact with the remote UI device 100, safety and driver distraction requirements may be considered, access to the user interface of the UE 101 may be limited, and user interactions may be received by the remote UI device 100. This may be because the UE 101's virtual key entry methods may not comply with driver distraction guidelines and therefore may not be safe to use while driving. For example, a virtual keypad implemented on the UE 101 may require the user to touch a virtual key multiple times in order to create the intended key event, which would substantially distract a user while driving. Accordingly, the remote UI device 100 may present a specialized (driving-safe) virtual data entry user interface, such as a virtual keyboard or rotary speller implemented via controls on a steering wheel for use while driving. To cause the remote UI device 100 to implement this specialized data entry user interface (also referred to as a remote data entry input interface), the UE 101 may send a remote interface trigger message to the remote UI device 100. One example of a remote interface trigger message may be a VNC Virtual Keyboard Trigger message that is part of some Terminal Mode specifications.
[0027] FIGs. 2 and 3 illustrate an example scenario where the user interface of the UE 101 is being projected or replicated onto the display of the remote UI device 100 because the UE is in the terminal mode. FIG. 2 illustrates the UE 101 having navigated to a particular website. The content 104 of the website is displayed together with a data entry field 103 that has the current uniform resource locator (URL) as the current data content (data value) in the data entry field 103. FIG. 3 provides an illustration of an example remote UI device 100 that is providing a projected user interface of the UE 101 of FIG. 2. The content 104 is projected to the remote UI device 100 as content 104a, and the data entry field 103 is projected as the data entry field 103a. The user interface of the remote UI device 100 also presents additional controls 105 that may facilitate safe use of the remote UI device 100 in, for example, a vehicular environment.
[0028] When the remote UI device 100 is projecting or replicating the user interface of the UE 101, the remote UI device 100 may also need to facilitate a user's ability to input data via the remote UI device 100, to be provided to the UE 101. In this regard, a user may wish to enter data (e.g., text) into a data entry field (e.g., a text field) that has been projected to the remote UI device 100, such as the data entry field 103a of FIG. 3. To change the data in the data entry field, the user may select the data entry field 103a (e.g., via a touch of the field on the screen) of the remote UI device 100.
[0029] If this type of operation were taking place directly on the UE 101, a virtual keyboard may be provided that covers a portion of the screen to permit user input of, for example, text characters. However, in the terminal mode, the interaction is occurring between the user and the remote UI device 100. The remote UI device 100 may be configured to similarly supply the user with a virtual keyboard or some other type of data entry input interface. However, the remote data entry interface of the remote UI device 100 may be modified to, for example, be displayed as a larger keyboard to facilitate ease of use during driving. For example, the virtual keyboard that is used on the remote UI device 100 may be an over-sized or full-screen keyboard with a data entry field/box combination. FIG. 4 illustrates an example of an oversized virtual keyboard 106 with a data entry field 107 being displayed on the remote UI device 100. The particular keyboard to be presented on the remote UI device 100 may be triggered for presentation by a remote interface trigger message, such as a VNC Virtual Keyboard Trigger message, which may be sent from the UE 101.
[0030] When a user touches a data entry field on the user interface of the remote UI device 100, an interface trigger event, such as a virtual key entry event, may be generated by the operating system of the UE 101, and, in accordance with various example embodiments, the interface trigger event may be intercepted, thereby inhibiting the presentation of a data entry input interface on the display of the UE 101. Instead of the UE 101 displaying the data entry interface (e.g., a virtual keyboard) locally on the UE 101's display, the UE 101 may then send a remote interface trigger message, such as a Virtual Keyboard Trigger message, to the remote UI device 100 to cause presentation of a remote data entry input interface on the remote UI device's display.
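As a non-limiting sketch of this interception step, the following Python fragment suppresses a hypothetical local interface trigger event and emits a remote interface trigger message in its place. The InterfaceTriggerEvent and RemoteTriggerInterceptor names, the dictionary message shape, and the boolean "consumed" convention are assumptions for illustration only and are not drawn from any particular operating system or from the Terminal Mode specifications.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class InterfaceTriggerEvent:
    """Event the operating system would normally use to raise the
    local virtual keyboard for a selected data entry field."""
    field_id: str


class RemoteTriggerInterceptor:
    """Sketch of the interception step: the local trigger event is
    consumed and a remote interface trigger message is sent instead."""

    def __init__(self, send_to_remote: Callable[[dict], None]) -> None:
        self.send_to_remote = send_to_remote

    def on_interface_trigger(self, event: InterfaceTriggerEvent) -> bool:
        # Returning True tells the (hypothetical) event dispatcher that the
        # event was consumed, so the local keyboard is never shown.
        self.send_to_remote(
            {"type": "remote_interface_trigger", "field_id": event.field_id}
        )
        return True


# Example wiring: messages are simply printed in place of a real link.
interceptor = RemoteTriggerInterceptor(send_to_remote=print)
interceptor.on_interface_trigger(InterfaceTriggerEvent(field_id="url_entry"))
```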
[0031] Further, if a data entry interface disabled event (e.g., a virtual key entry disabled event) occurs on the UE 101 to cause the local data entry input interface (the presentation of which may have been inhibited) to be closed or go off-screen, the data entry interface disabled event may be sent to the remote UI device 100 to cause removal of the remote data entry input interface from the display of the remote UI device 100. For example, a data entry interface disabled event can occur if the UE 101 presents, in addition to the remote data entry input interface, a list of predetermined values (such as search history) for a data entry field, and the user selects one of the values from the list instead of entering data via the remote data entry input interface.
[0032] According to some example embodiments, in addition to intercepting the interface trigger event, the UE 101 may be configured to intercept the type of data entry input interface that is being triggered for use on the UE 101. The various types of data entry input interfaces may include a QWERTY keyboard, numeric keypad, dialer, or the like. The UE 101 may therefore be configured to send a remote interface trigger message and the type of remote data entry input interface to the remote UI device 100, and the remote UI device 100 may responsively present the appropriate remote data entry input interface based on the received type.
[0033] According to some example embodiments, in addition to intercepting the interface trigger event, the UE 101 may be configured to optionally intercept the relative position and the relative size of the data entry input interface that is being triggered for use on the UE 101. Based on the user interface on the UE 101, the data entry input interface may be positioned at a specific location relative to the entire display and may also be of a specific size relative to the total display area. For example, if a list of predetermined values (such as a search history) is presented, the data entry input interface may occupy only 50% of the total screen area as opposed to occupying the entire screen. The UE 101 may therefore be optionally configured to send a remote interface trigger message and the desired relative position (x, y coordinate offsets) and relative size of the remote data entry input interface to the remote UI device 100, and the remote UI device 100 may responsively present the appropriate remote data entry input interface based on the received position and size information.
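The sketch below illustrates, under the same caveats as the previous fragments, how a remote interface trigger message might carry both the intercepted interface type and the optional relative geometry. The enumeration values, field names, and JSON encoding are hypothetical stand-ins rather than the message layout defined by VNC or the Terminal Mode specifications.

```python
from dataclasses import dataclass, asdict
from enum import Enum
import json


class InputInterfaceType(Enum):
    QWERTY = "qwerty"
    NUMERIC = "numeric"
    DIALER = "dialer"


@dataclass
class RemoteInterfaceTrigger:
    """Remote interface trigger message carrying the intercepted type and
    the optional relative geometry of the inhibited local interface."""
    interface_type: InputInterfaceType
    rel_x: float = 0.0       # horizontal offset as a fraction of display width
    rel_y: float = 0.5       # vertical offset as a fraction of display height
    rel_width: float = 1.0   # width as a fraction of display width
    rel_height: float = 0.5  # height as a fraction of display height


def encode_trigger(trigger: RemoteInterfaceTrigger) -> bytes:
    body = asdict(trigger)
    body["interface_type"] = trigger.interface_type.value
    return json.dumps(body).encode("utf-8")


# Example: a QWERTY keyboard that would have covered only the lower half of
# the UE's screen, e.g. because a list of predetermined values is also shown.
print(encode_trigger(RemoteInterfaceTrigger(InputInterfaceType.QWERTY)))
```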
[0034] In some example embodiments, the remote UI device 100 may be connected to one or more vehicle control and/or monitoring systems to receive vehicle context information. The context information may include parameters such as speed, visibility conditions, cruise control state, and the like. The remote UI device 100 may be configured to consider the context information when determining the type of remote data entry input interface to present on the screen of the remote UI device 100. For example, if the vehicle speed is high based on defined thresholds, and a QWERTY keyboard is needed, then a rotary non-touch speller controlled from the steering wheel can be displayed rather than the touch-based QWERTY virtual keyboard.
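A minimal Python sketch of such context-sensitive selection follows. The VehicleContext fields, the 10 km/h threshold, and the interface identifiers are illustrative assumptions; an actual head unit would apply the thresholds mandated by the applicable driver distraction guidelines.

```python
from dataclasses import dataclass


@dataclass
class VehicleContext:
    """Context information a head unit might obtain from vehicle systems."""
    speed_kmh: float
    low_visibility: bool = False


# Illustrative threshold only; real systems would use the limits defined by
# the driver distraction guidelines that govern the target market.
HIGH_SPEED_THRESHOLD_KMH = 10.0


def choose_remote_input_interface(requested: str, ctx: VehicleContext) -> str:
    """Pick the remote data entry input interface to present, downgrading a
    touch QWERTY keyboard to a steering-wheel rotary speller when the vehicle
    context indicates that touch typing would be distracting."""
    if requested == "qwerty" and (
        ctx.speed_kmh > HIGH_SPEED_THRESHOLD_KMH or ctx.low_visibility
    ):
        return "rotary_speller"
    return requested


print(choose_remote_input_interface("qwerty", VehicleContext(speed_kmh=80.0)))
print(choose_remote_input_interface("qwerty", VehicleContext(speed_kmh=0.0)))
```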
[0035] Intercepting the interface trigger event may be one manner in which a data entry input interface of the UE 101 can be inhibited. In some example embodiments, a terminal mode application may be used that temporarily replaces a local user interface application (e.g., a local virtual keyboard application) on the UE 101. This terminal mode application may be configured to only send remote interface trigger messages to the remote UI device 100 and to not generate interface trigger events local to the UE 101 while the application is being implemented. In this manner, implementation of the terminal mode application may operate to inhibit the presentation of the data entry input interface on the UE 101.
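The following Python sketch caricatures this replacement approach: a hypothetical input-method registry swaps the local virtual keyboard handler for a terminal mode handler that only emits remote trigger messages. The registry, handler names, and print statements are assumptions used purely to make the control flow concrete.

```python
from typing import Callable, Dict


class InputMethodRegistry:
    """Hypothetical registry mapping an input-method name to the handler
    invoked when a data entry field gains focus."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[str], None]] = {}

    def register(self, name: str, handler: Callable[[str], None]) -> None:
        self._handlers[name] = handler

    def activate(self, name: str, field_id: str) -> None:
        self._handlers[name](field_id)


def local_virtual_keyboard(field_id: str) -> None:
    print(f"showing local virtual keyboard for {field_id}")


def terminal_mode_input_method(field_id: str) -> None:
    # No local trigger event is generated; only the remote trigger is sent.
    print(f"sending remote interface trigger for {field_id}")


registry = InputMethodRegistry()
registry.register("keyboard", local_virtual_keyboard)
# Entering terminal mode temporarily replaces the local keyboard handler.
registry.register("keyboard", terminal_mode_input_method)
registry.activate("keyboard", "url_entry")
```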
[0036] According to some example embodiments, another option for inhibiting the presentation of a data entry input interface on a UE 101 may be to detect and prevent the intended start of the local user interface application which is configured to present the data entry input interface (e.g., the local virtual keyboard application), and to send a remote interface trigger message instead of starting the local user interface application.
[0037] According to some additional example embodiments, inhibiting the presentation of a data entry input interface on a UE 101 may involve monitoring and intercepting the inter-process communications on the UE 101. This option may be implemented in situations where, for example, the local user interface application that causes the data entry input interface to be presented is continuously running, possibly in the background.
[0038] In view of the foregoing, FIG. 5 illustrates a signaling and flow diagram of example methods of the present invention from a system perspective, as well as from the perspectives of each of the remote UI device 100 and the UE 101. Within the context of FIG. 5, the remote UI device 100 and the UE 101 share a communications connection that permits the user interface of the UE 101 (or a subset thereof) to be projected or replicated onto the user interface of the remote UI device 100. In this regard, the remote UI device 100 may be connected to the UE 101 using a remote framebuffer/desktop protocol while implementing a terminal mode that projects the UE 101's screen, or a portion of the screen, on the display of the remote UI device 100.
[0039] At 120, the remote UI device 100 receives a data entry field selection. In this regard, for example, a user may touch a data entry field (e.g., a text entry field) on the display of the remote UI device 100 to perform the selection that is received by the remote UI device 100. In response to the selection of a data entry field, the remote UI device 100 may be configured to transmit an indication of the selection at 121 in the form of a data entry field selection message. The UE 101 may receive the data entry field selection message at 122 as a notification that selection of a data entry field has occurred. The data entry field selection message may include a description of a touch event at particular coordinates of the display, and the UE 101 may determine that the event is a selection of a data entry field upon analyzing the coordinates relative to the currently presented content. The data entry field selection message may be sent via a remote protocol, for example as a VNC Pointer Event message, to the UE 101. At 123, the UE 101 may be configured to locally emulate a local field selection based on the parameters of the data entry field selection message.
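On the UE side, resolving the reported coordinates against the currently presented content can be pictured as a simple hit test, as in the hedged Python sketch below. The DataEntryField record, its pixel geometry, and the resolve_field_selection helper are hypothetical; a real UE would consult its UI toolkit's view hierarchy instead.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class DataEntryField:
    """A data entry field in the UE's currently presented content."""
    field_id: str
    x: int
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)


def resolve_field_selection(
    fields: List[DataEntryField], px: int, py: int
) -> Optional[DataEntryField]:
    """Map the coordinates in a data entry field selection message (e.g. a
    forwarded pointer event) onto the currently presented content. The UE
    would then emulate the selection locally on the matched field."""
    for field in fields:
        if field.contains(px, py):
            return field
    return None


fields = [DataEntryField("url_entry", x=0, y=0, width=480, height=60)]
selected = resolve_field_selection(fields, px=320, py=48)
print(selected.field_id if selected else "no data entry field at coordinates")
```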
[0040] According to some example embodiments, rather than receiving an indication of a selection at the remote UI device 100, the UE 101 may generate an indication, possibly locally, that a data entry input interface is needed to permit a user to input data. In this regard, the UE 101 may be notified that data entry by a user is desired. As such, this generated indication need not originate from the remote UI device 100. For example, the UE 101 may be implementing an application that may require data entry at, for example, a particular time, or based on some other criteria that is not dependent on the remote UI device 100.
[0041] At 124, the UE 101 may be configured to inhibit the presentation of the data entry input interface as described above. In this regard, according to some example embodiments, the UE 101 may be configured to generate an interface trigger event, local to the UE 101, to present the data entry input interface on a display of the UE 101 in response to receiving the data entry field selection. Inhibiting the presentation of the data entry input interface may include intercepting and suppressing the trigger event. In some example embodiments, inhibiting the presentation of the data entry input interface may include implementing a terminal mode application on the UE 101, where the terminal mode application is configured to inhibit generation of a trigger event that would cause the presentation of the data entry input interface on the display of the UE 101.
[0042] Upon detection of the intercepted attempt to trigger the data entry input interface, or in response to a notification that data entry is desired, the UE 101 may be configured to transmit a remote interface trigger message at 125, and the remote UI device 100 may receive the remote interface trigger message at 126. In this manner, the UE 101 may trigger the presentation of a remote data entry interface (e.g., a virtual keyboard) on the remote UI device 100, which may enable the entry of data in an associated data entry field. The remote interface trigger message may be a VNC Terminal Mode (TM) Virtual Keyboard Trigger message, and the message may notify the remote UI device 100 of the need for data (e.g., text) input support.
[0043] At 127, the remote UI device 100 may be configured to present a remote data entry input interface to permit user entry of data via the interface. In some example embodiments, the type of remote data entry input interface may be determined based on information provided in the remote interface trigger message and/or based on context information of a vehicle that the remote UI device 100 is installed within.
[0044] According to various example embodiments described herein, inhibiting the presentation of the data entry input interface on the UE 101 and responsively triggering the presentation of a remote data entry input interface on the remote UI device 100 provides a number of advantages. For example, according to some example embodiments, a stable and reliable trigger can be implemented for presenting the remote data entry input interface, where false positives and misses are reduced or eliminated, and the triggering can be based on user action. Further, according to some example embodiments, implementation need not require changes to existing legacy applications. Additionally, according to some example embodiments, context information of a vehicle associated with the remote UI device 100 may be used to determine which type of remote data entry input interface is to be presented.
[0045] The description provided above and generally herein illustrates example methods, example apparatuses, and example computer programs stored on a non-transitory computer readable medium for triggering a remote data entry interface. FIGs. 6 and 7 illustrate example apparatus embodiments of the present invention configured to perform the various functionalities described herein. FIG. 6 depicts an example apparatus that is configured to perform various functionalities from the perspective of a UE (e.g., UE 101) as described with respect to FIGs. 1-5 and as generally described herein. FIG. 7 depicts an example UE apparatus in the form of a more specific mobile terminal configured to perform various functionalities from the perspective of a UE 101 depicted in FIGs. 1-5 and as generally described herein. The example apparatuses depicted in FIGs. 6 and 7 may also be configured to perform example methods of the present invention, such as those described with respect to FIG. 5.
[0046] Referring now to FIG. 6, in some example embodiments, the apparatus 200 may be embodied as, or included as a component of, a communications device with wired and/or wireless communications capabilities. In this regard, the apparatus 200 may be configured to operate in accordance with the functionality of a UE as described herein. In some example embodiments, the apparatus 200 may be part of a communications device (e.g., UE 101), such as a stationary or a mobile terminal. As a mobile terminal, the apparatus 200 may be a mobile computer, a mobile telephone, a portable digital assistant (PDA), a pager, a mobile television, a gaming device, a laptop computer, a camera, a video recorder, an audio/video player, a radio, a smart phone, a tablet or pad device and/or a global positioning system (GPS) device, any combination of the aforementioned, or the like. Regardless of the type of communications device, apparatus 200 may also include computing capabilities.
[0047] The example apparatus 200 includes or is otherwise in communication with a processor 205, a memory device 210, an Input/Output (I/O) interface 206, a communications interface 220, a user interface 215, and a remote UI server module 230. The processor 205 may be embodied as various means for implementing the various functionalities of example embodiments of the present invention including, for example, a microprocessor, a coprocessor, a controller, a special-purpose integrated circuit such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or a hardware accelerator, processing circuitry or the like. According to one example embodiment, processor 205 may be representative of a plurality of processors, or one or more multiple core processors, operating in concert. Further, the processor 205 may be comprised of a plurality of transistors, logic gates, a clock (e.g., oscillator), other circuitry, and the like to facilitate performance of the functionality described herein. The processor 205 may, but need not, include one or more accompanying digital signal processors. In some example embodiments, the processor 205 is configured to execute instructions stored in the memory device 210 or instructions otherwise accessible to the processor 205. The processor 205 may be configured to operate such that the processor causes the apparatus 200 to perform various functionalities described herein.
[0048] Whether configured as hardware or via instructions stored on a computer-readable storage medium, or by a combination thereof, the processor 205 may be an entity capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, in example embodiments where the processor 205 is embodied as, or is part of, an ASIC, FPGA, or the like, the processor 205 is specifically configured hardware for conducting the operations described herein. Alternatively, in example embodiments where the processor 205 is embodied as an executor of instructions or computer program code stored on a non-transitory computer-readable storage medium, the instructions specifically configure the processor 205 to perform the algorithms and operations described herein (e.g., those described with respect to FIG. 5). In some example embodiments, the processor 205 is a processor of a specific device (e.g., a mobile terminal) configured for employing example embodiments of the present invention by further configuration of the processor 205 via executed instructions for performing the algorithms, methods, and operations described herein.
[0049] The memory device 210 may be one or more non-transitory computer-readable storage media that may include volatile and/or non-volatile memory. In some example embodiments, the memory device 210 includes Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like. Further, memory device 210 may include non-volatile memory, which may be embedded and/or removable, and may include, for example, read-only memory, flash memory, magnetic storage devices (e.g., hard disks, floppy disk drives, magnetic tape, etc.), optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like. Memory device 210 may include a cache area for temporary storage of data. In this regard, some or all of memory device 210 may be included within the processor 205.
[0050] Further, the memory device 210, which may be one or more memory devices, may be configured to store information, data, applications, computer-readable program code instructions, and/or the like for enabling the processor 205 and the example apparatus 200 to carry out various functions in accordance with example embodiments of the present invention described herein. For example, the memory device 210 could be configured to buffer input data for processing by the processor 205. Additionally, or alternatively, the memory device 210 may be configured to store instructions for execution by the processor 205.
[0051] The I/O interface 206 may be any device, circuitry, or means embodied in hardware, software, or a combination of hardware and software that is configured to interface the processor 205 with other circuitry or devices, such as the communications interface 220 and the user interface 215. In some example embodiments, the processor 205 may interface with the memory 210 via the I/O interface 206. The I/O interface 206 may be configured to convert signals and data into a form that may be interpreted by the processor 205. The I/O interface 206 may also perform buffering of inputs and outputs to support the operation of the processor 205. According to some example embodiments, the processor 205 and the I/O interface 206 may be combined onto a single chip or integrated circuit configured to perform, or cause the apparatus 200 to perform, various functionalities of the present invention.
[0052] The communication interface 220 may be any device or means (e.g., circuitry) embodied in hardware, a computer program product, or a combination of hardware and a computer program product that is configured to receive and/or transmit data from/to a network 225 and/or any other device or module in communication with the example apparatus 200 (e.g., remote UI device 100). The communications interface may be configured to communicate information via any type of wired or wireless connection, and via any type of communications protocol, such as a communications protocol that supports cellular communications or near field communications. According to various example embodiments, the communication interface 220 may be configured to support the transmission and reception of communications in a variety of networks including, but not limited to, Internet Protocol-based networks (e.g., the Internet), cellular networks, or the like. Further, the communications interface 220 may be configured to support device-to-device communications, such as in a mobile ad hoc network (MANET). Processor 205 may also be configured to facilitate communications via the communications interface 220 by, for example, controlling hardware comprised within the communications interface 220. In this regard, the communication interface 220 may comprise, for example, communications driver circuitry (e.g., circuitry that supports wired communications via, for example, fiber optic connections), one or more antennas, a transmitter, a receiver, a transceiver and/or supporting hardware, including, for example, a processor for enabling communications. Via the communication interface 220, the example apparatus 200 may communicate with various other network entities in a device-to-device fashion and/or via indirect communications via a base station, access point, server, gateway, router, or the like.
[0053] The user interface 215 may be in communication with the processor 205 to receive user input via the user interface 215 and/or to present output to a user as, for example, audible, visual, mechanical or other output indications. The user interface 215 may include, for example, a keyboard, a mouse, a joystick, a display (e.g., a touch screen display), a microphone, a speaker, or other input/output mechanisms. Further, the processor 205 may comprise, or be in communication with, user interface circuitry configured to control at least some functions of one or more elements of the user interface. The processor 205 and/or user interface circuitry may be configured to control one or more functions of one or more elements of the user interface through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor 205 (e.g., volatile memory, non-volatile memory, and/or the like). In some example embodiments, the user interface circuitry is configured to facilitate user control of at least some functions of the apparatus 200 through the use of a display and configured to respond to user inputs. The processor 205 may also comprise, or be in communication with, display circuitry configured to display at least a portion of a user interface, the display and the display circuitry configured to facilitate user control of at least some functions of the apparatus 200.
[0054] The remote UI server module 230 of example apparatus 200 may be any means or device embodied, partially or wholly, in hardware, a non-transitory computer readable medium having a computer program stored thereon, or a combination of hardware and a non-transitory computer readable medium having a computer program stored thereon, such as processor 205 implementing stored instructions to configure the example apparatus 200, or a hardware configured processor 205, that is configured to carry out the functions of the remote UI server module 230 as described herein. In an example embodiment, the processor 205 includes, or controls, the remote UI server module 230. The remote UI server module 230 may be, partially or wholly, embodied as processors similar to, but separate from, processor 205. In this regard, the remote UI server module 230 may be in communication with the processor 205. In various example embodiments, the remote UI server module 230 may, partially or wholly, reside on differing apparatuses such that some or all of the functionality of the remote UI server module 230 may be performed by a first apparatus, and the remainder of the functionality of the remote UI server module 230 may be performed by one or more other apparatuses.
[0055] The apparatus 200 and the processor 205 may be configured to perform the following functionality via the remote UI server module 230. In this regard, the remote UI server module 230 may be configured to receive, at the processor 205 and/or the apparatus 200, a data entry field selection message or other type of notification that data entry is desired. The remote UI server module 230 may also be configured to inhibit a presentation of a data entry input interface on a display of the device, and cause a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
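Those three responsibilities can be summarized in the following hedged Python skeleton. The class name, method names, and message dictionary are hypothetical conveniences; they merely tie together the interception and triggering steps sketched earlier and do not describe a specific product implementation.

```python
from typing import Callable


class RemoteUIServerModule:
    """Skeleton of the remote UI server module's three responsibilities:
    receive a notification that data entry is desired, inhibit the local
    data entry input interface, and send a remote interface trigger."""

    def __init__(self, send_to_remote: Callable[[dict], None]) -> None:
        self.send_to_remote = send_to_remote
        self.local_input_inhibited = False

    def on_data_entry_field_selection(self, field_id: str) -> None:
        self.inhibit_local_input_interface()
        self.send_to_remote(
            {"type": "remote_interface_trigger", "field_id": field_id}
        )

    def inhibit_local_input_interface(self) -> None:
        # Stands in for intercepting the local interface trigger event or
        # swapping in a terminal mode input application, as described above.
        self.local_input_inhibited = True


module = RemoteUIServerModule(send_to_remote=print)
module.on_data_entry_field_selection("url_entry")
```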
[0056] In some example embodiments, the remote UI server module 230 may be further configured to generate an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection message. In some example embodiments, the remote UI server module 230 being configured to inhibit the presentation of the data entry input interface includes being configured to intercept the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device. Additionally or alternatively, the remote UI server module 230 may be further configured to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event. In some example embodiments, the remote UI server module 230 may additionally or alternatively be configured to implement a terminal mode application on the device, the terminal mode application being configured to inhibit generation of an interface trigger event that would cause the presentation of the data entry input interface on the display of the device. In some example embodiments, the remote UI server module 230 may be configured to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event, the remote interface trigger message including the type of data entry input interface to be presented. Additionally or alternatively, the remote UI server module 230 may be configured to cause the remote interface trigger message to be sent to the remote device to direct the remote device to present a remote data entry input interface on a display of the remote device, the remote data entry input interface being presented based on context information of an environment in which the remote device is installed.
[0057] Referring now to FIG. 7, a more specific example apparatus in accordance with various embodiments of the present invention is provided. The example apparatus of FIG. 7 is a mobile terminal 10 configured to communicate within a wireless network, such as a cellular communications network. The mobile terminal 10 may be configured to perform the functionality of UE 101 and/or apparatus 200 as described herein. In some example embodiments, the mobile terminal 10 may be caused to perform the functionality of the remote UI server module 230 via the processor 20. In this regard, processor 20 may be an integrated circuit or chip configured similar to the processor 205 together with, for example, the I/O interface 206. Further, volatile memory 40 and non-volatile memory 42 may be configured to support the operation of the processor 20 as computer readable storage media.
[0058] The mobile terminal 10 may further include an antenna 12, a transmitter 14, and a receiver 16, which may be included as parts of a communications interface of the mobile terminal 10. The speaker 24, the microphone 26, the display 28, and the keypad 30 may be included as parts of a user interface.
[0059] As described above, FIG. 5 illustrates flowcharts of example systems, methods, and/or computer programs stored on a non-transitory computer readable medium (e.g., computer program product) according to example embodiments of the invention. It will be understood that each block or operation of the flowcharts, and/or combinations of blocks or operations in the flowcharts, can be implemented by various means. Means for implementing the blocks or operations of the flowcharts, combinations of the blocks or operations in the flowchart, or other functionality of example embodiments of the present invention described herein may include hardware, and/or a non-transitory computer-readable storage medium having one or more computer program code instructions, program instructions, or executable computer-readable program code instructions stored therein. In this regard, program code instructions may be stored on a memory device, such as memory device 210, of an example apparatus, such as example apparatus 200, and executed by a processor, such as processor 205. As will be appreciated, any such program code instructions may be loaded onto a computer or other programmable apparatus (e.g., processor 205, memory device 210, or the like) from a computer-readable storage medium to produce a particular machine, such that the particular machine becomes a means for implementing the functions specified in the flowcharts' block(s) or operation(s). These program code instructions may also be stored in a computer-readable storage medium that can direct a computer, a processor, or other programmable apparatus to function in a particular manner to thereby generate a particular machine or particular article of manufacture. The instructions stored in the computer-readable storage medium may produce an article of manufacture, where the article of manufacture becomes a means for implementing the functions specified in the flowcharts' block(s) or operation(s). The program code instructions may be retrieved from a computer-readable storage medium and loaded into a computer, processor, or other programmable apparatus to configure the computer, processor, or other programmable apparatus to execute operations to be performed on or by the computer, processor, or other programmable apparatus. Retrieval, loading, and execution of the program code instructions may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time. In some example embodiments, retrieval, loading and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together. Execution of the program code instructions may produce a computer-implemented process such that the instructions executed by the computer, processor, or other programmable apparatus provide operations for implementing the functions specified in the flowcharts' block(s) or operation(s).
[0060] Accordingly, execution of instructions associated with the blocks or operations of the flowchart by a processor, or storage of instructions associated with the blocks or operations of the flowcharts in a computer-readable storage medium, support combinations of operations for performing the specified functions. It will also be understood that one or more blocks or operations of the flowcharts, and combinations of blocks or operations in the flowcharts, may be implemented by special purpose hardware-based computer systems and/or processors which perform the specified functions, or combinations of special purpose hardware and program code instructions.
[0061] Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions other than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

WHAT IS CLAIMED IS:
1. A method comprising:
receiving, at a device, a data entry field selection message notifying that data entry is desired;
inhibiting a presentation of a data entry input interface on a display of the device; and
causing a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
2. The method of claim 1, wherein the method further comprises generating an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection message; and
wherein inhibiting the presentation of the data entry input interface includes intercepting the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device.
3. The method of claim 2, wherein causing the remote interface trigger message to be sent includes causing the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event.
4. The method of claim 1, wherein inhibiting the presentation of the data entry input interface is performed by implementing a terminal mode application on the device, the terminal mode application being configured to inhibit generation of an interface trigger event that would cause the presentation of the data entry input interface on the display of the device.
5. The method of claim 1, wherein the method further comprises generating an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection, the interface trigger event indicating a type of data entry input interface to be presented;
wherein inhibiting the presentation of the data entry input interface includes intercepting the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device; and
wherein causing the remote interface trigger message to be sent includes causing the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event, the remote interface trigger message including the type of data entry input interface to be presented.
6. The method of claim 1, wherein causing the remote interface trigger message to be sent includes causing the remote interface trigger message to be sent to the remote device to direct the remote device to present a remote data entry input interface on a display of the remote device, the remote data entry input interface being presented based on context information of an environment in which the remote device is installed.
7. An apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, direct the apparatus at least to:
receive, at a device, a data entry field selection message notifying that data entry is desired;
inhibit a presentation of a data entry input interface on a display of the device; and
cause a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
8. The apparatus of claim 7, wherein the apparatus is further directed to generate an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection message; and
wherein the apparatus directed to inhibit the presentation of the data entry input interface includes being directed to intercept the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device.
9. The apparatus of claim 8, wherein the apparatus directed to cause the remote interface trigger message to be sent includes being directed to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event.
10. The apparatus of claim 7, wherein the apparatus directed to inhibit the presentation of the data entry input interface includes being directed to implement a terminal mode application on the device, the terminal mode application being configured to inhibit generation of an interface trigger event that would cause the presentation of the data entry input interface on the display of the device.
11. The apparatus of claim 7, wherein the apparatus is further configured to generate an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection, the interface trigger event indicating a type of data entry input interface to be presented;
wherein the apparatus directed to inhibit the presentation of the data entry input interface includes being directed to intercept the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device; and
wherein the apparatus directed to cause the remote interface trigger message to be sent includes being directed to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event, the remote interface trigger message including the type of data entry input interface to be presented.
12. The apparatus of claim 7, wherein the apparatus directed to cause the remote interface trigger message to be sent includes being directed to cause the remote interface trigger message to be sent to the remote device to direct the remote device to present a remote data entry input interface on a display of the remote device, the remote data entry input interface being presented based on context information of an environment in which the remote device is installed.
13. The apparatus of claim 7, wherein the apparatus comprises the device, the device being a mobile communications device.
14. The apparatus of claim 13, wherein the apparatus further comprises a communications interface including an antenna, the communications interface being configured to establish a connection with the remote device.
15. A non-transitory computer readable medium having computer program code stored thereon, the computer program code being configured to, when executed, direct an apparatus to:
receive, at a device, a data entry field selection message notifying that data entry is desired;
inhibit a presentation of a data entry input interface on a display of the device; and
cause a remote interface trigger message to be sent to a remote device to direct the remote device to present a remote data entry input interface on a display of the remote device.
16. The medium of claim 15, wherein the program code is further configured to direct the apparatus to generate an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection message; and
wherein the program code configured to direct the apparatus to inhibit the presentation of the data entry input interface includes being configured to direct the apparatus to intercept the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device.
17. The medium of claim 16, wherein the program code configured to direct the apparatus to cause the remote interface trigger message to be sent includes being configured to direct the apparatus to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event.
18. The medium of claim 15, wherein the program code configured to direct the apparatus to inhibit the presentation of the data entry input interface includes being configured to direct the apparatus to implement a terminal mode application on the device, the terminal mode application being configured to inhibit generation of an interface trigger event that would cause the presentation of the data entry input interface on the display of the device.
19. The medium of claim 15, wherein the program code is further configured to direct the apparatus to generate an interface trigger event, local to the device, to present the data entry input interface on the display of the device in response to receiving the data entry field selection, the interface trigger event indicating a type of data entry input interface to be presented;
wherein the program code configured to direct the apparatus to inhibit the presentation of the data entry input interface includes being configured to direct the apparatus to intercept the interface trigger event to thereby inhibit the presentation of the data entry input interface on the display of the device; and
wherein the program code configured to direct the apparatus to cause the remote interface trigger message to be sent includes being configured to direct the apparatus to cause the remote interface trigger message to be sent to the remote device in response to intercepting the interface trigger event, the remote interface trigger message including the type of data entry input interface to be presented.
20. The medium of claim 15, wherein the program code configured to direct the apparatus to cause the remote interface trigger message to be sent includes being configured to direct the apparatus to cause the remote interface trigger message to be sent to the remote device to direct the remote device to present a remote data entry input interface on a display of the remote device, the remote data entry input interface being presented based on context information of an environment in which the remote device is installed.
PCT/FI2012/050733 2011-07-21 2012-07-16 Method and apparatus for triggering a remote data entry interface WO2013011197A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2014520694A JP6289368B2 (en) 2011-07-21 2012-07-16 Method and apparatus for triggering a remote data input interface
EP12814612.3A EP2735132B1 (en) 2011-07-21 2012-07-16 Method and apparatus for triggering a remote data entry interface
KR1020147004415A KR20140049000A (en) 2011-07-21 2012-07-16 Method and apparatus for triggering a remote data entry interface
CN201280045698.8A CN103828336B (en) 2011-07-21 2012-07-16 For the method and apparatus triggering teledata typing interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/188,081 2011-07-21
US13/188,081 US10564791B2 (en) 2011-07-21 2011-07-21 Method and apparatus for triggering a remote data entry interface

Publications (2)

Publication Number Publication Date
WO2013011197A2 true WO2013011197A2 (en) 2013-01-24
WO2013011197A3 WO2013011197A3 (en) 2013-04-25

Family

ID=47556700

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2012/050733 WO2013011197A2 (en) 2011-07-21 2012-07-16 Method and apparatus for triggering a remote data entry interface

Country Status (6)

Country Link
US (1) US10564791B2 (en)
EP (1) EP2735132B1 (en)
JP (1) JP6289368B2 (en)
KR (1) KR20140049000A (en)
CN (1) CN103828336B (en)
WO (1) WO2013011197A2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013027908A1 (en) * 2011-08-25 2013-02-28 Lg Electronics Inc. Mobile terminal, image display device mounted on vehicle and data processing method using the same
EP2766801A4 (en) * 2011-10-13 2015-04-22 Lg Electronics Inc Input interface controlling apparatus and method thereof
KR101891259B1 (en) * 2012-04-04 2018-09-28 삼성전자주식회사 Intelligent Output supporting Method for Event Information And Electro Device supporting the same
US9904461B1 (en) * 2013-03-14 2018-02-27 Parallels IP Holdings GmbH Method and system for remote text selection using a touchscreen device
US9380143B2 (en) * 2013-08-30 2016-06-28 Voxx International Corporation Automatically disabling the on-screen keyboard of an electronic device in a vehicle
KR102394202B1 (en) * 2015-05-29 2022-05-04 삼성전자주식회사 Method for processing input between devices and electronic device thereof
US10382927B2 (en) * 2015-08-20 2019-08-13 Samsung Electronics Co., Ltd. Method of text input for wearable devices
US10126945B2 (en) * 2016-06-10 2018-11-13 Apple Inc. Providing a remote keyboard service
CN116521299A (en) * 2016-08-14 2023-08-01 利维帕尔森有限公司 Method and apparatus for real-time remote control of mobile applications
WO2018160770A2 (en) * 2017-02-28 2018-09-07 Woods Michael E Communicator
US11243679B2 (en) * 2018-06-03 2022-02-08 Apple Inc. Remote data input framework

Family Cites Families (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7831930B2 (en) 2001-11-20 2010-11-09 Universal Electronics Inc. System and method for displaying a user interface for a remote control application
JP2000298634A (en) 1999-04-15 2000-10-24 Hitachi Ltd Information distribution system
US6721950B1 (en) 2000-04-06 2004-04-13 Microsoft Corporation Input redirection
KR100474724B1 (en) * 2001-08-04 2005-03-08 삼성전자주식회사 Apparatus having touch screen and external display device using method therefor
JP2003244343A (en) 2002-02-21 2003-08-29 Toyota Motor Corp Display device, portable terminal and information display system
JP2004170708A (en) 2002-11-20 2004-06-17 Sony Corp System, device, and method for window display, and program
DE602004003789T2 (en) 2004-02-26 2007-04-05 Alcatel Method for entering destination information via a mobile terminal
JP2005265572A (en) 2004-03-18 2005-09-29 Xanavi Informatics Corp Operation method for on-vehicle information terminal, on-vehicle information terminal, program for portable terminal, and portable phone
JP2006017478A (en) 2004-06-30 2006-01-19 Xanavi Informatics Corp Navigation system
JP2007025808A (en) 2005-07-12 2007-02-01 Canon Inc Virtual keyboard system and its control method
US20070050054A1 (en) * 2005-08-26 2007-03-01 Sony Ericssson Mobile Communications Ab Mobile communication terminal with virtual remote control
JP4695474B2 (en) 2005-09-21 2011-06-08 株式会社東芝 Composite video control apparatus, composite video control method, and program
US8191008B2 (en) 2005-10-03 2012-05-29 Citrix Systems, Inc. Simulating multi-monitor functionality in a single monitor environment
JP2008003093A (en) 2005-12-21 2008-01-10 Masahiro Izutsu Portable information communication apparatus, external output unit or i/o unit for use in portable information communication apparatus, on-board information communication system centered around portable information communication apparatus, and road traffic information providing system for providing on-board information communication system with information
US7606660B2 (en) * 2005-12-31 2009-10-20 Alpine Electronics, Inc. In-vehicle navigation system with removable navigation unit
JP4872451B2 (en) 2006-05-15 2012-02-08 トヨタ自動車株式会社 Vehicle input device
JP4361080B2 (en) * 2006-11-28 2009-11-11 インターナショナル・ビジネス・マシーンズ・コーポレーション Method, program, and apparatus for generating image data
WO2008079891A2 (en) * 2006-12-20 2008-07-03 Johnson Controls Technology Company Remote display reproduction system and method
TWI334569B (en) * 2007-05-15 2010-12-11 Ind Tech Res Inst System and method of dual-screen interactive digital television
EP2181426A1 (en) * 2007-07-25 2010-05-05 Intraco Technology Pte Ltd A content management and delivery system
JP2009035024A (en) 2007-07-31 2009-02-19 Fujitsu Ten Ltd On-board electronic system and device
JP4908360B2 (en) 2007-09-19 2012-04-04 株式会社東芝 Portable information terminal linkage system, linkage processing program, and linkage processing device
JP2009090690A (en) 2007-10-03 2009-04-30 Pioneer Electronic Corp Touch panel device
US20090156251A1 (en) * 2007-12-12 2009-06-18 Alan Cannistraro Remote control protocol for media systems controlled by portable devices
JP4241883B2 (en) * 2008-04-28 2009-03-18 ソニー株式会社 Text input device and method
WO2009143294A2 (en) * 2008-05-20 2009-11-26 Citrix Systems, Inc. Methods and systems for using external display devices with a mobile computing device
JP2009281991A (en) 2008-05-26 2009-12-03 Fujitsu Ten Ltd On-board display control apparatus and on-board display control method
US9716774B2 (en) * 2008-07-10 2017-07-25 Apple Inc. System and method for syncing a user interface on a server device to a user interface on a client device
TWI379592B (en) * 2008-12-31 2012-12-11 Mediatek Inc Display systems and methods
US9189124B2 (en) 2009-04-15 2015-11-17 Wyse Technology L.L.C. Custom pointer features for touch-screen on remote client devices
US10244056B2 (en) * 2009-04-15 2019-03-26 Wyse Technology L.L.C. Method and apparatus for transferring remote session data
JP5198355B2 (en) 2009-05-19 2013-05-15 株式会社日立製作所 Portable communication terminal, input / output device communicating with the same, system including these, program for remote operation of portable communication terminal
US9241062B2 (en) * 2009-05-20 2016-01-19 Citrix Systems, Inc. Methods and systems for using external display devices with a mobile computing device
US9014685B2 (en) * 2009-06-12 2015-04-21 Microsoft Technology Licensing, Llc Mobile device which automatically determines operating mode
JP4843696B2 (en) 2009-06-30 2011-12-21 株式会社東芝 Information processing apparatus and touch operation support program
US9690599B2 (en) 2009-07-09 2017-06-27 Nokia Technologies Oy Method and apparatus for determining an active input area
US8818624B2 (en) 2009-10-05 2014-08-26 Tesla Motors, Inc. Adaptive soft buttons for a vehicle user interface
KR101058525B1 (en) * 2009-10-09 2011-08-23 삼성전자주식회사 Text input method and display device using the same
TW201117982A (en) 2009-11-23 2011-06-01 Htc Corp Electronic system applied to a transport and related control method
JP5534161B2 (en) 2009-12-07 2014-06-25 アルパイン株式会社 User interface device
EP2553561A4 (en) * 2010-04-01 2016-03-30 Citrix Systems Inc Interacting with remote applications displayed within a virtual desktop of a tablet computing device
US8990702B2 (en) * 2010-09-30 2015-03-24 Yahoo! Inc. System and method for controlling a networked display
US20110267291A1 (en) * 2010-04-28 2011-11-03 Jinyoung Choi Image display apparatus and method for operating the same
KR101695810B1 (en) * 2010-05-07 2017-01-13 엘지전자 주식회사 Mobile terminal and method for controlling thereof
JP4818454B1 (en) * 2010-08-27 2011-11-16 株式会社東芝 Display device and display method
US9400585B2 (en) * 2010-10-05 2016-07-26 Citrix Systems, Inc. Display management for native user experiences
WO2012048087A2 (en) * 2010-10-06 2012-04-12 Citrix Systems, Inc. Mediating resource access based on a physical location of a mobile device
US20120256842A1 (en) * 2011-04-06 2012-10-11 Research In Motion Limited Remote user input
US10133361B2 (en) * 2011-06-06 2018-11-20 International Business Machines Corporation Device driver-level approach for utilizing a single set of interface input devices for multiple computing devices

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009073806A2 (en) 2007-12-05 2009-06-11 Johnson Controls Technology Company Vehicle user interface systems and methods
WO2011073947A1 (en) 2009-12-18 2011-06-23 Nokia Corporation Method and apparatus for projecting a user interface via partition streaming

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BOSE R. ET AL.: "Morphing Smartphones into Automotive Application Platforms", Computer, IEEE, vol. 44, 1 May 2011, pages 53-61
JÖRG BRAKENSIEK: "Terminal Mode Technical Architecture", 1 January 2010 (2010-01-01), pages 1-87
RAJA BOSE ET AL.: "Terminal Mode - Transforming Mobile Devices into Automotive Application Platforms", Proceedings of the 2nd International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI '10, 11 November 2010 (2010-11-11), pages 148
See also references of EP2735132A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2818997A1 (en) * 2013-06-28 2014-12-31 BlackBerry Limited Generating message notifications providing direct actions
US9467824B2 (en) 2013-06-28 2016-10-11 BlackBerry Limited Generating message notifications providing direct actions
EP3693845A1 (en) * 2013-06-28 2020-08-12 BlackBerry Limited Generating message notifications providing direct actions

Also Published As

Publication number Publication date
EP2735132A4 (en) 2015-03-18
WO2013011197A3 (en) 2013-04-25
EP2735132A2 (en) 2014-05-28
US20130024777A1 (en) 2013-01-24
JP2014521179A (en) 2014-08-25
US10564791B2 (en) 2020-02-18
CN103828336A (en) 2014-05-28
JP6289368B2 (en) 2018-03-07
KR20140049000A (en) 2014-04-24
EP2735132B1 (en) 2020-09-09
CN103828336B (en) 2016-10-05

Similar Documents

Publication Title
US10564791B2 (en) Method and apparatus for triggering a remote data entry interface
US11868539B2 (en) Display control method and apparatus
EP2735133B1 (en) Method and apparatus for providing data entry content to a remote environment
US20220413670A1 (en) Content Sharing Method and Electronic Device
KR102024187B1 (en) User terminal device and method for displaying thereof
US20110214162A1 (en) Method and apparatus for providing cooperative enablement of user input options
KR20150047451A (en) Method, apparatus and terminal device for displaying messages
US11455075B2 (en) Display method when application is exited and terminal
KR20150046765A (en) Method, apparatus and terminal device for selecting character
CN108780400B (en) Data processing method and electronic equipment
WO2015014138A1 (en) Method, device, and equipment for displaying display frame
CN108491125B (en) Operation control method of application store and mobile terminal
US20160353407A1 (en) Methods and systems for notification management between an electronic device and a wearable electronic device
US10496190B2 (en) Redrawing a user interface based on pen proximity
US10572147B2 (en) Enabling perimeter-based user interactions with a user device
KR20120010529A (en) Method for multiple display and mobile terminal using this method
CN107317919B (en) Communication message reply method and device and mobile terminal
KR102479448B1 (en) Electronic apparatus and operating method thereof

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 2012814612; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2014520694; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20147004415; Country of ref document: KR; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 12814612; Country of ref document: EP; Kind code of ref document: A2)