US20130027315A1 - Techniques to display an input device on a mobile device - Google Patents


Info

Publication number
US20130027315A1
Authority
US
United States
Prior art keywords
mobile device
touch sensitive
display
digital input
integrated touch
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/189,706
Inventor
Arther Sing Hook Teng
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Application filed by Intel Corp
Priority to US13/189,706
Assigned to INTEL CORPORATION. Assignment of assignors interest (see document for details). Assignors: TENG, Arther Sing Hook
Priority to CN201710941245.8A
Priority to PCT/US2012/048056
Priority to CN201280036752.2A
Publication of US20130027315A1
Priority to US15/854,108
Priority to US17/206,855
Legal status: Abandoned


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1698Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone

Definitions

  • a user may couple a mobile device to an external display, such as a liquid crystal display (LCD) monitor, via a wired or wireless connection.
  • the external display may have a larger amount of display area relative to the integrated display of the mobile device.
  • content presented on the smaller integrated display of a mobile device may be replicated to the larger external display to enhance viewing by a user.
  • the smaller integrated display of the mobile device typically continues to present the same view as that shown on the larger external display. This seems redundant given that one purpose of the larger external display is to become a primary viewing device for the mobile device. In some cases, the smaller integrated display may be turned off to conserve power to the mobile device. In both cases, however, the smaller integrated display remains an underutilized resource of the mobile device. It is with respect to these and other limitations that the present improvements are needed.
  • FIG. 1 illustrates an embodiment of an exemplary mobile system.
  • FIG. 2 illustrates an embodiment of a system within a mobile device.
  • FIGS. 3A-C illustrate embodiments of exemplary mobile systems.
  • FIG. 4 illustrates exemplary input devices presented on an integrated touch sensitive display of a mobile device.
  • FIG. 5 illustrates an embodiment of a first exemplary logic flow.
  • FIG. 6 illustrates an embodiment of a second exemplary logic flow.
  • FIG. 7 illustrates an embodiment of an exemplary computing architecture.
  • Embodiments are generally directed to techniques for managing multiple display devices coupled to a mobile device. Some embodiments are particularly directed to techniques for presenting different content on each display device coupled to a mobile device.
  • a mobile device may detect an external display device.
  • the mobile device may be coupled to the external display device via wired or wireless connections.
  • the mobile device may comprise an integrated touch sensitive display.
  • the mobile device may implement a display manager arranged to present different content on each of the display devices, rather than identical content as presented by conventional solutions.
  • the mobile device may detect an external display device.
  • a display manager may present a digital input device on an integrated touch sensitive display of a mobile device, while simultaneously presenting multimedia content from an application on an external display coupled to the mobile device.
  • the digital input device presented on an integrated touch sensitive display of a mobile device may operate the mobile device.
  • a digital input device may comprise, for example, a software version of a keyboard, a pointing device, a touch pad, or any other type of physical input device.
  • the integrated touch sensitive display may be used to replicate functions of a physical input device, without necessarily needing to connect any physical input devices to the mobile device.
  • an integrated touch sensitive display for a mobile device may be utilized as a custom input device for the mobile device, rather than simply presenting identical content as that presented on an external display device.
  • an integrated touch sensitive display is converted from a wasted resource to a productive resource of a mobile device.
  • multimedia may be presented on the external display.
  • the multimedia may be presented in response to the operation of the mobile device through the digital input device.
  • a physical input device may be coupled to a mobile device.
  • the integrated touch sensitive display may be used to augment functions of the physical input device.
  • a physical pointing device such as a mouse may be coupled to a mobile device, and used to select content presented on an external display.
  • the external display may not provide sufficient resolution or granularity to specifically pin-point content presented on the external display.
  • the integrated touch sensitive display may provide a more detailed version of an area around a pointer positioned on the external display, thereby allowing a user to select a more precise location within the detailed version by tapping on the integrated touch sensitive display at the desired location.
  • Other use scenarios exist as well.
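
As a concrete illustration of the scenario in the bullets above, the following minimal Python sketch models a display manager that, on detecting an external display, presents a digital input device on the integrated touch sensitive display and routes application content to the external display. The class and method names (DisplayManager, on_external_display_detected, and so on) are hypothetical and are not taken from the patent.

```python
# Illustrative sketch only: models the behavior described above, where the
# integrated touch sensitive display shows a digital input device while the
# external display shows the application's multimedia content.

class DisplayManager:
    def __init__(self):
        self.external_display = None           # e.g. an HDMI or WiDi sink
        self.integrated_display_mode = "mirror"

    def on_external_display_detected(self, display_name):
        """Switch the integrated display from mirroring to input-device duty."""
        self.external_display = display_name
        self.integrated_display_mode = "digital_input_device"
        return f"keyboard presented on integrated display; content routed to {display_name}"

    def on_touch(self, key):
        """A touch on the digital input device operates the mobile device; the
        resulting output is presented on the external display."""
        if self.integrated_display_mode != "digital_input_device":
            return None
        return f"'{key}' rendered on {self.external_display}"


manager = DisplayManager()
print(manager.on_external_display_detected("LCD monitor"))
print(manager.on_touch("a"))
```
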
  • Various embodiments may comprise one or more elements.
  • An element may comprise any structure arranged to perform certain operations.
  • Each element may be implemented as hardware, software, or any combination thereof, as desired for a given set of design parameters or performance constraints.
  • Although an embodiment may be described with a limited number of elements in a certain topology by way of example, the embodiment may include more or fewer elements in alternate topologies as desired for a given implementation.
  • any reference to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • FIG. 1 illustrates a block diagram of one embodiment of a mobile system 100 .
  • the mobile system 100 may comprise multiple nodes.
  • a node generally may comprise any physical or logical entity for communicating information in the mobile system 100 and may be implemented as hardware, software, or any combination thereof, as desired for a given set of design parameters or performance constraints.
  • An example of a node may comprise an electronic device, such as a mobile device or an external display.
  • Another example of a node may comprise a part or component of an electronic device, such as a display adapter or wireless transceiver.
  • Although FIG. 1 may show a limited number of nodes by way of example, it can be appreciated that more or fewer nodes may be employed for a given implementation.
  • the mobile system 100 may comprise, or form part of, a wired communications system, a wireless communications system, or a combination of both.
  • the mobile system 100 may include one or more nodes arranged to process and/or communicate information over one or more types of wired communication links.
  • Examples of a wired communication link may include, without limitation, a wire, cable, bus, printed circuit board (PCB), Ethernet connection, peer-to-peer (P2P) connection, backplane, switch fabric, semiconductor material, twisted-pair wire, co-axial cable, fiber optic connection, and so forth.
  • the mobile system 100 also may include one or more nodes arranged to communicate information over one or more types of wireless communication links, such as shared media 160 .
  • Examples of a wireless communication link may include, without limitation, a radio channel, infrared channel, radio-frequency (RF) channel, a High-Definition Multimedia Interface (HDMI) channel, a digital visual interface (DVI) channel, a video graphics array (VGA) channel, a Wireless Fidelity (WiFi) channel, a Wireless Display (WiDi) channel, a portion of the RF spectrum, and/or one or more licensed or license-free frequency bands.
  • the wireless nodes may include one or more wireless interface systems and/or components for wireless communication, such as one or more radios, transmitters, receivers, transceivers, chipsets, amplifiers, filters, control logic, network interface cards (NICs), antennas, antenna arrays, and so forth.
  • Examples of an antenna may include, without limitation, an internal antenna, an omni-directional antenna, a monopole antenna, a dipole antenna, an end fed antenna, a circularly polarized antenna, a micro-strip antenna, a diversity antenna, a dual antenna, an antenna array, and so forth.
  • certain devices may include antenna arrays of multiple antennas to implement various adaptive antenna techniques and spatial diversity techniques.
  • the mobile system 100 may comprise or be implemented as a mobile broadband communications system.
  • mobile broadband communications systems include, without limitation, systems compliant with various Institute of Electrical and Electronics Engineers (IEEE) standards, such as the IEEE 802.11 standards for Wireless Local Area Networks (WLANs) and variants, the IEEE 802.16 standards for Wireless Metropolitan Area Networks (WMANs) and variants, and the IEEE 802.20 or Mobile Broadband Wireless Access (MBWA) standards and variants, among others.
  • the mobile system 100 may be implemented in accordance with the Worldwide Interoperability for Microwave Access (WiMAX) or WiMAX II standard.
  • WiMAX is a wireless broadband technology based on the IEEE 802.16 standard of which IEEE 802.16-2004 and the 802.16e amendment (802.16e Cor2/D3-2005) are Physical (PHY) layer specifications.
  • WiMAX II is an advanced Fourth Generation (4G) system based on the IEEE 802.16m and IEEE 802.16j proposed standards for International Mobile Telecommunications (IMT) Advanced 4G series of standards.
  • the mobile system 100 may comprise an external display device 120 .
  • the external display device 120 may include, for example a wireless interface system 155 to allow the external display device 120 to communicate in the mobile system 100 .
  • an external display device 120 may include wireless capabilities.
  • the external display device 120 may include a display 165 .
  • the display 165 may include, but is not limited to, a plasma, a liquid crystal display (LCD), an organic light emitting diode (OLED) display and/or a red/green/blue (RGB) display, among others.
  • the display 165 on the external display device 120 may present multimedia content.
  • the multimedia content may include, but is not limited to, text, audio, video, symbols, images and/or animation.
  • the mobile system 100 may comprise a mobile device 110 .
  • the mobile device 110 may include an integrated touch sensitive display 115 , a processor 130 , a memory unit 140 , and a wireless interface system 150 .
  • the embodiments, however, are not limited to the elements shown in FIG. 1 .
  • An integrated touch sensitive display 115 on the mobile device 110 may comprise any suitable display unit for displaying information on a device.
  • the integrated touch sensitive display 115 may include an organic light emitting diode (OLED) display, a liquid crystal display (LCD), or other glass and/or plastic materials.
  • the integrated touch sensitive display 115 may provide high brightness and/or contrast and a wide aspect ratio.
  • the integrated touch sensitive display 115 may display text, symbols and/or images.
  • the integrated touch sensitive display 115 may include a monochromatic display.
  • the integrated touch sensitive display 115 may include a red/green/blue (RGB) display.
  • the integrated touch sensitive display 115 may be suitable to present one or more graphical user interface (GUI) views generated by a GUI component of an application program or system program (e.g., an operating system).
  • the integrated touch sensitive display 115 may be implemented as a touch screen, touch panel, touch screen panel, and so forth.
  • Touch screens may comprise display overlays which are implemented using one of several different techniques, such as pressure-sensitive (resistive) techniques, electrically-sensitive (capacitive) techniques, acoustically-sensitive (surface acoustic wave) techniques, photo-sensitive (infra-red) techniques, and so forth.
  • integrated touch sensitive display 115 may be implemented by a liquid crystal display (LCD), plasma, projection screen or other type of suitable visual interface.
  • the integrated touch sensitive display 115 may determine when a user is touching the integrated touch sensitive display 115 .
  • the integrated touch sensitive display 115 may receive information via a physical touch, such as, but not limited to, a touch from a virtual pen or a user's finger. The embodiments are not limited in this context.
  • the mobile device 110 may comprise a processor 130 .
  • the processor 130 may be implemented as any processor, such as a complex instruction set computer (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor implementing a combination of instruction sets, or other processor device.
  • the processor 130 may be implemented as a general purpose processor, such as a processor made by Intel® Corporation, Santa Clara, Calif.
  • the processor 130 may be implemented as a dedicated processor, such as a controller, microcontroller, embedded processor, a digital signal processor (DSP), a network processor, a media processor, an input/output (I/O) processor, and so forth.
  • the mobile device 110 may comprise a memory unit 140 .
  • the memory unit 140 may comprise any machine-readable or computer-readable media capable of storing data, including both volatile and non-volatile memory.
  • the memory unit 140 may include read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information.
  • the memory unit 140 may be included on the same integrated circuit as the processor 130 , or alternatively some portion or all of the memory unit 140 may be disposed on an integrated circuit or other medium, for example a hard disk drive, that is external to the integrated circuit of the processor 130 .
  • the memory unit 140 may include data and instructions to operate the processor. The embodiments are not limited in this context.
  • the mobile device 110 and the external display device 120 may communicate information over shared media 160 via respective wireless interface systems 150 , 155 .
  • the shared media 160 may comprise one or more allocations of RF spectrum.
  • the allocations of RF spectrum may be contiguous or non-contiguous.
  • the wireless interface systems 150 , 155 may communicate information over the shared media 160 using various multicarrier techniques utilized by, for example, WiFi, WiDi, general packet radio service (GPRS), long term evolution (LTE) technologies, WiMAX and/or WiMAX II systems.
  • the wireless interface systems 150 , 155 may communicate information using one or more communications channels.
  • a communication channel may be a defined set of frequencies, time slots, codes, or combinations thereof.
  • the mobile device 110 may be coupled to the external display device 120 over the shared media 160 .
  • One or more logical or physical channels may be established to communicate information.
  • the information may include media information and control information.
  • Media information may refer to any data representing content meant for a user. Examples of content may include, for example, data from a voice conversation, videoconference, streaming video, electronic mail (“e-mail”) message, voice mail message (“voice message”), alphanumeric symbols, graphics, image, video, text and so forth.
  • Data from a voice conversation may be, for example, speech information, silence periods, background noise, comfort noise, tones and so forth.
  • Control information may refer to any data representing commands, instructions or control words meant for an automated system. For example, control information may be used to route media information through a system, or instruct a node to process the media information in a predetermined manner.
  • FIG. 2 illustrates a system within the mobile device 200 .
  • a mobile device 200 may include a logic device 205 .
  • the logic device 205 may comprise various hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, components, processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
  • Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
  • the logic device 205 may be operative to detect that an external display is coupled to the mobile device. In an embodiment, the logic device 205 may be operative to detect that a physical external input device is coupled to the mobile device. In an embodiment, the external display and/or the physical external input device may be coupled via wires or wirelessly.
  • the logic device 205 may include a display manager 210 .
  • the display manager may be operative on the logic device.
  • the display manager 210 may manage content presented on multiple displays.
  • the display manager 210 may manage content on the integrated touch sensitive display for the mobile device.
  • the display manager 210 may manage content on the external display for the mobile device.
  • the display manager 210 may present multimedia content on an external display.
  • multimedia content may include, but is not limited to, text, audio, video, symbols, images and/or animation.
  • the display manager may use a graphical user interface (GUI) component 215 to present a digital input device on an integrated touch sensitive display for the mobile device 200 .
  • the GUI component 215 may generate one or more GUI views on the integrated touch sensitive display.
  • the GUI component 215 may provide one or more images on the integrated touch sensitive display of the mobile device with which a user can interact.
  • FIGS. 3A-C illustrate block diagrams of possible embodiments of a mobile system 300 .
  • FIG. 3A illustrates a block diagram of an embodiment of a mobile system 300 .
  • the mobile system 300 may include a mobile device 310 and an external display 320 for the mobile device 310 .
  • the external display 320 may be coupled to the mobile device 310 via a connection 360 .
  • the connection 360 may be over the shared media 160 , as shown in FIG. 1 .
  • the connection 360 may be a wireless connection.
  • the mobile device 310 may connect 360 to the external display 320 using a Wireless Display (WiDi) channel. Examples of an external display 320 may include, but are not limited to, a television screen and/or a computer monitor.
  • a mobile device 310 may include, but is not limited to, a laptop, a notebook, a handheld computer, a handheld enclosure, a portable electronic device, a mobile internet device (MID), a tablet, a computing device with a touch screen, a slate and/or a personal digital assistant. The embodiments, however, are not limited to these examples.
  • the mobile device 310 may include an integrated touch sensitive display 312 .
  • the integrated touch sensitive display 312 may be implemented as, but is not limited to, a touch screen, touch panel and/or touch screen panel.
  • the external display 320 may replicate the multimedia content currently displayed on the integrated touch sensitive display 312 of the mobile device 310 .
  • the multimedia content may include, but is not limited to, text, images, audio and video.
  • the combination of the external display 320 and the integrated touch sensitive display 312 of the mobile device 310 can mimic a desktop computer being docked with various input/output devices.
  • the external display 320 may function as the primary display while the integrated touch sensitive display 312 of the mobile device 310 may function as an input device.
  • An input device is a device which provides information to a processing device.
  • Input devices may include, but are not limited to, a keyboard, a pointing device, a touch pad, a composite device, an imaging device and/or a video device.
  • Further examples of input devices may include, but are not limited to, a mouse, a track pad, a digital camera, digital recorder, a webcam, a microphone, a scanner, a barcode reader, a pointing stick, a virtual pen, a joystick, a controller, a game controller, a remote, a computer keyboard, a trackball and/or any other type of physical input device.
  • the embodiments are not limited to these examples.
  • the integrated touch sensitive display 312 of a mobile device 310 may display an input device.
  • the input device presented on the integrated touch sensitive display 312 may be referred to as a digital input device 370 .
  • the digital input device 370 may comprise hardware and/or software to replicate the functions of an external physical input device.
  • the digital input device 370 may mimic the functionality of the input device presented on the integrated touch sensitive display 312 .
  • a digital input device 370 presented on the integrated touch sensitive display 312 of the mobile device 310 may provide information to the external display 320 . Using the digital input device 370 may reduce and/or eliminate the need to couple the mobile device 310 to external physical input devices.
  • an integrated touch sensitive display 312 of the mobile device 310 may display a keyboard as a digital input device 370 .
  • a keyboard may be presented on the integrated touch sensitive display to operate the mobile device and the external display may present multimedia content in response to operation of the mobile device through the keyboard.
  • when a user touches a letter key on the keyboard presented on the integrated touch sensitive display 312 , the letter may appear on the external display 320 as if the user pressed that key on an external keyboard device.
  • an integrated touch sensitive display 312 of the mobile device 310 may display a trackball as a digital input device 370 .
  • a trackball may be presented on the integrated touch sensitive display to operate the mobile device and the external display may present multimedia content in response to operation of the mobile device through the trackball.
  • as a user manipulates the trackball presented on the integrated touch sensitive display 312 , a corresponding pointer may appear and move on the external display 320 .
  • an integrated touch sensitive display 312 of the mobile device 310 may display a video remote with one or more of a stop, start, fast forward, rewind and/or play button.
  • a video remote may be presented on the integrated touch sensitive display to operate the mobile device and the external display may present multimedia content in response to operation of the mobile device through the video remote.
  • the multimedia content on the external display 320 may stop, start, fast forward, rewind and/or play based on the user's touch of the video remote.
  • the mobile device 310 may function as the input device presented on the integrated touch sensitive display.
  • the input device presented on the integrated touch sensitive display 312 of a mobile device 310 may be a keyboard.
  • the keyboard presented on the integrated touch sensitive display of the mobile device may have the same functionality of an actual keyboard. For example, a user may touch the caps lock key and a letter key presented on the integrated touch sensitive display in order to capitalize a letter. Alternatively or in addition, a user may press multiple keys simultaneously to replicate a keyboard function, such as holding the shift key to capitalize a letter.
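
The keyboard behavior just described (capitalizing a letter via caps lock or a held shift key) can be modeled as a short state machine. The sketch below is illustrative only; the class and key names are hypothetical and assume touch events have already been decoded into key identifiers.

```python
# Hypothetical sketch of a software keyboard acting as the digital input
# device: touches on displayed keys are translated into characters, with
# caps lock and shift handled as the text above describes.

class SoftKeyboard:
    def __init__(self):
        self.caps_lock = False
        self.shift_held = False

    def touch(self, key):
        """Return the character to send toward the external display, or None
        for modifier keys that only change state."""
        if key == "CAPS_LOCK":
            self.caps_lock = not self.caps_lock
            return None
        if key == "SHIFT_DOWN":
            self.shift_held = True
            return None
        if key == "SHIFT_UP":
            self.shift_held = False
            return None
        upper = self.caps_lock ^ self.shift_held   # either modifier capitalizes
        return key.upper() if upper else key.lower()


kb = SoftKeyboard()
kb.touch("SHIFT_DOWN")
print(kb.touch("a"))      # 'A' while shift is held
kb.touch("SHIFT_UP")
print(kb.touch("a"))      # 'a'
```
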
  • the mobile device 310 may mimic the functionality of an external physical input device via one or more visual images, sounds and/or vibrations.
  • the mobile device 310 may provide sensory output such as, but not limited to, output for a user's sense of sight, hearing and touch.
  • the sensory output from the integrated touch sensitive display 312 of the mobile device 310 may assist the user in using the digital input device 370 .
  • the integrated touch sensitive display 312 of the mobile device 310 may mimic the functionality of an external physical input device via a visual depiction.
  • the integrated touch sensitive display 312 of the mobile device 310 may display a keyboard as a digital input device 370 .
  • the integrated touch sensitive display 312 may mimic the keyboard via the visual depiction of the various buttons of a full sized keyboard device.
  • a user may touch a key on the integrated touch sensitive display 312 to mimic the pressing of one or more keys.
  • the integrated touch sensitive display 312 may mimic a keyboard by displaying a depressed key when the presented key is touched by a user.
  • the integrated touch sensitive display 312 of the mobile device 310 may display a mouse with a scroll wheel.
  • the integrated touch sensitive display 312 of the mobile device 310 may show a visual image of where the mouse previously moved on the integrated touch sensitive display 312 .
  • the integrated touch sensitive display 312 may depict a moving scroll wheel when a user touches the displayed scroll wheel. The examples are not limited to the embodiments described.
  • the mobile device 310 may mimic the functionality of an external physical input device via one or more sounds. For example, if the digital input device 370 presented is a keyboard, the mobile device 310 may provide a clicking sound when a user touches a key presented on the integrated touch sensitive display 312 . If the digital input device 370 presented is a mouse, the mobile device 310 may provide a clicking sound when a user touches the left and/or right buttons depicted on the displayed mouse. Alternatively or in addition, the mobile device 310 may provide a noise when a user moves a mouse around the integrated touch sensitive display 312 . If the digital input device 370 presented is a barcode reader, the mobile device 310 may provide a beeping sound when a barcode is read. The examples are not limited to the embodiments described.
  • the integrated touch sensitive display 312 of the mobile device 310 may provide a vibration or sensation to a user when a user touches a part of the presented digital input device 370 .
  • the mobile device 310 may mimic the functionality of the input device via a touch or tactile sensation.
  • the mobile device may contain a motor and/or other tactile technology in order to provide the sensation to a user. For example, if the digital input device 370 presented is a track pad, the mobile device 310 may provide the user with a sensation of the ball of a track pad rolling across the integrated touch sensitive display 312 .
  • the mobile device 310 may provide a vibration when the left and/or right buttons presented on the integrated touch sensitive display 312 are touched by a user. Alternatively or in addition, the mobile device 310 may provide a different vibration when a scrolling wheel of the mouse is touched by a user. If the input device presented is a keyboard, the mobile device 310 may provide a vibration when a key on the integrated touch sensitive display 312 is touched.
  • the examples are not limited to the embodiments described.
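
The sensory feedback described above amounts to a mapping from (digital input device, touch event) to a sound and/or vibration. A minimal illustrative table follows; the device names, event names, and feedback values are hypothetical examples, not taken from the patent.

```python
# Hypothetical sketch: mapping touches on a presented digital input device to
# the kinds of sounds and vibrations described above.

FEEDBACK = {
    ("keyboard", "key_touch"):   {"sound": "click", "vibration": "short"},
    ("mouse", "button_touch"):   {"sound": "click", "vibration": "short"},
    ("mouse", "wheel_touch"):    {"sound": None,    "vibration": "ripple"},
    ("barcode_reader", "scan"):  {"sound": "beep",  "vibration": None},
}

def feedback_for(device, event):
    """Return the sensory output the mobile device may produce for a touch."""
    return FEEDBACK.get((device, event), {"sound": None, "vibration": None})

print(feedback_for("keyboard", "key_touch"))
print(feedback_for("mouse", "wheel_touch"))
```
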
  • using the integrated touch sensitive display 312 to replicate functions of an external physical input device may eliminate the need to connect one or more physical input devices to the mobile device 310 .
  • the integrated touch sensitive display 312 for a mobile device 310 may be utilized as a custom input device for the mobile device 310 and may be a productive resource of a mobile device 310 .
  • FIG. 3B illustrates a block diagram of an embodiment of a mobile system 300 .
  • a mobile system 300 may include multiple external displays 320 A, 320 B, 320 C coupled to a mobile device 310 .
  • the external display 320 A, 320 B, 320 C may be coupled to the mobile device 310 via a wired or wireless connection.
  • one or more of the multiple external displays 320 A, 320 B, 320 C may be coupled to the mobile device via a Wireless Display (WiDi) channel.
  • one or more of external displays may present multimedia content.
  • one or more of the external displays 320 A, 320 B, 320 C may present the same multimedia content simultaneously.
  • various external displays 320 A, 320 B, 320 C may present the same multimedia content simultaneously so that users in a large ballroom may view the content, such as a presentation, from wherever they are seated.
  • one or more of the external displays 320 A, 320 B, 320 C may present at least a portion of the multimedia content.
  • multiple displays may be used together to form one large display presenting the multimedia content. The examples are not limited to the embodiments described.
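
The two multi-display arrangements described above (every external display mirroring the same content, or several displays combined into one large display) can be sketched as follows; the function names and the horizontal-tiling scheme are assumptions for illustration, not part of the patent.

```python
# Hypothetical sketch of the multi-display arrangements of FIG. 3B.

def mirror_to_all(content, displays):
    """Same multimedia content presented simultaneously on every display."""
    return {d: content for d in displays}

def tile_across(content_width, displays):
    """Split one wide logical display into equal horizontal tiles, one per display."""
    tile = content_width // len(displays)
    return {d: (i * tile, (i + 1) * tile) for i, d in enumerate(displays)}

displays = ["320A", "320B", "320C"]
print(mirror_to_all("presentation slide 7", displays))
print(tile_across(5760, displays))   # e.g. three 1920-pixel-wide tiles
```
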
  • FIG. 3C illustrates a block diagram of an embodiment of a mobile system 300 .
  • an external input 380 may be coupled to a mobile device 310 .
  • the external input 380 may be a physical input device.
  • the external input 380 may be coupled to the mobile device via a wired or wireless connection.
  • the external input 380 may be wirelessly coupled to the mobile device using the connection that coupled the mobile device 310 to the external display 320 .
  • the integrated touch sensitive display 312 on the mobile device 310 may be used to augment and/or enhance functions of the external input 380 .
  • an external input 380 may be a physical pointing device such as a virtual pen.
  • the virtual pen 380 may be coupled to a mobile device 310 and may be used to select content presented on an external display 320 .
  • the external display 320 may not provide sufficient resolution and/or granularity to specifically pin-point content presented on the external display 320 .
  • the digital input device 370 on the integrated touch sensitive display 312 of the mobile device 310 may provide a more detailed version of an area around a pointer positioned on the external display 320 . This may allow a user to select a more precise location within the detailed version by tapping on the input device 370 on the integrated touch sensitive display 312 at the desired location.
  • the integrated touch sensitive display 312 on the mobile device 310 may be used as a supplemental device along with the external input 380 .
  • a computer workstation often has multiple external devices such as a keyboard and a mouse.
  • the mobile system 300 may have a physical keyboard as an external input 380 and the input device 370 on the integrated touch sensitive display 312 of the mobile device 310 may be a mouse.
  • a computer workstation may be easily simulated without the need for multiple external devices. The examples are not limited to the embodiments described.
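
The magnifier scenario of FIG. 3C reduces to a coordinate mapping: show a zoomed region of the external display around the pointer on the integrated touch sensitive display, then map a tap inside that region back to a precise external-display coordinate. The sketch below makes that arithmetic explicit; the resolutions, zoom factor, and function names are assumptions for illustration.

```python
# Hypothetical sketch of the "detailed version of an area around a pointer"
# described above, mapping taps on the zoomed view back to external-display
# coordinates.

def zoom_region(pointer_xy, zoom, touch_resolution):
    """Region of the external display shown, magnified, on the integrated display."""
    px, py = pointer_xy
    tw, th = touch_resolution
    w, h = tw / zoom, th / zoom            # external-display area covered by the view
    return (px - w / 2, py - h / 2, w, h)

def tap_to_external(tap_xy, region, touch_resolution):
    """Map a tap on the integrated display back to external-display coordinates."""
    rx, ry, w, h = region
    tx, ty = tap_xy
    tw, th = touch_resolution
    return (rx + tx / tw * w, ry + ty / th * h)

touch_res = (800, 480)                      # integrated display resolution (example)
region = zoom_region(pointer_xy=(960, 540), zoom=8, touch_resolution=touch_res)
print(tap_to_external((400, 240), region, touch_res))   # centre tap -> (960.0, 540.0)
```
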
  • FIG. 4 illustrates exemplary digital input devices 425 A, 425 B, 425 C presented on an integrated touch sensitive display 410 of a mobile device.
  • the integrated touch sensitive display 410 on the mobile device 405 may alternate between various digital input devices 425 A, 425 B, 425 C and the multimedia content 415 while coupled to an external display of the mobile device 405 .
  • the mobile device 405 may present a keyboard 425 A as the digital input device 425 on the integrated touch sensitive display 410 .
  • the integrated touch sensitive display 410 of the mobile device 405 may automatically present a keyboard 425 A as the digital input device 425 when the mobile device 405 is coupled to an external display.
  • the integrated touch sensitive display 410 of the mobile device 405 may wait to receive information about the digital input device 425 .
  • the information may cause the integrated touch sensitive display 410 on the mobile device 405 to display a digital input device 425 .
  • the information may cause the mobile device to function as the digital input device 425 .
  • a user may select one or more digital input devices 425 A, 425 B, 425 C to be presented on the integrated touch sensitive display 410 from a set, list and/or group of input devices.
  • the mobile device 405 may receive information that a user selected a keyboard 425 A to be presented as the digital input device 425 .
  • the mobile device 405 may present a keyboard 425 A as the digital input device 425 on the integrated touch sensitive display 410 and the mobile device 405 may function as a keyboard.
  • a user may want to change the digital input device 425 presented on the integrated touch sensitive display 410 of the mobile device 405 .
  • the integrated touch sensitive display 410 on the mobile device 405 may switch from displaying a first digital input device 425 to displaying a second digital input device 425 .
  • a user may want to change from the keyboard 425 A as the digital input device 425 to a track pad 425 B as the digital input device 425 .
  • the digital input device 425 displayed on the integrated touch sensitive display 410 may switch because an external device has been added to the mobile system.
  • a user may want to change to include multiple digital input devices 425 on the integrated touch sensitive display 410 of the mobile device 405 .
  • a user may want to change to an integrated touch sensitive display 410 including a keyboard with a track pad 425 C as the digital input device 425 .
  • an integrated touch sensitive display 410 on the mobile device 405 may switch back to being the primary display and may present at least a portion of the multimedia content 415 which is presented on the external display.
  • when the integrated touch sensitive display 410 on the mobile device 405 switches to display the multimedia content 415 , the integrated touch sensitive display 410 and the external display may show the same content.
  • an integrated touch sensitive display 410 on the mobile device 405 may present at least a portion of the multimedia content 415 which is presented on the external display.
  • the digital input device 425 may present a more detailed version of a portion of the multimedia content 415 displayed on the external display screen.
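
The alternation FIG. 4 describes, between different digital input devices and the multimedia content itself, can be modeled as a simple mode switch on the integrated display. The mode names below are hypothetical stand-ins for the keyboard 425A, track pad 425B, keyboard-with-track-pad 425C, and content views.

```python
# Hypothetical sketch of switching what the integrated touch sensitive
# display presents, as described for FIG. 4.

MODES = ("keyboard", "track_pad", "keyboard_with_track_pad", "multimedia_content")

class IntegratedDisplay:
    def __init__(self, default="keyboard"):
        self.mode = default                 # presented automatically on coupling

    def select(self, mode):
        """Switch what is presented, e.g. in response to a user selection."""
        if mode not in MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode
        return self.mode


d = IntegratedDisplay()
print(d.mode)                           # 'keyboard' (cf. 425A)
print(d.select("track_pad"))            # switch (cf. 425B)
print(d.select("multimedia_content"))   # back to acting as a display
```
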
  • FIG. 5 illustrates an embodiment of a first exemplary logic flow.
  • Logic flow 500 may be representative of the operations executed by one or more embodiments described herein. It should be noted that the methods described herein do not have to be executed in the order described, or in any particular order. Moreover, various activities described with respect to the methods identified herein can be executed in serial or parallel fashion.
  • an external display coupled to a mobile device may be detected 505 .
  • the mobile device may detect 505 that one or more external displays are coupled via a logic device 205 , shown in FIG. 2 .
  • the logic device may detect that the mobile device is coupled to an external display via a wired connection. Examples of a wired connection include, but are not limited to, cables and/or cords.
  • the logic device may detect that the mobile device is coupled to an external display via a wireless connection. Examples of a wireless connection include, but are not limited to, Wireless Fidelity (WiFi) and/or Wireless Display (WiDi). The embodiments, however, are not limited to these examples.
  • the display manager 210 in the mobile device may manage 507 the multiple display devices.
  • the mobile device may manage 507 an integrated touch sensitive display of the mobile device.
  • the mobile device may manage 507 one or more external displays.
  • a digital input device may be presented 510 on the integrated touch sensitive display of the mobile device to operate the mobile device.
  • the integrated touch sensitive display of the mobile device may be a touch screen.
  • a keyboard may be presented as the digital input device. The keyboard displayed on the integrated touch sensitive display may be used to operate the mobile device.
  • the embodiments, however, are not limited to this example.
  • multimedia content may be presented 515 to the external display of the mobile device in response to the operation of the mobile device through the digital input device.
  • the multimedia content may be generated by an application executed on, or running on, the mobile device.
  • multimedia content may include, but is not limited to, text, audio, video and images.
  • the multimedia content may be presented on the integrated touch sensitive display of the mobile device prior to the mobile device being coupled to the external display. After the mobile device and the external display are coupled, the external display may present 515 the multimedia content. By presenting the multimedia content, the external display may become the primary display.
  • the digital input device presented 510 on the integrated touch sensitive display of the mobile device may be different than the media content presented 515 on the external display.
  • the integrated touch sensitive display of the mobile device may present 510 a digital input device used to operate the mobile device.
  • the mobile device may detect 505 multiple external displays coupled to the mobile device.
  • one or more of the external displays may present 515 the multimedia content in response to operation of the mobile device through the digital input device.
  • the external display may be a monitor and multiple monitors may present 515 the multimedia content.
  • one or more external displays may present 515 the same multimedia content simultaneously. For example, in a large ballroom, users may view various screens presenting 515 the same content, such as a movie.
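
Logic flow 500 can be summarized as a short procedure: detect external displays (505), manage them (507), present a digital input device on the integrated touch sensitive display (510), and present multimedia content on the external display(s) in response to operation through that device (515). The sketch below is an illustrative reading of that flow, not an implementation from the patent.

```python
# Hypothetical sketch of logic flow 500 (FIG. 5).

def logic_flow_500(detected_displays, touches):
    if not detected_displays:                       # 505: no external display coupled
        return {"integrated": "multimedia content", "external": {}}
    integrated = "keyboard (digital input device)"  # 510: present digital input device
    typed = "".join(touches)                        # operation of the mobile device
    external = {d: f"multimedia content: {typed}"   # 515: present content in response
                for d in detected_displays}         # 507: manage every coupled display
    return {"integrated": integrated, "external": external}


print(logic_flow_500(["monitor-1", "monitor-2"], list("hello")))
```
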
  • FIG. 6 illustrates an embodiment of a second exemplary logic flow.
  • Logic flow 600 may be representative of the operations executed by one or more embodiments described herein. It should be noted that the methods described herein do not have to be executed in the order described, or in any particular order. Moreover, various activities described with respect to the methods identified herein can be executed in serial or parallel fashion.
  • the mobile device may receive 615 information about a digital input device. In an embodiment, the mobile device may determine whether it received 615 information about a digital input device. In an embodiment, a time threshold may be used. The time threshold may be 5 seconds, 30 seconds, 1 minute, 2 minutes and/or other time values. If no information is received during the threshold, then the mobile device may determine that no information was received 615 .
  • the integrated touch sensitive display may automatically present 620 a digital input device on the integrated touch sensitive display of the mobile device.
  • the integrated touch sensitive display of the mobile device may automatically present 620 a keyboard as the digital input device.
  • the integrated touch sensitive display of the mobile device may automatically present 620 a mouse as the digital input device.
  • the digital input device that is automatically presented 620 may be previously determined.
  • the digital input device that is automatically presented 620 may be randomly chosen from the one or more digital input devices that can be presented on the integrated touch sensitive display.
  • a user may select a digital input device to present 625 on the integrated touch sensitive display of the mobile device.
  • the mobile device may receive 615 information about a digital input device and may present 625 the digital input device on the integrated touch sensitive display of the mobile device based on the received information.
  • a user may provide the information received 615 by the mobile device about a digital input device by selecting a digital input device. For example, a user may touch a certain place on the integrated touch sensitive display which may cause the integrated touch sensitive display to display various digital input devices.
  • the integrated touch sensitive display may present a set, list and/or group of digital input devices.
  • in addition to the set, list and/or group of one or more digital input devices, the integrated touch sensitive display may also list an option of displaying all or a portion of the multimedia content.
  • a user may provide the information received 615 by the mobile device by simply touching the integrated touch sensitive display.
  • the user may touch the integrated touch sensitive display of the mobile device with his/her finger to make a selection from a group, list and/or set.
  • the user may use a stylus or virtual pen to make a selection. The examples are not limited to the embodiments described.
  • the mobile device may receive 615 information about a digital input device by receiving information that an external physical device was added to the mobile system.
  • the integrated touch sensitive display on the mobile system may present a digital input device which provides a more detailed version of a part of the external display screen.
  • the integrated touch sensitive display on the mobile system may present a different digital input device.
  • the digital input device may complement the external physical device. For example, when a keyboard is coupled to a mobile device, the integrated touch sensitive display may present a mouse.
  • a user may switch the digital input device presented on the integrated touch sensitive display of the mobile device.
  • a user may hot swap the digital input device presented on the integrated touch sensitive display of the mobile device.
  • the mobile device may receive 615 information about a digital input device after a digital input device is automatically presented 620 on the integrated touch sensitive display of the mobile device. For example, a full size keyboard may be presented on the display. Information may be received 615 and a track pad may be hot swapped so that the track pad is displayed on the integrated touch sensitive display. Additionally and/or alternatively, a mobile device may receive 615 information about a first digital input device and then receive 615 information about a second digital input device.
  • the mobile device may change the integrated touch sensitive display to present 625 the input device based on the received information.
  • the integrated touch sensitive display of the mobile device may automatically present 620 a keyboard when the mobile device is coupled to the external display.
  • a user may want the integrated touch sensitive display to present 625 a track pad as the digital input device.
  • the mobile device may receive information 615 about a digital input device.
  • the mobile device may receive 615 information that a user selected a track pad as the digital input device.
  • the mobile device may present 625 the track pad on the integrated touch sensitive display.
  • the mobile device may receive 615 information that the integrated touch sensitive display should present all or a portion of the multimedia content.
  • the mobile device may present all or a portion of the multimedia content on the integrated touch sensitive display.
  • an enlarged or more detailed portion of the multimedia content may be presented on the integrated touch sensitive display as the digital input device.
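
Logic flow 600 hinges on a timeout: if no information about a digital input device arrives within the time threshold, a default device is presented automatically (620); otherwise the selected device is presented (625) and may later be hot swapped. A minimal sketch follows, assuming a queue carries the user's selection; the names and threshold value are illustrative.

```python
# Hypothetical sketch of logic flow 600 (FIG. 6).

import queue
import threading

def logic_flow_600(selection_queue, threshold_s=5, default="keyboard"):
    try:
        selected = selection_queue.get(timeout=threshold_s)   # 615: wait for information
    except queue.Empty:
        return f"automatically presented: {default}"          # 620: no information received
    return f"presented on user selection: {selected}"         # 625: present selected device


selections = queue.Queue()
# Simulate a user picking a track pad one second after the displays are coupled.
threading.Timer(1.0, lambda: selections.put("track pad")).start()
print(logic_flow_600(selections, threshold_s=5))
```
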
  • FIG. 7 illustrates an embodiment of an exemplary computing architecture 700 suitable for implementing various embodiments as previously described.
  • The terms “system” and “component” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution, examples of which are provided by the exemplary computing architecture 700 .
  • a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a server and the server can be a component.
  • One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers. Further, components may be communicatively coupled to each other by various types of communications media to coordinate operations. The coordination may involve the uni-directional or bi-directional exchange of information. For instance, the components may communicate information in the form of signals communicated over the communications media. The information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
  • the computing architecture 700 may comprise or be implemented as part of an electronic device.
  • an electronic device may include without limitation a mobile device, a personal digital assistant, a mobile computing device, a smart phone, a cellular telephone, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a tablet computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, television, digital television, set top box, wireless access point, base station, subscriber station, mobile subscriber center, radio network controller, router, hub, gateway, bridge, switch, machine, or combination thereof.
  • the embodiments are not limited in this context.
  • the computing architecture 700 includes various common computing elements, such as one or more processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, and so forth.
  • the computing architecture 700 comprises a processing unit 704 , a system memory 706 and a system bus 708 .
  • the processing unit 704 can be any of various commercially available processors. Dual microprocessors and other multi-processor architectures may also be employed as the processing unit 704 .
  • the system bus 708 provides an interface for system components including, but not limited to, the system memory 706 to the processing unit 704 .
  • the system bus 708 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • the computing architecture 700 may comprise or implement various articles of manufacture.
  • An article of manufacture may comprise a computer-readable storage medium to store logic.
  • Examples of a computer-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
  • Examples of logic may include executable computer program instructions implemented using any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like.
  • the system memory 706 may include various types of computer-readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information.
  • the system memory 706 can include non-volatile memory 710 and/or volatile memory 712 .
  • a basic input/output system (BIOS) can be stored in the non-volatile memory 710 .
  • the computer 702 may include various types of computer-readable storage media in the form of one or more lower speed memory units, including an internal hard disk drive (HDD) 714 , a magnetic floppy disk drive (FDD) 716 to read from or write to a removable magnetic disk 718 , and an optical disk drive 720 to read from or write to a removable optical disk 722 (e.g., a CD-ROM or DVD).
  • the HDD 714 , FDD 716 and optical disk drive 720 can be coupled to the system bus 708 by a HDD interface 724 , an FDD interface 726 and an optical drive interface 728 , respectively.
  • the HDD interface 724 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
  • USB Universal Serial Bus
  • the drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth.
  • a number of program modules can be stored in the drives and memory units 710 , 712 , including an operating system 730 , one or more application programs 732 , other program modules 734 , and program data 736 .
  • the one or more application programs 732 , other program modules 734 , and program data 736 can include, for example, the decoder.

Abstract

A method and system may detect an external display coupled to a mobile device. A digital input device may be presented on an integrated touch sensitive display of the mobile device to operate the mobile device. Multimedia content may be presented on the external display in response to operation of the mobile device through the digital input device.

Description

    BACKGROUND
  • Mobile devices are becoming increasingly smaller over time in order to facilitate transport, storage and convenience for a user. Along with the smaller form factors, however, mobile devices typically implement integrated displays with a limited amount of display area. This makes it difficult for a user to view content presented on the smaller integrated displays. To solve this problem, a user may couple a mobile device to an external display, such as a liquid crystal display (LCD) monitor, via a wired or wireless connection. The external display may have a larger amount of display area relative to the integrated display of the mobile device. In this configuration, content presented on the smaller integrated display of a mobile device may be replicated to the larger external display to enhance viewing by a user.
  • Despite the larger image presented by the larger external display, the smaller integrated display of the mobile device typically continues to present the same view as that shown on the larger external display. This seems redundant given that one purpose of the larger external display is to become a primary viewing device for the mobile device. In some cases, the smaller integrated display may be turned off to conserve power to the mobile device. In both cases, however, the smaller integrated display remains an underutilized resource of the mobile device. It is with respect to these and other limitations that the present improvements are needed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an embodiment of an exemplary mobile system.
  • FIG. 2 illustrates an embodiment of a system within a mobile device.
  • FIGS. 3A-C illustrate embodiments of exemplary mobile systems.
  • FIG. 4 illustrates exemplary input devices presented on an integrated touch sensitive display of a mobile device.
  • FIG. 5 illustrates an embodiment of a first exemplary logic flow.
  • FIG. 6 illustrates an embodiment of a second exemplary logic flow.
  • FIG. 7 illustrates an embodiment of an exemplary computing architecture.
  • DETAILED DESCRIPTION
  • Embodiments are generally directed to techniques for managing multiple display devices coupled to a mobile device. Some embodiments are particularly directed to techniques for presenting different content on each display device coupled to a mobile device. In one embodiment, for example, a mobile device may detect an external display device. The mobile device may be coupled to the external display device via wired or wireless connections. The mobile device may comprise an integrated touch sensitive display. Furthermore, the mobile device may implement a display manager arranged to present different content on each of the display devices, rather than identical content as presented by conventional solutions.
  • In an embodiment, for example, the mobile device may detect an external display device. In an embodiment, a display manager may present a digital input device on an integrated touch sensitive display of a mobile device, while simultaneously presenting multimedia content from an application on an external display coupled to the mobile device. In an embodiment, the digital input device presented on an integrated touch sensitive display of a mobile device may operate the mobile device. A digital input device may comprise, for example, a software version of a keyboard, a pointing device, a touch pad, or any other type of physical input device. In this arrangement, the integrated touch sensitive display may be used to replicate functions of a physical input device, without necessarily needing to connect any physical input devices to the mobile device. In this manner, an integrated touch sensitive display for a mobile device may be utilized as a custom input device for the mobile device, rather than simply presenting identical content as that presented on an external display device. As such, an integrated touch sensitive display is converted from a wasted resource to a productive resource of a mobile device.
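  • The arrangement described above can be pictured with a brief sketch. The following Python fragment is a non-limiting illustration only; the class, method, and display names are assumptions introduced here, not part of the disclosure. It models a display manager that, once an external display is detected, presents a digital input device on the integrated touch sensitive display and routes content produced by operating the mobile device to the external display.
```python
class Display:
    """Minimal stand-in for a display that can render content."""
    def __init__(self, name):
        self.name = name

    def show(self, content):
        print(f"[{self.name}] {content}")


class DisplayManagerSketch:
    """Hypothetical model of the display manager behavior."""
    def __init__(self, integrated_display):
        self.integrated_display = integrated_display   # integrated touch sensitive display
        self.external_displays = []                    # external displays (wired or WiDi)
        self.digital_input_device = "keyboard"         # default digital input device

    def on_external_display_connected(self, display):
        # The external display becomes the primary display for multimedia content,
        # and the integrated display switches to presenting a digital input device.
        self.external_displays.append(display)
        self.integrated_display.show(f"digital input device: {self.digital_input_device}")

    def on_touch(self, key, application):
        # A touch on the digital input device operates the mobile device; the resulting
        # multimedia content is presented on the external display(s).
        content = application(key)
        for display in self.external_displays:
            display.show(content)


# Usage sketch: typing "h" on the on-screen keyboard is echoed on the external display.
manager = DisplayManagerSketch(Display("integrated"))
manager.on_external_display_connected(Display("external"))
manager.on_touch("h", application=lambda key: f"text entered: {key}")
```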
  • In an embodiment, multimedia may be presented on the external display. In an embodiment, the multimedia may be presented in response to the operation of the mobile device through the digital input device.
  • In some cases, a physical input device may be coupled to a mobile device. In such cases, the integrated touch sensitive display may be used to augment functions of the physical input device. For instance, a physical pointing device such as a mouse may be coupled to a mobile device, and used to select content presented on an external display. However, the external display may not provide sufficient resolution or granularity to specifically pin-point content presented on the external display. In this case, the integrated touch sensitive display may provide a more detailed version of an area around a pointer positioned on the external display, thereby allowing a user to select a more precise location within the detailed version by tapping on the integrated touch sensitive display at the desired location. Other use scenarios exist as well.
  • Various embodiments may comprise one or more elements. An element may comprise any structure arranged to perform certain operations. Each element may be implemented as hardware, software, or any combination thereof, as desired for a given set of design parameters or performance constraints. Although an embodiment may be described with a limited number of elements in a certain topology by way of example, the embodiment may include more or less elements in alternate topologies as desired for a given implementation. It is worthy to note that any reference to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • FIG. 1 illustrates a block diagram of one embodiment of a mobile system 100. In various embodiments, the mobile system 100 may comprise multiple nodes. A node generally may comprise any physical or logical entity for communicating information in the mobile system 100 and may be implemented as hardware, software, or any combination thereof, as desired for a given set of design parameters or performance constraints. An example of a node may comprise an electronic device, such as a mobile device or an external display. Another example of a node may comprise a part or component of an electronic device, such as a display adapter or wireless transceiver. Although FIG. 1 may show a limited number of nodes by way of example, it can be appreciated that more or less nodes may be employed for a given implementation.
  • In various embodiments, the mobile system 100 may comprise, or form part of a wired communications system, a wireless communications system, or a combination of both. For example, the mobile system 100 may include one or more nodes arranged to process and/or communicate information over one or more types of wired communication links. Examples of a wired communication link, may include, without limitation, a wire, cable, bus, printed circuit board (PCB), Ethernet connection, peer-to-peer (P2P) connection, backplane, switch fabric, semiconductor material, twisted-pair wire, co-axial cable, fiber optic connection, and so forth. The mobile system 100 also may include one or more nodes arranged to communicate information over one or more types of wireless communication links, such as shared media 160. Examples of a wireless communication link may include, without limitation, a radio channel, infrared channel, radio-frequency (RF) channel, a High-Definition Multimedia Interface (HDMI) channel, a digital visual interface (DVI) channel, a video graphics array (VGA) channel, a Wireless Fidelity (WiFi) channel, a Wireless Display (WiDi) channel, a portion of the RF spectrum, and/or one or more licensed or license-free frequency bands. In the latter case, the wireless nodes may include one or more wireless interface systems and/or components for wireless communication, such as one or more radios, transmitters, receivers, transceivers, chipsets, amplifiers, filters, control logic, network interface cards (NICs), antennas, antenna arrays, and so forth. Examples of an antenna may include, without limitation, an internal antenna, an omni-directional antenna, a monopole antenna, a dipole antenna, an end fed antenna, a circularly polarized antenna, a micro-strip antenna, a diversity antenna, a dual antenna, an antenna array, and so forth. In an embodiment, certain devices may include antenna arrays of multiple antennas to implement various adaptive antenna techniques and spatial diversity techniques.
  • In various embodiments, the mobile system 100 may comprise or be implemented as a mobile broadband communications system. Examples of mobile broadband communications systems include, without limitation, systems compliant with various Institute of Electrical and Electronics Engineers (IEEE) standards, such as the IEEE 802.11 standards for Wireless Local Area Networks (WLANs) and variants, the IEEE 802.16 standards for Wireless Metropolitan Area Networks (WMANs) and variants, and the IEEE 802.20 or Mobile Broadband Wireless Access (MBWA) standards and variants, among others. In an embodiment, for example, the mobile system 100 may be implemented in accordance with the Worldwide Interoperability for Microwave Access (WiMAX) or WiMAX II standard. WiMAX is a wireless broadband technology based on the IEEE 802.16 standard of which IEEE 802.16-2004 and the 802.16e amendment (802.16e Cor2/D3-2005) are Physical (PHY) layer specifications. WiMAX II is an advanced Fourth Generation (4G) system based on the IEEE 802.16m and IEEE 802.16j proposed standards for International Mobile Telecommunications (IMT) Advanced 4G series of standards.
  • In various embodiments, the mobile system 100 may comprise an external display device 120. As illustrated in FIG. 1, the external display device 120 may include, for example, a wireless interface system 155 to allow the external display device 120 to communicate in the mobile system 100. In an embodiment, an external display device 120 may include wireless capabilities. The external display device 120 may include a display 165. The display 165 may include, but is not limited to, a plasma display, a liquid crystal display (LCD), an organic light emitting diode (OLED) display and/or a red/green/blue (RGB) display, among others. In an embodiment, the display 165 on the external display device 120 may present multimedia content. The multimedia content may include, but is not limited to, text, audio, video, symbols, images and/or animation.
  • In various embodiments, the mobile system 100 may comprise a mobile device 110. In various embodiments, the mobile device 110 may include an integrated touch sensitive display 115, a processor 130, a memory unit 140, and a wireless interface system 150. The embodiments, however, are not limited to the elements shown in FIG. 1.
  • An integrated touch sensitive display 115 on the mobile device 110 may comprise any suitable display unit for displaying information on a device. The integrated touch sensitive display 115 may include an organic light emitting diode (OLED) display, a liquid crystal display (LCD), or other glass and/or plastic materials. In an embodiment, the integrated touch sensitive display 115 may provide high brightness and/or contrast and a wide aspect ratio. In an embodiment, the integrated touch sensitive display 115 may display text, symbols and/or images. In an embodiment, the integrated touch sensitive display 115 may include a monochromatic display. In an embodiment, the integrated touch sensitive display 115 may include a red/green/blue (RGB) display.
  • In an embodiment, the integrated touch sensitive display 115 may be suitable to present one or more graphical user interface (GUI) views generated by a GUI component of an application program or system program (e.g., an operating system). In an embodiment, the integrated touch sensitive display 115 may be implemented as a touch screen, touch panel, touch screen panel, and so forth. Touch screens may comprise display overlays which are implemented using one of several different techniques, such as pressure-sensitive (resistive) techniques, electrically-sensitive (capacitive) techniques, acoustically-sensitive (surface acoustic wave) techniques, photo-sensitive (infra-red) techniques, and so forth. In an embodiment, for example, integrated touch sensitive display 115 may be implemented by a liquid crystal display (LCD), plasma, projection screen or other type of suitable visual interface.
  • In various embodiments, the integrated touch sensitive display 115 may be operative to determine when a user is touching the integrated touch sensitive display 115. In an embodiment, the integrated touch sensitive display 115 may receive information via a physical touch, such as, but not limited to, a touch from a virtual pen or a user's finger. The embodiments are not limited in this context.
  • As shown by the mobile device 110, the mobile device 110 may comprise a processor 130. The processor 130 may be implemented as any processor, such as a complex instruction set computer (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a processor implementing a combination of instruction sets, or other processor device. In one embodiment, for example, the processor 130 may be implemented as a general purpose processor, such as a processor made by Intel® Corporation, Santa Clara, Calif. The processor 130 may be implemented as a dedicated processor, such as a controller, microcontroller, embedded processor, a digital signal processor (DSP), a network processor, a media processor, an input/output (I/O) processor, and so forth. The embodiments are not limited in this context.
  • As further shown by the mobile device 110, the mobile device 110 may comprise a memory unit 140. The memory unit 140 may comprise any machine-readable or computer-readable media capable of storing data, including both volatile and non-volatile memory. For example, the memory unit 140 may include read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information. It is worthy to note that some portion or all of the memory unit 140 may be included on the same integrated circuit as the processor 130, or alternatively some portion or all of the memory unit 140 may be disposed on an integrated circuit or other medium, for example a hard disk drive, that is external to the integrated circuit of the processor 130. In an embodiment, the memory unit 140 may include data and instructions to operate the processor. The embodiments are not limited in this context.
  • In various embodiments, the mobile device 110 and the external display device 120 may communicate information over shared media 160 via respective wireless interface systems 150, 155. The shared media 160 may comprise one or more allocations of RF spectrum. The allocations of RF spectrum may be contiguous or non-contiguous. In some embodiments, the wireless interface systems 150, 155 may communicate information over the shared media 160 using various multicarrier techniques utilized by, for example, WiFi, WiDi, general packet radio service (GPRS), long term evolution (LTE) technologies, WiMAX and/or WiMAX II systems. In general operation, the wireless interface systems 150, 155 may communicate information using one or more communications channels. A communication channel may be a defined set of frequencies, time slots, codes, or combinations thereof.
  • The mobile device 110 may be coupled to the external display device 120 over the shared media 160. One or more logical or physical channels may be established to communicate information. The information may include media information and control information. Media information may refer to any data representing content meant for a user. Examples of content may include, for example, data from a voice conversation, videoconference, streaming video, electronic mail (“e-mail”) message, voice mail message (“voice message”), alphanumeric symbols, graphics, image, video, text and so forth. Data from a voice conversation may be, for example, speech information, silence periods, background noise, comfort noise, tones and so forth. Control information may refer to any data representing commands, instructions or control words meant for an automated system. For example, control information may be used to route media information through a system, or instruct a node to process the media information in a predetermined manner.
  • FIG. 2 illustrates a system within the mobile device 200. In an embodiment, a mobile device 200 may include a logic device 205. The logic device 205 may comprise various hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, components, processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth. Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
  • In an embodiment, the logic device 205 may be operative to detect that an external display is coupled to the mobile device. In an embodiment, the logic device 205 may be operative to detect that a physical external input device is coupled to the mobile device. In an embodiment, the external display and/or the physical external input device may be coupled via wires or wirelessly.
  • The logic device 205 may include a display manager 210. The display manager may be operative on the logic device. The display manager 210 may manage content presented on multiple displays. In an embodiment, the display manager 210 may manage content on the integrated touch sensitive display for the mobile device. In an embodiment, the display manager 210 may manage content on the external display for the mobile device.
  • In an embodiment, the display manager 210 may present multimedia content on an external display. As discussed above, multimedia content may include, but is not limited to, text, audio, video, symbols, images and/or animation.
  • In an embodiment, the display manager 210 may use a graphical user interface (GUI) component 215 to present a digital input device on an integrated touch sensitive display for the mobile device 200. In an embodiment, the GUI component 215 may generate one or more GUI views on the integrated touch sensitive display. The GUI component 215 may provide one or more images on the integrated touch sensitive display of the mobile device with which a user can interact.
  • FIGS. 3A-C illustrate block diagrams of possible embodiments of a mobile system 300. FIG. 3A illustrates a block diagram of an embodiment of a mobile system 300. In the illustrated embodiment shown in FIG. 3A, the mobile system 300 may include a mobile device 310 and an external display 320 for the mobile device 310. The external display 320 may be coupled to the mobile device 310 via a connection 360. In an embodiment, the connection 360 may be over the shared media 160, as shown in FIG. 1. In an embodiment, the connection 360 may be a wireless connection. In an embodiment, the mobile device 310 may connect to the external display 320 over the connection 360 using a Wireless Display (WiDi) channel. Examples of an external display 320 may include, but are not limited to, a television screen and/or a computer monitor.
  • A mobile device 310 may include, but is not limited to, a laptop, a notebook, a handheld computer, a handheld enclosure, a portable electronic device, a mobile internet device (MID), a tablet, a computing device with a touch screen, a slate and/or a personal digital assistant. The embodiments, however, are not limited to these examples. The mobile device 310 may include an integrated touch sensitive display 312. As discussed above, the integrated touch sensitive display 312 may be implemented as, but is not limited to, a touch screen, touch panel and/or touch screen panel.
  • When the mobile device 310 is coupled to the external display 320, the external display 320 may replicate the multimedia content currently displayed on the integrated touch sensitive display 312 of the mobile device 310. As discussed above, the multimedia content may include, but is not limited to, text, images, audio and video. Once the external display 320 and the mobile device 310 are coupled, the mobile device 310 may present a digital input device on the integrated touch sensitive display 312 while the external display 320 may continue to present the multimedia content.
  • In an embodiment, the combination of the external display 320 and the integrated touch sensitive display 312 of the mobile device 310 can mimic a desktop computer being docked with various input/output devices. For example, the external display 320 may function as the primary display while the integrated touch sensitive display 312 of the mobile device 310 may function as an input device.
  • An input device is a device which provides information to a processing device. Input devices may include, but are not limited to, a keyboard, a pointing device, a touch pad, a composite device, an imaging device and/or a video device. Further examples of input devices may include, but are not limited to, a mouse, a track pad, a digital camera, digital recorder, a webcam, a microphone, a scanner, a barcode reader, a pointing stick, a virtual pen, a joystick, a controller, a game controller, a remote, a computer keyboard, a trackball and/or any other type of physical input device. The embodiments, however, are not limited to these examples.
  • In an embodiment, the integrated touch sensitive display 312 of a mobile device 310 may display an input device. In an embodiment, the input device presented on the integrated touch sensitive display 312 may be referred to as a digital input device 370. The digital input device 370 may comprise hardware and/or software to replicate the functions of an external physical input device. In an embodiment, the digital input device 370 presented on the integrated touch sensitive display 312 may mimic the functionality of the physical input device it depicts. A digital input device 370 presented on the integrated touch sensitive display 312 of the mobile device 310 may provide information to the external display 320. Using the digital input device 370 may reduce and/or eliminate the need to couple the mobile device 310 to external physical input devices.
  • For example, an integrated touch sensitive display 312 of the mobile device 310 may display a keyboard as a digital input device 370. A keyboard may be presented on the integrated touch sensitive display to operate the mobile device and the external display may present multimedia content in response to operation of the mobile device through the keyboard. When a user touches a letter on the keyboard presented on the integrated touch sensitive display 312 of the mobile device 310, the letter may appear on the external display 320 as if the user touched that letter on an external keyboard device.
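  • One way to picture the keyboard example is the short Python sketch below; it is purely illustrative, and the row layout, display dimensions, and function name are assumptions rather than part of the disclosure. A touch point on the integrated touch sensitive display is resolved to the key region that contains it, and the resulting letter is what would appear on the external display.
```python
# Hypothetical on-screen keyboard hit-testing: the keyboard area of the integrated
# touch sensitive display is divided into a grid of key regions, and a touch is
# resolved to the key whose region contains it.
KEY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]   # illustrative layout
DISPLAY_WIDTH, DISPLAY_HEIGHT = 800, 300            # assumed touch area in pixels

def key_at(x, y):
    """Return the key under touch point (x, y), or None if outside the keyboard."""
    row_height = DISPLAY_HEIGHT / len(KEY_ROWS)
    row_index = int(y // row_height)
    if not 0 <= row_index < len(KEY_ROWS):
        return None
    row = KEY_ROWS[row_index]
    key_width = DISPLAY_WIDTH / len(row)
    col_index = int(x // key_width)
    if not 0 <= col_index < len(row):
        return None
    return row[col_index]

# A touch near the upper-left of the keyboard area resolves to "q"; the letter would
# then be presented on the external display as if typed on a physical keyboard.
print(key_at(10, 10))    # -> q
print(key_at(790, 290))  # -> m
```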
  • For example, an integrated touch sensitive display 312 of the mobile device 310 may display a trackball as a digital input device 370. A trackball may be presented on the integrated touch sensitive display to operate the mobile device and the external display may present multimedia content in response to operation of the mobile device through the trackball. When a user touches the trackball presented on the integrated touch sensitive display 312, a pointer from the location of the trackball may appear on the external display 320.
  • For example, an integrated touch sensitive display 312 of the mobile device 310 may display a video remote with one or more of a stop, start, fast forward, rewind and/or play button. A video remote may be presented on the integrated touch sensitive display to operate the mobile device and the external display may present multimedia content in response to operation of the mobile device through the video remote. When a user touches a button on the video remote presented on the integrated touch sensitive display 312, the video remote may stop, start, fast forward, rewind and/or play on the external display 320 based on the user's touch.
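  • The video remote example can be sketched the same way; the command names and the ten-second skip used below are illustrative assumptions. Each touched button becomes a playback command applied to the multimedia content presented on the external display.
```python
class VideoRemoteSketch:
    """Hypothetical playback controller driven by an on-screen video remote."""
    def __init__(self):
        self.state = "stopped"
        self.position = 0.0   # seconds into the content (illustrative)

    def handle_button(self, button):
        # Each button touched on the integrated touch sensitive display changes how
        # the multimedia content plays on the external display.
        if button in ("start", "play"):
            self.state = "playing"
        elif button == "stop":
            self.state, self.position = "stopped", 0.0
        elif button == "fast_forward":
            self.position += 10.0
        elif button == "rewind":
            self.position = max(0.0, self.position - 10.0)
        return f"{self.state} at {self.position:.0f}s"

remote = VideoRemoteSketch()
print(remote.handle_button("play"))          # -> playing at 0s
print(remote.handle_button("fast_forward"))  # -> playing at 10s
```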
  • The mobile device 310 may function as the input device presented on the integrated touch sensitive display. For example, the input device presented on the integrated touch sensitive display 312 of a mobile device 310 may be a keyboard. The keyboard presented on the integrated touch sensitive display of the mobile device may have the same functionality as an actual keyboard. For example, a user may touch the caps lock key and a letter key presented on the integrated touch sensitive display in order to capitalize a letter. Alternatively or in addition, a user may press multiple keys simultaneously to replicate a keyboard function, such as holding the shift key to capitalize a letter.
  • Alternatively or in addition, the mobile device 310 may mimic the functionality of an external physical input device via one or more visual images, sounds and/or vibrations. The mobile device 310 may provide sensory output such as, but not limited to, output for a user's sense of sight, hearing and touch. The sensory output from the integrated touch sensitive display 312 of the mobile device 310 may assist the user in using the digital input device 370.
  • In an embodiment, the integrated touch sensitive display 312 of the mobile device 310 may mimic the functionality of an external physical input device via a visual depiction. For example, the integrated touch sensitive display 312 of the mobile device 310 may display a keyboard as a digital input device 370. The integrated touch sensitive display 312 may mimic the keyboard via the visual depiction of the various buttons of a full sized keyboard device. A user may touch a key on the integrated touch sensitive display 312 to mimic the pressing of one or more keys. In an embodiment, the integrated touch sensitive display 312 may mimic a keyboard by displaying a depressed key when the presented key is touched by a user. For example, a user may touch the letter “u” and the letter “u” key presented on the integrated touch sensitive display 312 may show a depressed “u” key. In an embodiment, the integrated touch sensitive display 312 of the mobile device 310 may display a mouse with a scroll wheel. The integrated touch sensitive display 312 of the mobile device 310 may show a visual image of where the mouse previously moved on the integrated touch sensitive display 312. Alternatively or in addition, the integrated touch sensitive display 312 may depict a moving scroll wheel when a user touches the displayed scroll wheel. The examples are not limited to the embodiments described.
  • In an embodiment, the mobile device 310 may mimic the functionality of an external physical input device via one or more sounds. For example, if the digital input device 370 presented is a keyboard, the mobile device 310 may provide a clicking sound when a user touches a key presented on the integrated touch sensitive display 312. If the digital input device 370 presented is a mouse, the mobile device 310 may provide a clicking sound when a user touches the left and/or right buttons depicted on the displayed mouse. Alternatively or in addition, the mobile device 310 may provide a noise when a user moves a mouse around the integrated touch sensitive display 312. If the digital input device 370 presented is a barcode reader, the mobile device 310 may provide a beeping sound when a barcode is read. The examples are not limited to the embodiments described.
  • Alternatively or in addition, the integrated touch sensitive display 312 of the mobile device 310 may provide a vibration or sensation to a user when a user touches a part of the presented digital input device 370. In an embodiment, the mobile device 310 may mimic the functionality of the input device via a touch or tactile sensation. In an embodiment, the mobile device may contain a motor and/or other tactile technology in order to provide the sensation to a user. For example, if the digital input device 370 presented is a track pad, the mobile device 310 may provide the user with a sensation of the ball of a track pad rolling across the integrated touch sensitive display 312. If the input device presented is a mouse, the mobile device 310 may provide a vibration when the left and/or right buttons presented on the integrated touch sensitive display 312 are touched by a user. Alternatively or in addition, the mobile device 310 may provide a different vibration when a scrolling wheel of the mouse is touched by a user. If the input device presented is a keyboard, the mobile device 310 may provide a vibration when a key on the integrated touch sensitive display 312 is touched. The examples are not limited to the embodiments described.
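  • The sensory output described in the last three paragraphs can be summarized as a lookup from a touched element to the feedback the mobile device would produce; the table below is a hypothetical Python sketch, and the specific pairings are assumptions, not part of the disclosure.
```python
# Hypothetical feedback table: when a user touches part of a digital input device,
# the mobile device mimics the physical device with a visual change, a sound,
# and/or a vibration.
FEEDBACK = {
    ("keyboard", "key"):        {"visual": "show depressed key", "sound": "click", "haptic": "short vibration"},
    ("mouse", "button"):        {"sound": "click", "haptic": "vibration"},
    ("mouse", "scroll_wheel"):  {"visual": "animate wheel", "haptic": "different vibration"},
    ("barcode reader", "scan"): {"sound": "beep"},
}

def feedback_for(device, element):
    """Return the sensory output to produce for a touch on `element` of `device`."""
    return FEEDBACK.get((device, element), {})

print(feedback_for("keyboard", "key"))   # visual + click + vibration
```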
  • Using the integrated touch sensitive display 312 to replicate functions of an external physical input device may eliminate the need to connect one or more physical input devices to the mobile device 310. The integrated touch sensitive display 312 for a mobile device 310 may be utilized as a custom input device for the mobile device 310 and may be a productive resource of a mobile device 310.
  • FIG. 3B illustrates a block diagram of an embodiment of a mobile system 300. In an embodiment, a mobile system 300 may include multiple external displays 320A, 320B, 320C coupled to a mobile device 310. In an embodiment, the external displays 320A, 320B, 320C may be coupled to the mobile device 310 via a wired or wireless connection. In an embodiment, one or more of the multiple external displays 320A, 320B, 320C may be coupled to the mobile device via a Wireless Display (WiDi) channel.
  • In an embodiment, one or more of the external displays 320A, 320B, 320C may present multimedia content. In an embodiment, one or more of the external displays 320A, 320B, 320C may present the same multimedia content simultaneously. For example, various external displays 320A, 320B, 320C may present the same multimedia content simultaneously so that users in a large ballroom may view the content, such as a presentation, from wherever they are seated. Alternatively or in addition, one or more external displays 320A, 320B, 320C may present at least a portion of the multimedia content. For example, multiple displays may be used together to form one large display presenting the multimedia content. The examples are not limited to the embodiments described.
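  • A minimal sketch of routing content to multiple external displays 320A, 320B, 320C follows; it is illustrative only, content is reduced to a string, and the mode names are assumptions. In "clone" mode every display presents the same multimedia content, while in "tile" mode each display presents one contiguous portion of a single larger image.
```python
def route_content(content, displays, mode="clone"):
    """Map each external display to the content (or portion of content) it presents."""
    if mode == "clone":
        # Every display presents the same multimedia content simultaneously.
        return {display: content for display in displays}
    # "tile": split the content into one contiguous portion per display.
    portion = -(-len(content) // len(displays))   # ceiling division
    return {display: content[i * portion:(i + 1) * portion]
            for i, display in enumerate(displays)}

displays = ["320A", "320B", "320C"]
print(route_content("presentation", displays, mode="clone"))
print(route_content("presentation", displays, mode="tile"))
```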
  • FIG. 3C illustrates a block diagram of an embodiment of a mobile system 300. In an embodiment, an external input 380 may be coupled to a mobile device 310. In an embodiment, the external input 380 may be a physical input device. In an embodiment, the external input 380 may be coupled to the mobile device via a wired or wireless connection. In an embodiment, the external input 380 may be wirelessly coupled to the mobile device using the connection that coupled the mobile device 310 to the external display 320.
  • In an embodiment, the integrated touch sensitive display 312 on the mobile device 310 may be used to augment and/or enhance functions of the external input 380. For example, an external input 380 may be a physical pointing device such as a virtual pen. The virtual pen 380 may be coupled to a mobile device 310 and may be used to select content presented on an external display 320. However, the external display 320 may not provide sufficient resolution and/or granularity to specifically pin-point content presented on the external display 320. The digital input device 370 on the integrated touch sensitive display 312 of the mobile device 310 may provide a more detailed version of an area around a pointer positioned on the external display 320. This may allow a user to select a more precise location within the detailed version by tapping on the input device 370 on the integrated touch sensitive display 312 at the desired location.
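  • The precision-selection case can be made concrete with the hypothetical sketch below; the resolutions, zoom factor, and function names are assumptions. The integrated touch sensitive display shows a magnified view of the region around the pointer on the external display, and a tap on that view is mapped back to a precise external-display coordinate.
```python
EXTERNAL_RES = (1920, 1080)   # assumed external display resolution
INTEGRATED_RES = (800, 480)   # assumed integrated touch sensitive display resolution
ZOOM = 8                      # each external pixel spans ZOOM touch pixels

def magnified_origin(pointer):
    """Top-left corner of the external-display region shown on the touch display."""
    px, py = pointer
    return (px - INTEGRATED_RES[0] / (2 * ZOOM), py - INTEGRATED_RES[1] / (2 * ZOOM))

def tap_to_external(tap, pointer):
    """Convert a tap on the magnified view into external-display coordinates."""
    ox, oy = magnified_origin(pointer)
    tx, ty = tap
    return (ox + tx / ZOOM, oy + ty / ZOOM)

# Tapping the centre of the magnified view selects the pointer location itself;
# a nearby tap selects a position a fraction of an external pixel away.
print(tap_to_external((400, 240), pointer=(960, 540)))   # -> (960.0, 540.0)
print(tap_to_external((404, 240), pointer=(960, 540)))   # -> (960.5, 540.0)
```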
  • In an embodiment, the integrated touch sensitive display 312 on the mobile device 310 may be used as a supplemental device along with the external input 380. For example, a computer workstation often has multiple external devices such as a keyboard and a mouse. In an embodiment, the mobile system 300 may have a physical keyboard as an external input 380 and the input device 370 on the integrated touch sensitive display 312 of the mobile device 310 may be a mouse. By having the mobile device 310 act as an input device, a computer workstation may be easily simulated without the need for multiple external devices. The examples are not limited to the embodiments described.
  • FIG. 4 illustrates exemplary digital input devices 425A, 425B, 425C presented on an integrated touch sensitive display 410 of a mobile device. As shown in FIG. 4, the integrated touch sensitive display 410 on the mobile device 405 may alternate between various digital input devices 425A, 425B, 425C and the multimedia content 415 while coupled to an external display of the mobile device 405.
  • In an embodiment, the mobile device 405 may present a keyboard 425A as the digital input device 425 on the integrated touch sensitive display 410. In an embodiment, the integrated touch sensitive display 410 of the mobile device 405 may automatically present a keyboard 425A as the digital input device 425 when the mobile device 405 is coupled to an external display.
  • In an embodiment, the integrated touch sensitive display 410 of the mobile device 405 may wait to receive information about the digital input device 425. In an embodiment, the information may cause the integrated touch sensitive display 410 on the mobile device 405 to display a digital input device 425. The information may cause the mobile device to function as the digital input device 425. For example, a user may select one or more digital input devices 425A, 425B, 425C to be presented on the integrated touch sensitive display 410 from a set, list and/or group of input devices. For example, the mobile device 405 may receive information that a user selected a keyboard 425A to be presented as the digital input device 425. As a result, the mobile device 405 may present a keyboard 425A as the digital input device 425 on the integrated touch sensitive display 410 and the mobile device 405 may function as a keyboard.
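  • The selection behavior can be sketched as a small registry of available on-screen devices plus a handler for received selection information; the registry contents, default, and names below are assumptions, not part of the disclosure.
```python
# Hypothetical selection of the digital input device 425.
AVAILABLE_INPUT_DEVICES = {"keyboard", "track pad", "trackball", "mouse",
                           "video remote", "keyboard with track pad"}

class InputDeviceSelector:
    def __init__(self, default="keyboard"):
        self.current = default          # presented automatically on coupling

    def on_selection(self, requested):
        # Received information about a user selection switches the presented device
        # and the role the mobile device plays.
        if requested in AVAILABLE_INPUT_DEVICES:
            self.current = requested
        return self.current

selector = InputDeviceSelector()
print(selector.current)                      # -> keyboard (425A)
print(selector.on_selection("track pad"))    # -> track pad (425B)
```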
  • In an embodiment, a user may want to change the digital input device 425 presented on the integrated touch sensitive display 410 of the mobile device 405. In an embodiment, the integrated touch sensitive display 410 on the mobile device 405 may switch from displaying a first digital input device 425 to displaying a second digital input device 425. For example, a user may want to change from the keyboard 425A as the digital input device 425 to a track pad 425B as the digital input device 425. In an embodiment, the digital input device 425 displayed on the integrated touch sensitive display 410 may switch because an external device has been added to the mobile system.
  • Alternatively or in addition, a user may want to change to include multiple digital input devices 425 on the integrated touch sensitive display 410 of the mobile device 405. For example, a user may want to change to an integrated touch sensitive display 410 including a keyboard with a track pad 425C as the digital input device 425.
  • In an embodiment, an integrated touch sensitive display 410 on the mobile device 405 may switch back to being the primary display and may present at least a portion of the multimedia content 415 which is presented on the external display. When the integrated touch sensitive display 410 on the mobile device 405 switches to display the multimedia content 415, the integrated touch sensitive display 410 and the external display may have the same display.
  • Alternatively and/or in addition, an integrated touch sensitive display 410 on the mobile device 405 may present at least a portion of the multimedia content 415 which is presented on the external display. For example, the digital input device 425 may present a more detailed version of a portion of the media content 415 displayed on the external display screen.
  • FIG. 5 illustrates an embodiment of a first exemplary logic flow. Logic flow 500 may be representative of the operations executed by one or more embodiments described herein. It should be noted that the methods described herein do not have to be executed in the order described, or in any particular order. Moreover, various activities described with respect to the methods identified herein can be executed in serial or parallel fashion.
  • As shown in logic flow 500, an external display coupled to a mobile device may be detected 505. The mobile device may detect 505 that one or more external displays are coupled via a logic device 205, shown in FIG. 2. The logic device may detect that the mobile device is coupled to an external display via a wired connection. Examples of a wired connection include, but are not limited to, cables and/or cords. In an embodiment, the logic device may detect that the mobile device is coupled to an external display via a wireless connection. Examples of a wireless connection include, but are not limited to, Wireless Fidelity (WiFi) and/or Wireless Display (WiDi). The embodiments, however, are not limited to these examples.
  • Alternatively or in addition, the display manager 210, as shown in FIG. 2, in the mobile device may manage 507 the multiple display devices. In an embodiment, the mobile device may manage 507 an integrated touch sensitive display of the mobile device. In an embodiment, the mobile device may manage 507 one or more external displays.
  • In an embodiment, a digital input device may be presented 510 on the integrated touch sensitive display of the mobile device to operate the mobile device. For example, the integrated touch sensitive display of the mobile device may be a touch screen. A keyboard may be presented as the digital input device. The keyboard displayed on the integrated touch sensitive display may be used to operate the mobile device. The embodiments, however, are not limited to this example.
  • In an embodiment, multimedia content may be presented 515 on the external display of the mobile device in response to the operation of the mobile device through the digital input device. The multimedia content may come from an application executed by, or running on, the mobile device. As discussed above, multimedia content may include, but is not limited to, text, audio, video and images.
  • In an embodiment, the multimedia content may be presented on the integrated touch sensitive display of the mobile device prior to the mobile device being coupled to the external display. After the mobile device and the external display are coupled, the external display may present 515 the multimedia content. By presenting the multimedia content, the external display may become the primary display.
  • In an embodiment, the digital input device presented 510 on the integrated touch sensitive display of the mobile device may be different from the multimedia content presented 515 on the external display. In other words, rather than having both the integrated touch sensitive display on the mobile device and the external display present the same multimedia content, once the mobile device detects 505 that the mobile device is coupled to an external display device, the integrated touch sensitive display of the mobile device may present 510 a digital input device used to operate the mobile device.
  • In an embodiment, the mobile device may detect 505 multiple external displays coupled to the mobile device. In an embodiment, one or more of the external displays may present 515 the multimedia content in response to operation of the mobile device through the digital input device. For example, the external display may be a monitor and multiple monitors may present 515 the multimedia content. In an embodiment, one or more external displays may present 515 the same multimedia content simultaneously. For example, in a large ballroom, users may view various screens presenting 515 the same content, such as a movie.
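  • The blocks of logic flow 500 can be summarized in the procedural sketch below; the numbered comments refer to the blocks described above, and the callables passed in are hypothetical stand-ins for platform-specific detection, rendering, and application logic.
```python
def logic_flow_500(detect_external_displays, present_on_integrated,
                   present_on_external, run_application):
    displays = detect_external_displays()              # block 505: detect coupled displays
    if not displays:
        return
    # block 507: the display manager takes over management of all coupled displays
    present_on_integrated("digital input device")      # block 510: e.g. a keyboard
    for content in run_application():                  # operation via the digital input device
        for display in displays:
            present_on_external(display, content)      # block 515: multimedia content

# Usage with trivial stand-ins:
logic_flow_500(
    detect_external_displays=lambda: ["external monitor"],
    present_on_integrated=lambda what: print("integrated display ->", what),
    present_on_external=lambda display, content: print(display, "->", content),
    run_application=lambda: ["frame 1", "frame 2"],
)
```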
  • FIG. 6 illustrates an embodiment of a second exemplary logic flow. Logic flow 600 may be representative of the operations executed by one or more embodiments described herein. It should be noted that the methods described herein do not have to be executed in the order described, or in any particular order. Moreover, various activities described with respect to the methods identified herein can be executed in serial or parallel fashion.
  • In an embodiment, once an external display coupled to a mobile device is detected 605, the mobile device may receive 615 information about a digital input device. In an embodiment, the mobile device may determine whether it received 615 information about a digital input device. In an embodiment, a time threshold may be used. The time threshold may be 5 seconds, 30 seconds, 1 minute, 2 minutes, and/or other time values. If no information is received within the time threshold, then the mobile device may determine that no information was received 615.
  • In an embodiment, if the mobile device did not receive 615 information about a digital input device, then the mobile device may automatically present 620 a digital input device on the integrated touch sensitive display of the mobile device. For example, the integrated touch sensitive display of the mobile device may automatically present 620 a keyboard as the digital input device. Alternatively or in addition, the integrated touch sensitive display of the mobile device may automatically present 620 a mouse as the digital input device. In an embodiment, the digital input device that is automatically presented 620 may be previously determined. In an embodiment, the digital input device that is automatically presented 620 may be randomly chosen from the one or more digital input devices that can be presented on the integrated touch sensitive display.
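  • The receive-or-default behavior can be pictured with the sketch below; the threshold value, polling interval, and function names are assumptions. The mobile device waits up to a time threshold for information about a digital input device (block 615) and, if none arrives, automatically presents a predetermined default (block 620).
```python
import time

def choose_input_device(poll_for_selection, threshold_s=5.0, default="keyboard"):
    """Return the digital input device to present on the integrated touch display."""
    deadline = time.monotonic() + threshold_s
    while time.monotonic() < deadline:
        selection = poll_for_selection()   # e.g. a user tapping an entry in a list
        if selection is not None:
            return selection               # block 625: present the selected device
        time.sleep(0.1)
    return default                         # block 620: automatic presentation

# With no selection arriving, the default keyboard is chosen after the threshold.
print(choose_input_device(poll_for_selection=lambda: None, threshold_s=0.3))
```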
  • In an embodiment, a user may select a digital input device to present 625 on the integrated touch sensitive display of the mobile device. In an embodiment, after the mobile device detects 605 that it is coupled to an external display, the mobile device may receive 615 information about a digital input device and may present 625 the digital input device on the integrated touch sensitive display of the mobile device based on the received information.
  • In an embodiment, a user may provide the information received 615 by the mobile device about a digital input device by selecting a digital input device. For example, a user may touch a certain place on the integrated touch sensitive display which may cause the integrated touch sensitive display to display various digital input devices. In an embodiment, the integrated touch sensitive display may present a set, list and/or group of digital input devices. In an embodiment, in addition to the set, list or group of one or more digital input devices, the integrated touch sensitive display may also list an option of displaying all or a portion of the multimedia content.
  • In an embodiment, a user may provide the information received 615 by the mobile device by simply touching the integrated touch sensitive display. In an embodiment, the user may touch the integrated touch sensitive display of the mobile device with his/her finger to make a selection from a group, list and/or set. In an embodiment, the user may use a stylus or virtual pen to make a selection. The examples are not limited to the embodiments described.
  • In an embodiment, the mobile device may receive 615 information about a digital input device by receiving information that an external physical device was added to the mobile system. As a result of an added external physical device, the integrated touch sensitive display on the mobile system may present a digital input device which provides a more detailed version of a part of the external display screen. Alternatively or in addition, as a result of an added external physical device, the integrated touch sensitive display on the mobile system may present a different digital input device. In an embodiment, the digital input device may complement the external physical device. For example, when a keyboard is coupled to a mobile device, the integrated touch sensitive display may present a mouse.
  • In an embodiment, a user may switch the digital input device presented on the integrated touch sensitive display of the mobile device. A user may hot swap the digital input device presented on the integrated touch sensitive display of the mobile device. In an embodiment, the mobile device may receive 615 information about a digital input device after a digital input device is automatically presented 620 on the integrated touch sensitive display of the mobile device. For example, a full size keyboard may be presented on the display. Information may be received 615 and a track pad may be hot swapped so that the track pad is displayed on the integrated touch sensitive display. Additionally and/or alternatively, a mobile device may receive 615 information about a first digital input device and then receive 615 information about a second digital input device.
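  • The complement and hot-swap behaviors just described can be combined into one small decision function; the pairings and event labels below are illustrative assumptions.
```python
# Hypothetical handling of new information about the digital input device: an added
# external physical device is complemented by a different on-screen device (e.g. a
# physical keyboard is complemented by an on-screen mouse), while a direct user
# selection hot swaps whatever is currently presented.
COMPLEMENTS = {"keyboard": "mouse", "mouse": "keyboard"}

def next_presented_device(current, info):
    """Return the digital input device to present after receiving `info` (block 615)."""
    kind, value = info
    if kind == "external_device_added":
        return COMPLEMENTS.get(value, current)   # complement the physical device
    if kind == "user_selection":
        return value                             # hot swap to the selected device
    return current

print(next_presented_device("keyboard", ("external_device_added", "keyboard")))  # -> mouse
print(next_presented_device("mouse", ("user_selection", "track pad")))           # -> track pad
```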
  • In an embodiment, the mobile device may change the integrated touch sensitive display to present 625 the input device based on the received information. For example, the integrated touch sensitive display of the mobile device may automatically present 620 a keyboard when the mobile device is coupled to the external display. After a user has used the mobile device as a keyboard, the user may want the integrated touch sensitive display to present 625 a track pad as the digital input device. In an embodiment, the mobile device may receive 615 information about a digital input device. For example, the mobile device may receive 615 information that a user selected a track pad as the digital input device. The mobile device may present 625 the track pad on the integrated touch sensitive display.
  • Alternatively or in addition, the mobile device may receive 615 information that the integrated touch sensitive display should present all or a portion of the multimedia content. The mobile device may present all or a portion of the multimedia content on the integrated touch sensitive display. In an embodiment, an enlarged or more detailed portion of the multimedia content may be presented on the integrated touch sensitive display as the digital input device.
  • FIG. 7 illustrates an embodiment of an exemplary computing architecture 700 suitable for implementing various embodiments as previously described. As used in this application, the terms “system” and “component” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution, examples of which are provided by the exemplary computing architecture 700. For example, a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers. Further, components may be communicatively coupled to each other by various types of communications media to coordinate operations. The coordination may involve the uni-directional or bi-directional exchange of information. For instance, the components may communicate information in the form of signals communicated over the communications media. The information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
  • In one embodiment, the computing architecture 700 may comprise or be implemented as part of an electronic device. Examples of an electronic device may include without limitation a mobile device, a personal digital assistant, a mobile computing device, a smart phone, a cellular telephone, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a tablet computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, television, digital television, set top box, wireless access point, base station, subscriber station, mobile subscriber center, radio network controller, router, hub, gateway, bridge, switch, machine, or combination thereof. The embodiments are not limited in this context.
  • The computing architecture 700 includes various common computing elements, such as one or more processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, and so forth. The embodiments, however, are not limited to implementation by the computing architecture 700.
  • As shown in FIG. 7, the computing architecture 700 comprises a processing unit 704, a system memory 706 and a system bus 708. The processing unit 704 can be any of various commercially available processors. Dual microprocessors and other multi-processor architectures may also be employed as the processing unit 704. The system bus 708 provides an interface for system components including, but not limited to, the system memory 706 to the processing unit 704. The system bus 708 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • The computing architecture 700 may comprise or implement various articles of manufacture. An article of manufacture may comprise a computer-readable storage medium to store logic. Examples of a computer-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of logic may include executable computer program instructions implemented using any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like.
  • The system memory 706 may include various types of computer-readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information. In the illustrated embodiment shown in FIG. 7, the system memory 706 can include non-volatile memory 710 and/or volatile memory 712. A basic input/output system (BIOS) can be stored in the non-volatile memory 710.
  • The computer 702 may include various types of computer-readable storage media in the form of one or more lower speed memory units, including an internal hard disk drive (HDD) 714, a magnetic floppy disk drive (FDD) 716 to read from or write to a removable magnetic disk 718, and an optical disk drive 720 to read from or write to a removable optical disk 722 (e.g., a CD-ROM or DVD). The HDD 714, FDD 716 and optical disk drive 720 can be coupled to the system bus 708 by a HDD interface 724, an FDD interface 726 and an optical drive interface 728, respectively. The HDD interface 724 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
  • The drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For example, a number of program modules can be stored in the drives and memory units 710, 712, including an operating system 730, one or more application programs 732, other program modules 734, and program data 736. The one or more application programs 732, other program modules 734, and program data 736 can include, for example, the decoder.
  • A user can enter commands and information into the computer 702 through one or more wire/wireless input devices, for example, a keyboard 738 and a pointing device, such as a mouse 740. Other input devices may include a microphone, an infra-red (IR) remote control, a joystick, a game pad, a stylus pen, touch screen, or the like. These and other input devices are often connected to the processing unit 704 through an input device interface 742 that is coupled to the system bus 708, but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.
  • A monitor 744 or other type of display device is also connected to the system bus 708 via an interface, such as a video adaptor 746. In addition to the monitor 744, a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.
  • The computer 702 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 748. The remote computer 748 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network device, and typically includes many or all of the elements described relative to the computer 702, although, for purposes of brevity, only a memory/storage device 750 is illustrated. The logical connections depicted include wire/wireless connectivity to a local area network (LAN) 752 and/or larger networks, for example, a wide area network (WAN) 754. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.
  • When used in a LAN networking environment, the computer 702 is connected to the LAN 752 through a wire and/or wireless communication network interface or adaptor 756. The adaptor 756 can facilitate wire and/or wireless communications to the LAN 752, which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 756.
  • When used in a WAN networking environment, the computer 702 can include a modem 758, or is connected to a communications server on the WAN 754, or has other means for establishing communications over the WAN 754, such as by way of the Internet. The modem 758, which can be internal or external and a wire and/or wireless device, connects to the system bus 708 via the input device interface 742. In a networked environment, program modules depicted relative to the computer 702, or portions thereof, can be stored in the remote memory/storage device 750. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
  • The computer 702 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques) with, for example, a printer, scanner, desktop and/or portable computer, personal digital assistant (PDA), communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone. This includes at least Wi-Fi (or Wireless Fidelity), WiMax, and Bluetooth™ wireless technologies. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices. Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, n, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).
  • Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Further, some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
  • It is emphasized that the Abstract of the Disclosure is provided to allow a reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein,” respectively. Moreover, the terms “first,” “second,” “third,” and so forth, are used merely as labels, and are not intended to impose numerical requirements on their objects.
  • What has been described above includes examples of the disclosed architecture. It is, of course, not possible to describe every conceivable combination of components and/or methodologies, but one of ordinary skill in the art may recognize that many further combinations and permutations are possible. Accordingly, the novel architecture is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims.

Claims (20)

1. An article comprising a machine-readable storage medium containing instructions that when executed enable a system to:
detect an external display coupled to a mobile device;
present a digital input device on an integrated touch sensitive display of the mobile device to operate the mobile device; and
present multimedia content on the external display in response to the operation of the mobile device through the digital input device.
2. The article of claim 1, said instructions that when executed enable a system to detect an external display comprise instructions that when executed enable a system to detect that the external display is coupled wirelessly to the mobile device.
3. The article of claim 1, said instructions that when executed enable a system to detect an external display comprise instructions that when executed enable a system to detect that the external display is coupled to the mobile device using one or more of: a High-Definition Multimedia Interface (HDMI) channel, a digital visual interface (DVI) channel, a video graphics array (VGA) channel, and Wireless Display (WiDi) channel.
4. The article of claim 1, said instructions that when executed enable a system to present a digital input device on an integrated touch sensitive display of the mobile device comprise instructions that when executed enable a system to present a keyboard as the digital input device on the integrated touch sensitive display of the mobile device.
5. The article of claim 1, said instructions that when executed enable a system to present a digital input device on an integrated touch sensitive display of the mobile device comprise instructions that when executed enable a system to present a track pad as the digital input device on the integrated touch sensitive display of the mobile device.
6. The article of claim 1, said instructions that when executed enable a system to present a digital input device on an integrated touch sensitive display of the mobile device comprise instructions that when executed enable a system to present an enlarged portion of the multimedia content as the digital input device on the integrated touch sensitive display of the mobile device.
7. The article of claim 1, said instructions that when executed enable a system to present a digital input device on an integrated touch sensitive display of the mobile device comprise instructions that when executed enable a system to present two or more digital input devices simultaneously on the integrated touch sensitive display of the mobile device.
8. The article of claim 1, said instructions that when executed enable a system to present a digital input device on an integrated touch sensitive display of the mobile device comprise instructions that when executed enable a system to:
receive information about a digital input device; and
present the digital input device on the integrated touch sensitive display based on the information.
9. The article of claim 1, said instructions that when executed enable a system to present a digital input device on an integrated touch sensitive display of the mobile device comprise instructions that when executed enable a system to:
present a keyboard as the digital input device on the integrated touch sensitive display;
receive information about a second digital input device; and
present a track pad on the integrated touch sensitive display based on the received information.
10. A method comprising:
detecting an external display coupled to a mobile device;
presenting a digital input device on an integrated touch sensitive display of the mobile device to operate the mobile device; and
presenting multimedia content on the external display in response to the operation of the mobile device through the digital input device.
11. The method of claim 10, the detecting an external display coupled to a mobile device comprises detecting that the external display device is wirelessly coupled to the mobile device.
12. The method of claim 10, the detecting an external display coupled to a mobile device comprises detecting one or more of: a High-Definition Multimedia Interface (HDMI) channel, a digital visual interface (DVI) channel, a video graphics array (VGA) channel, and Wireless Display (WiDi) channel between the mobile device and the external display.
13. The method of claim 10, the presenting multimedia content on the external display comprises:
presenting at least a portion of the multimedia content on the integrated touch sensitive display of the mobile device.
14. The method of claim 10, the presenting a digital input device on an integrated touch sensitive display of the mobile device comprises:
receiving information about a digital input device; and
presenting the digital input device on the integrated touch sensitive display based on the information.
15. The method of claim 10, the presenting a digital input device on an integrated touch sensitive display of the mobile device comprises:
presenting a keyboard as the digital input device on the integrated touch sensitive display;
receiving information about a second digital input device; and
presenting a track pad on the integrated touch sensitive display based on the received information.
16. A mobile device, comprising:
a logic device; and
a display manager operative on the logic device to manage content presented on multiple displays, the display manager to use a graphical user interface (GUI) to present a digital input device on an integrated touch sensitive display for a mobile device to operate the mobile device, and present multimedia content on an external display in response to the operation of the mobile device through the digital input device.
17. The device of claim 16, the logic device operative to detect that a physical external input device is coupled to the mobile device.
18. The device of claim 16, comprising:
a wireless interface system operative to wirelessly couple to the external display.
19. The device of claim 16, the integrated touch sensitive display operative to display one or more of a keyboard and/or a track pad as the digital input device.
20. The device of claim 16, comprising:
a digital display.
US13/189,706 2011-07-25 2011-07-25 Techniques to display an input device on a mobile device Abandoned US20130027315A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US13/189,706 US20130027315A1 (en) 2011-07-25 2011-07-25 Techniques to display an input device on a mobile device
CN201710941245.8A CN107678675B (en) 2011-07-25 2012-07-25 Method and apparatus for displaying input device on mobile device
PCT/US2012/048056 WO2013016383A1 (en) 2011-07-25 2012-07-25 Techniques for displaying an input device on a mobile device
CN201280036752.2A CN103703452A (en) 2011-07-25 2012-07-25 Techniques for displaying an input device on a mobile device
US15/854,108 US10956020B2 (en) 2011-07-25 2017-12-26 Techniques to display an input device on a mobile device
US17/206,855 US11237721B2 (en) 2011-07-25 2021-03-19 Techniques to display an input device on a mobile device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/189,706 US20130027315A1 (en) 2011-07-25 2011-07-25 Techniques to display an input device on a mobile device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/854,108 Continuation US10956020B2 (en) 2011-07-25 2017-12-26 Techniques to display an input device on a mobile device

Publications (1)

Publication Number Publication Date
US20130027315A1 true US20130027315A1 (en) 2013-01-31

Family

ID=47596812

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/189,706 Abandoned US20130027315A1 (en) 2011-07-25 2011-07-25 Techniques to display an input device on a mobile device
US15/854,108 Active US10956020B2 (en) 2011-07-25 2017-12-26 Techniques to display an input device on a mobile device
US17/206,855 Active US11237721B2 (en) 2011-07-25 2021-03-19 Techniques to display an input device on a mobile device

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/854,108 Active US10956020B2 (en) 2011-07-25 2017-12-26 Techniques to display an input device on a mobile device
US17/206,855 Active US11237721B2 (en) 2011-07-25 2021-03-19 Techniques to display an input device on a mobile device

Country Status (3)

Country Link
US (3) US20130027315A1 (en)
CN (2) CN103703452A (en)
WO (1) WO2013016383A1 (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130160095A1 (en) * 2011-12-14 2013-06-20 Nokia Corporation Method and apparatus for presenting a challenge response input mechanism
US20130246905A1 (en) * 2012-03-19 2013-09-19 Kabushiki Kaisha Toshiba Information generator, information output device, and recording medium
US20140049493A1 (en) * 2012-08-17 2014-02-20 Konica Minolta, Inc. Information device, and computer-readable storage medium for computer program
US20140145969A1 (en) * 2012-11-29 2014-05-29 Research In Motion Limited System and method for graphic object management in a large-display area computing device
US20140210713A1 (en) * 2013-01-31 2014-07-31 Samsung Electronics Co., Ltd. Method for controlling display of pointer and displaying the pointer, and apparatus thereof
US20140247207A1 (en) * 2013-03-04 2014-09-04 Microsoft Corporation Causing Specific Location of an Object Provided to a Device
US20140320423A1 (en) * 2013-04-29 2014-10-30 Srikanth Kambhatla Supporting Keyboard and Mouse Over Embedded DisplayPort Without Using a Universal Serial Bus
US20150230038A1 (en) * 2014-02-07 2015-08-13 Boe Technology Group Co., Ltd. Information display method, information display device, and display apparatus
CN105025124A (en) * 2015-06-25 2015-11-04 尚诚德 Intelligent mobile phone capable of matching mobile phone computer
CN105027128A (en) * 2013-02-28 2015-11-04 通用电气公司 Handheld medical imaging apparatus with cursor pointer control
US20150355611A1 (en) * 2014-06-06 2015-12-10 Honeywell International Inc. Apparatus and method for combining visualization and interaction in industrial operator consoles
WO2015199851A1 (en) * 2014-06-27 2015-12-30 Google Inc. Streaming display data from a mobile device using backscatter communications
US20160085396A1 (en) * 2014-09-24 2016-03-24 Microsoft Corporation Interactive text preview
CN106062696A (en) * 2014-03-31 2016-10-26 惠普发展公司,有限责任合伙企业 Three-part gesture
US20160343350A1 (en) * 2015-05-19 2016-11-24 Microsoft Technology Licensing, Llc Gesture for task transfer
US20180007104A1 (en) 2014-09-24 2018-01-04 Microsoft Corporation Presentation of computing environment on multiple devices
EP3486764A1 (en) * 2017-11-21 2019-05-22 Samsung Electronics Co., Ltd. Method for configuring input interface and electronic device using same
US10448111B2 (en) 2014-09-24 2019-10-15 Microsoft Technology Licensing, Llc Content projection
US10635296B2 (en) 2014-09-24 2020-04-28 Microsoft Technology Licensing, Llc Partitioned application presentation across devices
US10824531B2 (en) 2014-09-24 2020-11-03 Microsoft Technology Licensing, Llc Lending target device resources to host device computing environment
EP3937149A1 (en) * 2020-07-08 2022-01-12 Hongfujin Precision Electronics (Tianjin) Co., Ltd. Small and portable input device with functions for starting and controlling document display

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017119125A1 (en) * 2017-08-22 2019-02-28 Roccat GmbH Apparatus and method for generating moving light effects
WO2022162436A1 (en) * 2021-02-01 2022-08-04 Mokutu Emmanuel Method for replicating an input device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060090122A1 (en) * 2004-10-21 2006-04-27 Nokia Corporation Group editing of media content stored on wireless portable devices
US20100138780A1 (en) * 2008-05-20 2010-06-03 Adam Marano Methods and systems for using external display devices with a mobile computing device
US20100245267A1 (en) * 2009-03-31 2010-09-30 Lg Electronics Inc. Mobile terminal and method of controlling the same
US20100257473A1 (en) * 2009-04-01 2010-10-07 Samsung Electronics Co., Ltd. Method for providing gui and multimedia device using the same
US7844301B2 (en) * 2005-10-14 2010-11-30 Lg Electronics Inc. Method for displaying multimedia contents and mobile communications terminal capable of implementing the same
US8458255B2 (en) * 2008-06-20 2013-06-04 Sharp Kabushiki Kaisha Data output device, data providing device, data output system, data output device control method, and data providing device control method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060079214A1 (en) * 2004-10-12 2006-04-13 Nokia Corporation Method and apparatus for showing wireless mobile device data content on an external viewer
KR100755851B1 (en) * 2005-10-14 2007-09-07 엘지전자 주식회사 Method for playing multimedia contents, mobile terminal capable of implementing the same, and cradle for the mobile terminal
JP2008211379A (en) * 2007-02-23 2008-09-11 Fujitsu Ltd Display control program and portable terminal device
TWI420344B (en) * 2007-12-31 2013-12-21 Htc Corp Method for switching touch keyboard and handheld electronic device and storage medium using the same
CN101500036A (en) * 2009-01-06 2009-08-05 深圳华为通信技术有限公司 Method, mobile terminal and projector for controlling display content of projector
US20100216508A1 (en) * 2009-02-23 2010-08-26 Augusta Technology, Inc. Systems and Methods for Driving an External Display Device Using a Mobile Phone Device
KR101695810B1 (en) * 2010-05-07 2017-01-13 엘지전자 주식회사 Mobile terminal and method for controlling thereof

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060090122A1 (en) * 2004-10-21 2006-04-27 Nokia Corporation Group editing of media content stored on wireless portable devices
US7844301B2 (en) * 2005-10-14 2010-11-30 Lg Electronics Inc. Method for displaying multimedia contents and mobile communications terminal capable of implementing the same
US20100138780A1 (en) * 2008-05-20 2010-06-03 Adam Marano Methods and systems for using external display devices with a mobile computing device
US8458255B2 (en) * 2008-06-20 2013-06-04 Sharp Kabushiki Kaisha Data output device, data providing device, data output system, data output device control method, and data providing device control method
US20100245267A1 (en) * 2009-03-31 2010-09-30 Lg Electronics Inc. Mobile terminal and method of controlling the same
US20100257473A1 (en) * 2009-04-01 2010-10-07 Samsung Electronics Co., Ltd. Method for providing gui and multimedia device using the same

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130160095A1 (en) * 2011-12-14 2013-06-20 Nokia Corporation Method and apparatus for presenting a challenge response input mechanism
US20130246905A1 (en) * 2012-03-19 2013-09-19 Kabushiki Kaisha Toshiba Information generator, information output device, and recording medium
US20140049493A1 (en) * 2012-08-17 2014-02-20 Konica Minolta, Inc. Information device, and computer-readable storage medium for computer program
US20140145969A1 (en) * 2012-11-29 2014-05-29 Research In Motion Limited System and method for graphic object management in a large-display area computing device
US9513795B2 (en) * 2012-11-29 2016-12-06 Blackberry Limited System and method for graphic object management in a large-display area computing device
US20140210713A1 (en) * 2013-01-31 2014-07-31 Samsung Electronics Co., Ltd. Method for controlling display of pointer and displaying the pointer, and apparatus thereof
US9600088B2 (en) * 2013-01-31 2017-03-21 Samsung Electronics Co., Ltd. Method and apparatus for displaying a pointer on an external display
CN105027128A (en) * 2013-02-28 2015-11-04 通用电气公司 Handheld medical imaging apparatus with cursor pointer control
US20160004330A1 (en) * 2013-02-28 2016-01-07 General Electric Company Handheld medical imaging apparatus with cursor pointer control
US10139925B2 (en) * 2013-03-04 2018-11-27 Microsoft Technology Licensing, Llc Causing specific location of an object provided to a device
US20140247207A1 (en) * 2013-03-04 2014-09-04 Microsoft Corporation Causing Specific Location of an Object Provided to a Device
US20140320423A1 (en) * 2013-04-29 2014-10-30 Srikanth Kambhatla Supporting Keyboard and Mouse Over Embedded DisplayPort Without Using a Universal Serial Bus
US9417726B2 (en) * 2013-04-29 2016-08-16 Intel Corporation Supporting keyboard and mouse over embedded displayport without using a universal serial bus
US20150230038A1 (en) * 2014-02-07 2015-08-13 Boe Technology Group Co., Ltd. Information display method, information display device, and display apparatus
US9439016B2 (en) * 2014-02-07 2016-09-06 Boe Technology Group Co., Ltd. Information display method, information display device, and display apparatus
CN106062696A (en) * 2014-03-31 2016-10-26 惠普发展公司,有限责任合伙企业 Three-part gesture
US20150355611A1 (en) * 2014-06-06 2015-12-10 Honeywell International Inc. Apparatus and method for combining visualization and interaction in industrial operator consoles
US9792082B1 (en) 2014-06-27 2017-10-17 X Development Llc Streaming display data from a mobile device using backscatter communications
US9602191B2 (en) 2014-06-27 2017-03-21 X Development Llc Streaming display data from a mobile device using backscatter communications
WO2015199851A1 (en) * 2014-06-27 2015-12-30 Google Inc. Streaming display data from a mobile device using backscatter communications
US10448111B2 (en) 2014-09-24 2019-10-15 Microsoft Technology Licensing, Llc Content projection
CN106716355A (en) * 2014-09-24 2017-05-24 微软技术许可有限责任公司 Interactive text preview
US10824531B2 (en) 2014-09-24 2020-11-03 Microsoft Technology Licensing, Llc Lending target device resources to host device computing environment
US20180007104A1 (en) 2014-09-24 2018-01-04 Microsoft Corporation Presentation of computing environment on multiple devices
US10635296B2 (en) 2014-09-24 2020-04-28 Microsoft Technology Licensing, Llc Partitioned application presentation across devices
US20160085396A1 (en) * 2014-09-24 2016-03-24 Microsoft Corporation Interactive text preview
US10277649B2 (en) 2014-09-24 2019-04-30 Microsoft Technology Licensing, Llc Presentation of computing environment on multiple devices
US20160343350A1 (en) * 2015-05-19 2016-11-24 Microsoft Technology Licensing, Llc Gesture for task transfer
US10102824B2 (en) * 2015-05-19 2018-10-16 Microsoft Technology Licensing, Llc Gesture for task transfer
CN105025124A (en) * 2015-06-25 2015-11-04 尚诚德 Intelligent mobile phone capable of matching mobile phone computer
US20190156788A1 (en) * 2017-11-21 2019-05-23 Samsung Electronics Co., Ltd. Method for configuring input interface and electronic device using same
KR20190058067A (en) * 2017-11-21 2019-05-29 삼성전자주식회사 Method for configuring input interface and electronic device using the same
EP3486764A1 (en) * 2017-11-21 2019-05-22 Samsung Electronics Co., Ltd. Method for configuring input interface and electronic device using same
US10733959B2 (en) 2017-11-21 2020-08-04 Samsung Electronics Co., Ltd. Method for configuring input interface and electronic device using same
KR102500666B1 (en) * 2017-11-21 2023-02-16 삼성전자 주식회사 Method for configuring input interface and electronic device using the same
EP3937149A1 (en) * 2020-07-08 2022-01-12 Hongfujin Precision Electronics (Tianjin) Co., Ltd. Small and portable input device with functions for starting and controlling document display

Also Published As

Publication number Publication date
US20180121051A1 (en) 2018-05-03
CN107678675A (en) 2018-02-09
US20210208757A1 (en) 2021-07-08
CN107678675B (en) 2021-09-21
US10956020B2 (en) 2021-03-23
WO2013016383A1 (en) 2013-01-31
US11237721B2 (en) 2022-02-01
CN103703452A (en) 2014-04-02

Similar Documents

Publication Publication Date Title
US11237721B2 (en) Techniques to display an input device on a mobile device
US11561754B2 (en) Electronic device and method for displaying and transmitting images thereof
US20170235435A1 (en) Electronic device and method of application data display therefor
US10659200B2 (en) Companion application for activity cooperation
US9591680B2 (en) Method for controlling system including electronic tag, mobile device, and display device, and mobile device and display device of the same
EP3040804B1 (en) Electronic device for controlling power and method therefor
US8803828B2 (en) Method for controlling operation of touch panel and portable terminal supporting the same
US20160133052A1 (en) Virtual environment for sharing information
US10452232B2 (en) Method and an electronic device for one-hand user interface
US20150304336A1 (en) Multi-screen interaction method of multimedia resource and terminal device
US10552182B2 (en) Multiple display device and method of operating the same
US20160004425A1 (en) Method of displaying graphic user interface and electronic device implementing same
US20160351047A1 (en) Method and system for remote control of electronic device
US20140359664A1 (en) Display apparatus, method of controlling display apparatus, and computer-readable recording medium
CN106569758A (en) Wireless projection screen method and device
US20160182603A1 (en) Browser Display Casting Techniques
US10469645B2 (en) Method and apparatus for creating communication group
US20150293686A1 (en) Apparatus and method for controlling home screen
CN106201393B (en) Information processing method and electronic equipment
US9612790B2 (en) Method and electronic device for providing frame information
US10241634B2 (en) Method and apparatus for processing email in electronic device
KR20140090393A (en) Method for providing user interface of terminal, system and apparatus thereof
EP3319328A1 (en) Streaming service method and device
US9848287B1 (en) Adaptable schema based payloads
KR20150081471A (en) Operating Method For Screen Data and Electronic Device Supporting The Same

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TENG, ARTHER SING HOOK;REEL/FRAME:026641/0532

Effective date: 20110725

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION