US11126396B2 - Audio output device selection - Google Patents
Audio output device selection
- Publication number
- US11126396B2 (US application US16/369,468)
- Authority
- US
- United States
- Prior art keywords
- application window
- display devices
- connected display
- audio output
- output device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1446—Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display; display composed of modules, e.g. video walls
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2300/00—Aspects of the constitution of display devices
- G09G2300/02—Composition of display devices
- G09G2300/026—Video wall, i.e. juxtaposition of a plurality of screens to create a display screen of bigger dimensions
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
- G09G2340/0478—Horizontal positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/04—Display device controller operating with a plurality of display units
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
- G09G2370/042—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller for monitor identification
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/14—Use of low voltage differential signaling [LVDS] for display data communication
Definitions
- Information handling devices, for example laptop and personal computers, smart phones, tablet devices, televisions, other electronic devices, and the like, may comprise one or more integrally or operatively coupled audio output devices.
- the audio output devices may provide audio for events originating from active applications resident on a display screen of the device.
- the audio output device corresponding to the display screen may be set by a manufacturer and/or assigned by a user.
- one aspect provides a method, comprising: detecting, at an information handling device, a position of an application window on one of at least two connected display devices operatively coupled to the information handling device; determining, using a processor, an audio output device associated with the one of the at least two connected display devices; and directing audio originating from the application window to the audio output device.
- an information handling device comprising: at least two connected display devices; an audio output device; a processor; a memory device that stores instructions executable by the processor to: detect a position of an application window on one of the at least two connected display devices operatively coupled to the information handling device; determine an audio output device associated with the one of the at least two connected display devices; and direct audio originating from the application window to the audio output device.
- a further aspect provides a product, comprising: a storage device that stores code, the code being executable by a processor and comprising: code that detects a position of an application window on one of at least two connected display devices; code that determines an audio output device associated with the one of the at least two connected display devices; and code that directs audio originating from the application window to the audio output device.
- FIG. 1 illustrates an example of information handling device circuitry.
- FIG. 2 illustrates another example of information handling device circuitry.
- FIG. 3 illustrates an example method of directing audio originating from an application window to an audio output device.
- FIG. 4 illustrates an example method of switching an audio output device associated with a display device.
- FIG. 5 illustrates an example embodiment of a multi-display device system.
- FIG. 6 illustrates an example embodiment of a multi-display device system.
- a manufacturer may designate a default audio output device at the time of production or, alternatively, a user may manually assign a default audio output device within an application or operating system interface (e.g., by interacting with a settings menu, etc.).
- a user may desire to have output audio derive from the display comprising the sound-producing source (e.g., an application window playing a movie may have sound associated with the movie derive from the display device on which the application window is displayed, etc.).
- audio streams may be directed to different audio output devices based upon the type of audio stream.
- a user may elect to have audio associated with a VOIP call be directed to a headset whereas audio associated with all other audio streams (e.g., audio from a media player, a game, another application, etc.) be directed to a separate audio output device (e.g., another device, dedicated speakers, etc.).
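The type-based routing described above can be sketched as a simple lookup table. The stream labels and device names here are illustrative assumptions, not an API defined by the disclosure.

```python
# Hypothetical type-based routing: VOIP audio goes to a headset,
# all other stream types fall back to a default output device.
ROUTES = {"voip": "headset"}
DEFAULT_OUTPUT = "speakers"

def output_for(stream_type):
    """Pick an output device for an audio stream based on its type."""
    return ROUTES.get(stream_type, DEFAULT_OUTPUT)
```

A real system would key this table off user-configurable settings rather than hard-coded names.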
- an embodiment may dynamically direct audio to a particular audio output device based upon a display device the audio-producing source is resident on.
- a multi-display system may exist in which two or more displays are operatively coupled together, each of which comprises an integrated audio output device (e.g., a dual-screen monitor PC setup, etc.).
- An embodiment may first identify the position of an application window in the multi-display system. In this regard, an embodiment may identify the display device in the multi-display system on which the application window is being displayed. An embodiment may then direct audio generated from the application window to an audio output device associated with the display device displaying the application window (e.g., to an integral audio output source of the display device, etc.).
- an embodiment may automatically switch an output designation of the audio from the first audio output device to another audio output device associated with the other display device. Such a method may negate the need for a user to manually navigate through computer preferences and select an audio output device correlated to a relevant display device.
- FIG. 1 includes a system-on-a-chip design found, for example, in tablet or other mobile computing platforms.
- Software and processor(s) are combined in a single chip 110 .
- Processors comprise internal arithmetic units, registers, cache memory, busses, I/O ports, etc., as is well known in the art. Internal busses and the like depend on different vendors, but essentially all the peripheral devices ( 120 ) may attach to a single chip 110 .
- the circuitry 100 combines the processor, memory control, and I/O controller hub all into a single chip 110 .
- systems 100 of this type do not typically use SATA or PCI or LPC. Common interfaces, for example, include SDIO and I2C.
- power management chip(s) 130, e.g., a battery management unit (BMU), manage power as supplied, for example, via a rechargeable battery 140, which may be recharged by a connection to a power source (not shown).
- a single chip, such as 110, is used to supply BIOS-like functionality and DRAM memory.
- System 100 typically includes one or more of a WWAN transceiver 150 and a WLAN transceiver 160 for connecting to various networks, such as telecommunications networks and wireless Internet devices, e.g., access points. Additionally, devices 120 are commonly included, e.g., an image sensor such as a camera, audio capture device such as a microphone, audio output device such as an integral or operatively coupled speaker, etc. System 100 often includes one or more touch screens 170 for data input and display/rendering. System 100 also typically includes various memory devices, for example flash memory 180 and SDRAM 190 .
- FIG. 2 depicts a block diagram of another example of information handling device circuits, circuitry or components.
- the example depicted in FIG. 2 may correspond to computing systems such as the THINKPAD series of personal computers sold by Lenovo (US) Inc. of Morrisville, N.C., or other devices.
- embodiments may include other features or only some of the features of the example illustrated in FIG. 2 .
- FIG. 2 includes a so-called chipset 210 (a group of integrated circuits, or chips, that work together) with an architecture that may vary depending on manufacturer (for example, INTEL, AMD, ARM, etc.).
- INTEL is a registered trademark of Intel Corporation in the United States and other countries.
- AMD is a registered trademark of Advanced Micro Devices, Inc. in the United States and other countries.
- ARM is an unregistered trademark of ARM Holdings plc in the United States and other countries.
- the architecture of the chipset 210 includes a core and memory control group 220 and an I/O controller hub 250 that exchanges information (for example, data, signals, commands, etc.) via a direct management interface (DMI) 242 or a link controller 244 .
- the DMI 242 is a chip-to-chip interface (sometimes referred to as being a link between a “northbridge” and a “southbridge”).
- the core and memory control group 220 include one or more processors 222 (for example, single or multi-core) and a memory controller hub 226 that exchange information via a front side bus (FSB) 224 ; noting that components of the group 220 may be integrated in a chip that supplants the conventional “northbridge” style architecture.
- processors 222 comprise internal arithmetic units, registers, cache memory, busses, I/O ports, etc., as is well known in the art.
- the memory controller hub 226 interfaces with memory 240 (for example, to provide support for a type of RAM that may be referred to as “system memory” or “memory”).
- the memory controller hub 226 further includes a low voltage differential signaling (LVDS) interface 232 for a display device 292 (for example, a CRT, a flat panel, touch screen, etc.).
- a block 238 includes some technologies that may be supported via the LVDS interface 232 (for example, serial digital video, HDMI/DVI, display port).
- the memory controller hub 226 also includes a PCI-express interface (PCI-E) 234 that may support discrete graphics 236 .
- the I/O hub controller 250 includes a SATA interface 251 (for example, for HDDs, SDDs, etc., 280), a PCI-E interface 252 (for example, for wireless connections 282), a USB interface 253 (for example, for devices 284 such as a digitizer, keyboard, mice, cameras, phones, microphones, storage, other connected devices, etc.), a network interface 254 (for example, LAN), a GPIO interface 255, a LPC interface 270 (for ASICs 271, a TPM 272, a super I/O 273, a firmware hub 274, BIOS support 275 as well as various types of memory 276 such as ROM 277, Flash 278, and NVRAM 279), a power management interface 261, a clock generator interface 262, an audio interface 263 (for example, for speakers 294), a TCO interface 264, a system management bus interface 265, and SPI Flash 266, which can include BIOS 268 and boot code 290.
- the system upon power on, may be configured to execute boot code 290 for the BIOS 268 , as stored within the SPI Flash 266 , and thereafter processes data under the control of one or more operating systems and application software (for example, stored in system memory 240 ).
- An operating system may be stored in any of a variety of locations and accessed, for example, according to instructions of the BIOS 268 .
- a device may include fewer or more features than shown in the system of FIG. 2 .
- Information handling device circuitry may be used in devices such as laptops, televisions, personal computer devices generally, and/or other electronic devices that may utilize two or more operatively coupled display screens.
- the circuitry outlined in FIG. 1 may be implemented in a laptop embodiment, whereas the circuitry outlined in FIG. 2 may be implemented in a personal computer system.
- an embodiment may detect the position of an application window on a display device in a system and thereafter output an audio stream from the audio output device associated with the display device displaying the application window.
- an embodiment may detect a position of an application window on a display device in a multi-display system.
- a multi-display system may correspond to a system in which two or more display devices are utilized to convey information to a user.
- the two or more display devices may be integrally or operatively coupled.
- a multi-display system may correspond to a two or three monitor system, such as illustrated in FIG. 5.
- a multi-display system may correspond to two or more wirelessly connected devices (e.g., a laptop and a smart TV, etc.) that may not necessarily be positioned proximate to each other but may nevertheless allow for seamless information transfer between devices (e.g., by beaming an application window from the laptop to the smart TV, etc.).
- the detection of the position of an application window on a display device may correspond to detecting a particular display screen, or screens, on which the application window is being displayed. For example, an embodiment may conclude that an application window is displayed on a display device if the entirety of the application window is on the display screen of the display device. Stated differently, the foregoing conclusion may be made if the borders of the application window are housed within the borders of the display screen. Alternatively, situations may arise where a portion of an application window may be displayed on one display screen and another portion of the application window may be simultaneously displayed on another display screen. In these situations, an embodiment may conclude that an application window is displayed on a display device if a majority of the application window is displayed on the display screen of the display device. For example, if an application window is simultaneously displayed across two screens, A and B, an embodiment may conclude that the application window is displayed on Screen A responsive to identifying that Screen A comprises a greater percentage of the application window than Screen B.
- a system may utilize coordinate pairs for detection of the position of the application window. More particularly, each display device in the multi-display system may represent a portion of a larger two-dimensional coordinate plane, or virtual grid (“grid”), the entirety of which spans across all of the connected display devices.
- the grid allows an embodiment to objectively identify the position of displayed contents, in terms of X and Y coordinate pairs, at any position in the grid.
- the grid may not be visible to a user and the grid values for each display device may be generated and/or adjusted automatically as display devices are added and/or subtracted from the system.
- an illustration of a three monitor setup 50 is provided. Solely for description purposes, the boundary defining coordinate pairs for each display device ( 51 , 52 , and 53 ), and an application window 54 displayed on the first display device 51 , are presented. As can be seen in the figure, an origin point 55 (i.e., the coordinate pair (0, 0)) may exist at the upper left edge of the second display device 52 . This origin point 55 may be an origin for the entirety of the grid spanning across the first 51 , the second 52 , and the third 53 display devices. Stated differently, only a single origin point may exist across all of the connected display devices from which coordinate pairs for any point in the grid may be generated. Accordingly, the set of coordinate pairs defining the position of the application window 54 may be particular to a single location on the grid. In this way, the specific position for any displayed application window in the system may be accurately identified, regardless of the amount of connected display devices.
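Under the FIG. 5 setup described above (three side-by-side panels sharing one virtual grid, with the origin at the upper-left of the second display), the window-to-display mapping can be sketched as follows. The panel resolutions are illustrative assumptions; the majority-overlap rule also subsumes the whole-window case, since a fully contained window trivially has its largest overlap with its host display.

```python
def overlap_area(win, disp):
    """Intersection area of two (left, top, right, bottom) rectangles."""
    w = min(win[2], disp[2]) - max(win[0], disp[0])
    h = min(win[3], disp[3]) - max(win[1], disp[1])
    return max(w, 0) * max(h, 0)

def display_for_window(win, displays):
    """Return the id of the display holding the largest share of the window."""
    return max(displays, key=lambda d: overlap_area(win, displays[d]))

# Three assumed 1920x1080 monitors side by side; origin (0, 0) at the
# top-left of the second (middle) display 52, so display 51 has negative x.
DISPLAYS = {
    51: (-1920, 0, 0, 1080),
    52: (0, 0, 1920, 1080),
    53: (1920, 0, 3840, 1080),
}

window = (-638, 267, -183, 807)  # edge coordinates from the example above
host = display_for_window(window, DISPLAYS)  # display 51
```

The same function handles the later move-detection examples: coordinates of (1300, 267) to (1755, 807) resolve to display 52.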
- an embodiment may determine an audio output device associated with the display device determined to be displaying the application window.
- each display device may have a designated audio output device that provides audio corresponding to the contents displayed on that display device.
- the audio output device designation may be set by a manufacturer and/or adjusted by a user. Knowledge of the audio output designation may be stored at an accessible database (e.g., stored locally on the device or remotely on another device or server, etc.).
- a display device's designated audio output device may be the audio output device that is physically integrated into the display device.
- a display device's designated audio output device may be another, remote audio output device.
- multiple audio output devices may be tasked with simultaneously providing audio output for an application window when the application window is identified as being positioned on a particular display device. For example, in a three monitor setup where each monitor comprises an integrated audio output device, such as illustrated in FIG. 5 , an embodiment may direct the audio to both audio output devices integral to the first and third display device when an application window is identified as being positioned on the second display device.
- an embodiment may, at 303 , direct audio to audio output devices by referring to predetermined settings. Conversely, responsive to determining, at 302 , an audio output device associated with the display device determined to be displaying the application window, an embodiment may, at 304 , direct audio originating from the application window to the designated audio output device. In an embodiment, the directing of the audio may be based on one or more user-defined settings. As an example and with reference to FIG. 5 , if the application window 54 was launched and was automatically opened on the first display device 51 , an embodiment may direct audio originating from the application window 54 to the audio output device associated with the first display device 51 .
- an embodiment may direct audio originating from the application window 54 to the audio output device associated with the second display device 52 .
- an embodiment may direct audio originating from the application window 54 to two or more audio output devices (e.g., an audio output device associated with the first display device 51 and the third display device 53 , etc.).
- steps 401 - 402 are substantially similar to steps 301 - 302 of the foregoing disclosure, these steps are not repeated here but can be found above with reference to the disclosure associated with steps 301 - 302 .
- an embodiment may identify whether the application window has been moved to another display device.
- An application window may be moved between display devices by, for example, manually dragging the application window from one display device to another (e.g., using a mouse, a stylus, etc.).
- the identification that the application window has been moved to another display device may comprise identifying that the coordinate pairs of the application window (e.g., the edge coordinate pairs, etc.) now correspond to coordinate pairs housed within the display screen of another display device. Additionally or alternatively, the identification that the application window has been moved to another display device may comprise identifying that another display device now comprises a majority percentage of the application window.
- the application window 54 may be originally displayed on the first display device 51 at edge coordinates (−638, 267) to (−183, 807).
- An embodiment may be able to identify that the application window 54 has moved to another display device if an embodiment detects that the coordinates for the application window 54 have been adjusted to, for example, (1300, 267) to (1755, 807), which would place the entirety of application window 54 onto a portion of the grid associated with the second display device 52 .
- an embodiment may identify that the application window 54 has not moved to another display device if an embodiment detects that the coordinates for the application window 54 have been adjusted to, for example, (−638, 567) to (−183, 1107), which would place the application window 54 onto another portion of the grid associated with the first display device 51.
- an embodiment may identify that the application window 64 has been moved to another display device from its origin point (i.e., display device 61, where the original application window position is reflected by dashed lines) if an embodiment detects that the majority of the application window 64 is associated with a particular display screen. For instance, if the coordinates for the application window 64 have been adjusted to, for example, (2055, 267) to (2510, 807), the application window 64 may now be partially displayed across two display devices (i.e., the second display device 62 and the third display device 63).
- an embodiment may conclude that the application window 64 has been moved to the third display device 63 . Additionally or alternatively, an embodiment may identify that the application window 64 has been moved to the third display device 63 by identifying that third display device 63 comprises a greater percentage of the application window 64 than the second display device 62 (i.e., the second display device 62 comprises approximately 23% of the application window whereas the third display device 63 comprises approximately 77% of the application window 64 ).
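The percentage split described above can be computed directly from overlap areas. The display widths below are an assumption (2160-pixel-wide panels, origin at display 62's top-left) chosen so the numbers reproduce the approximate 23%/77% split in the example; the patent does not specify panel resolutions.

```python
def overlap_area(win, disp):
    """Intersection area of two (left, top, right, bottom) rectangles."""
    w = min(win[2], disp[2]) - max(win[0], disp[0])
    h = min(win[3], disp[3]) - max(win[1], disp[1])
    return max(w, 0) * max(h, 0)

def shares(win, displays):
    """Fraction of the window's area shown on each display."""
    total = (win[2] - win[0]) * (win[3] - win[1])
    return {d: overlap_area(win, r) / total for d, r in displays.items()}

# Displays 62 and 63, each assumed 2160 pixels wide.
displays = {62: (0, 0, 2160, 1080), 63: (2160, 0, 4320, 1080)}
split = shares((2055, 267, 2510, 807), displays)
# display 62 holds about 23% of the window, display 63 about 77%
```

Whichever display holds the larger share is treated as the window's host for audio-routing purposes.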
- an embodiment may, at 404 , maintain the current audio output designation. Conversely, responsive to identifying, at 403 , that the application window has moved to another display device, an embodiment may, at 405 , switch an output designation of the audio from the audio output device associated with the first display device to another audio output device associated with the second display device.
- the audio output device associated with the second display device may be an audio output device integrated into the second display device, a remote audio output device operatively coupled to the second display, or a plurality of audio output devices (e.g., audio may be directed to audio output devices integrally or operatively coupled to at least two monitors in a dual or multi-monitor setup, etc.).
- the switching of the audio stream may occur automatically and without any manual user input.
- an embodiment may dynamically and seamlessly switch output of the audio to another audio output device associated with the other display device without pausing or interrupting the audio flow.
- an embodiment may refer back to predetermined user settings that identify preferred audio output devices for different applications or application types (e.g., media playing applications, audio communications applications, etc.).
- an embodiment may detect the position of an application window and identify a display device in a multi-display system that corresponds to the position. An embodiment may then direct audio to an integral or operatively coupled audio output device that is associated with the display device displaying the application window. If the application window is moved to another display device, an embodiment may then dynamically adjust an output designation of the audio from the audio output device associated with the first display device to another audio output device associated with the destination display device. Such a method may allow audio to be provided to a user in a more contextually relevant way. Additionally, such a method may eliminate the current requirement for manual adjustment and designation of audio output settings each time an application window is moved between display screens.
- aspects may be embodied as a system, method or device program product. Accordingly, aspects may take the form of an entirely hardware embodiment or an embodiment including software that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a device program product embodied in one or more device readable medium(s) having device readable program code embodied therewith.
- a storage device may be, for example, a system, apparatus, or device (e.g., an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device) or any suitable combination of the foregoing.
- a storage device/medium include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
- a storage device is not a signal and “non-transitory” includes all media except signal media.
- Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, et cetera, or any suitable combination of the foregoing.
- Program code for carrying out operations may be written in any combination of one or more programming languages.
- the program code may execute entirely on a single device, partly on a single device, as a stand-alone software package, partly on single device and partly on another device, or entirely on the other device.
- the devices may be connected through any type of connection or network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made through other devices (for example, through the Internet using an Internet Service Provider), through wireless connections, e.g., near-field communication, or through a hard wire connection, such as over a USB connection.
- Example embodiments are described herein with reference to the figures, which illustrate example methods, devices and program products according to various example embodiments. It will be understood that the actions and functionality may be implemented at least in part by program instructions. These program instructions may be provided to a processor of a device, a special purpose information handling device, or other programmable data processing device to produce a machine, such that the instructions, which execute via a processor of the device implement the functions/acts specified.
Claims (16)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/369,468 | 2019-03-29 | 2019-03-29 | Audio output device selection |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200310743A1 (en) | 2020-10-01 |
US11126396B2 (en) | 2021-09-21 |
Family
ID=72607617
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/369,468 | Audio output device selection | 2019-03-29 | 2019-03-29 |
Country Status (1)
Country | Link |
---|---|
US (1) | US11126396B2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102291021B1 (en) * | 2017-03-17 | 2021-08-18 | 삼성전자주식회사 | Electronic device for controlling audio output and operating method thereof |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090140986A1 (en) * | 2007-11-30 | 2009-06-04 | Nokia Corporation | Method, apparatus and computer program product for transferring files between devices via drag and drop |
US20100313143A1 (en) * | 2009-06-09 | 2010-12-09 | Samsung Electronics Co., Ltd. | Method for transmitting content with intuitively displaying content transmission direction and device using the same |
US20120038541A1 (en) * | 2010-08-13 | 2012-02-16 | Lg Electronics Inc. | Mobile terminal, display device and controlling method thereof |
US20120131458A1 (en) * | 2010-11-19 | 2012-05-24 | Tivo Inc. | Flick to Send or Display Content |
US20130194270A1 (en) * | 2012-01-31 | 2013-08-01 | Jeffrey Joel Walls | Remote Graphics Corresponding to Region |
US20140258880A1 (en) * | 2013-03-07 | 2014-09-11 | Nokia Corporation | Method and apparatus for gesture-based interaction with devices and transferring of contents |
US20150032809A1 (en) * | 2013-07-26 | 2015-01-29 | Cisco Technology, Inc. | Conference Session Handoff Between Devices |
US20170061628A1 (en) * | 2015-09-01 | 2017-03-02 | Electronics And Telecommunications Research Institute | Screen position sensing method in multi display system, content configuring method, watermark image generating method for sensing screen position server, and display terminal |
US20170205980A1 (en) * | 2016-01-18 | 2017-07-20 | Microsoft Technology Licensing, Llc | Method and an apparatus for providing a multitasking view |
US20180374493A1 (en) * | 2016-03-04 | 2018-12-27 | Yamaha Corporation | System, control method, and control terminal |
- 2019-03-29: US application US16/369,468 filed; granted as US11126396B2 (en), status Active
Similar Documents
Publication | Title
---|---
US11036260B2 (en) | Keyboard attachment to foldable device
US9813662B2 (en) | Transfer to target disambiguation
US20150212699A1 (en) | Handedness for hand-held devices
US20150363008A1 (en) | Displaying a user input modality
US10296279B2 (en) | Displaying images across multiple displays
US11126396B2 (en) | Audio output device selection
US10416759B2 (en) | Eye tracking laser pointer
US11825164B2 (en) | Media playback device selection
US20180181289A1 (en) | Sizing applications based on display parameters
US10764511B1 (en) | Image version selection based on device orientation
US11237641B2 (en) | Palm based object position adjustment
US10298769B2 (en) | Call transfer between devices
US11886888B2 (en) | Reduced application view during loading
US11126479B2 (en) | Disturbance setting adjustment
US20220050149A1 (en) | Command provision via magnetic field variation
US11017746B2 (en) | Auxiliary display scaling factor
US20230195233A1 (en) | Fixed user interface navigation
US11698767B1 (en) | Designated share window for sharing content data
US10853924B2 (en) | Offset camera lens
US20230185368A1 (en) | Gazed based cursor adjustment
US20230195687A1 (en) | System setting adjustment based on location
US10547939B1 (en) | Pickup range control
US10546428B2 (en) | Augmented reality aspect indication for electronic device
US20230199383A1 (en) | Microphone setting adjustment based on user location
US20220171530A1 (en) | Displaying a user input modality
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner: LENOVO (SINGAPORE) PTE. LTD., SINGAPORE. Assignors: NICHOLSON, JOHN WELDON; CROMER, DARYL; LOCKER, HOWARD. Reel/frame: 048740/0128. Effective date: 2019-03-29
FEPP | Fee payment procedure | Entity status set to undiscounted (original event code: BIG.); entity status of patent owner: large entity
STPP | Patent application and granting procedure in general | Non-final action mailed
STPP | Patent application and granting procedure in general | Response to non-final office action entered and forwarded to examiner
STPP | Patent application and granting procedure in general | Notice of allowance mailed; application received in Office of Publications
STPP | Patent application and granting procedure in general | Issue fee payment verified
STCF | Patent grant | Patented case
AS | Assignment | Owner: LENOVO PC INTERNATIONAL LIMITED, HONG KONG. Assignor: LENOVO (SINGAPORE) PTE LTD. Reel/frame: 060638/0044. Effective date: 2013-04-01