WO2013109662A1 - Touch mode and input type recognition - Google Patents
Touch mode and input type recognition
- Publication number
- WO2013109662A1 (PCT/US2013/021793)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- input
- display
- touch input
- input mode
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0227—Cooperation and interconnection of the input arrangement with other functional units of a computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
Definitions
- touch input e.g., finger, hand, pen
- hardware based input e.g., mouse, pen, trackball
- a touch input mode may be entered and exited automatically and/or manually.
- When the touch input mode is entered, user interface (UI) elements are optimized for touch input.
- When the touch input mode is exited, the user interface (UI) elements are optimized for hardware based input.
- a user may enter the touch input mode by manually selecting a user interface element and/or by entering touch input.
- Settings may be configured that specify conditions upon which the touch input mode is entered/exited.
- the touch input mode may be configured to be automatically entered upon undocking a computing device, receiving touch input when in the hardware based input mode, and the like.
- the touch input mode may be configured to be automatically exited upon docking a computing device, receiving hardware based input when in the touch input mode, and the like.
- FIGURE 1 illustrates an exemplary computing environment
- FIGURE 2 illustrates an exemplary system for changing an input mode
- FIGURE 3 shows an illustrative process for switching modes between a touch input mode and a hardware based input mode
- FIGURE 4 illustrates a diagram showing different input that may affect a determination of an input mode
- FIGURE 5 shows a system architecture used in determining an input mode
- FIGURE 6 shows an exemplary UI for selecting an input mode
- FIGURE 7 shows UI elements sized for hardware based input and UI elements sized for touch input
- FIGURE 8 illustrates an exemplary sizing table that may be used in determining a size of UI elements.
- FIGURE 1 and the corresponding discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.
- program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types.
- Other computer system configurations may also be used, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
- Distributed computing environments may also be used where tasks are performed by remote processing devices that are linked through a communications network.
- program modules may be located in both local and remote memory storage devices.
- Referring now to FIGURE 1, an illustrative computer environment for a computer 100 utilized in the various embodiments will be described.
- the computer environment shown in FIGURE 1 includes computing devices that each may be configured as a mobile computing device (e.g., phone, tablet, netbook, laptop), a server, a desktop, or some other type of computing device, and includes a central processing unit 5 ("CPU"), a system memory 7, including a random access memory 9 ("RAM") and a read-only memory ("ROM") 10, and a system bus 12 that couples the memory to the CPU 5.
- the computer 100 further includes a mass storage device 14 for storing an operating system 16, application(s) 24 (e.g., productivity application, Web Browser, and the like), program modules 25 and UI manager 26 which will be described in greater detail below.
- the mass storage device 14 is connected to the CPU 5 through a mass storage controller (not shown) connected to the bus 12.
- the mass storage device 14 and its associated computer-readable media provide non-volatile storage for the computer 100.
- computer-readable media can be any available media that can be accessed by the computer 100.
- Computer-readable media may comprise computer storage media and communication media.
- Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
- Computer storage media includes, but is not limited to, RAM, ROM, Erasable Programmable Read Only Memory (“EPROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 100.
- Computer 100 operates in a networked environment using logical connections to remote computers through a network 18, such as the Internet.
- the computer 100 may connect to the network 18 through a network interface unit 20 connected to the bus 12.
- the network connection may be wireless and/or wired.
- the network interface unit 20 may also be utilized to connect to other types of networks and remote computer systems.
- the computer 100 may also include an input/output controller 22 for receiving and processing input from a number of other devices, including a keyboard, mouse, a touch input device, or electronic stylus (not shown in FIGURE 1). Similarly, an input/output controller 22 may provide input/output to a display screen 23, a printer, or other type of output device.
- a touch input device may utilize any technology that allows single/multi-touch input to be recognized (touching/non-touching).
- the technologies may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optic capture, tuned electromagnetic induction, ultrasonic receivers, transducer microphones, laser rangefinders, shadow capture, and the like.
- the touch input device may be configured to detect near-touches (i.e. within some distance of the touch input device but not physically touching the touch input device).
- the touch input device may also act as a display.
- the input/output controller 22 may also provide output to one or more display screens 23, a printer, or other type of input/output device.
- a camera and/or some other sensing device may be operative to record one or more users and capture motions and/or gestures made by users of a computing device. The sensing device may be further operative to capture spoken words, such as by a microphone.
- the sensing device may comprise any motion detection device capable of detecting the movement of a user.
- a camera may comprise a MICROSOFT KINECT® motion capture device comprising a plurality of cameras and a plurality of microphones.
- Embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components/processes illustrated in the FIGURES may be integrated onto a single integrated circuit.
- Such a SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or "burned") onto the chip substrate as a single integrated circuit.
- all/some of the functionality, described herein may be integrated with other components of the computing device/system 100 on the single integrated circuit (chip).
- a number of program modules and data files may be stored in the mass storage device 14 and RAM 9 of the computer 100, including an operating system 16 suitable for controlling the operation of a computer, such as the WINDOWS 8®, WINDOWS PHONE 7®, WINDOWS 7®, or WINDOWS SERVER® operating system from MICROSOFT CORPORATION of Redmond, Washington.
- the mass storage device 14 and RAM 9 may also store one or more program modules.
- the mass storage device 14 and the RAM 9 may store one or more application programs, such as a spreadsheet application, word processing application and/or other applications.
- the MICROSOFT OFFICE suite of applications is included.
- the application(s) may be client based and/or web based.
- a network service 27 may be used, such as: MICROSOFT WINDOWS LIVE, MICROSOFT OFFICE 365 or some other network based service.
- UI manager 26 is configured to change between an input mode that includes a touch input mode and a hardware based input mode.
- the input mode may be entered and exited automatically and/or manually.
- When the touch input mode is entered, user interface (UI) elements are optimized for touch input.
- When the touch input mode is exited, the user interface (UI) elements are optimized for hardware based input.
- a user may enter the touch input mode by manually selecting a user interface element and/or by entering touch input.
- Settings may be configured that specify conditions upon which the touch input mode is entered/exited.
- the touch input mode may be configured to be automatically entered upon undocking a computing device, receiving touch input when in the hardware based input mode, and the like.
- the touch input mode may be configured to be automatically exited upon docking a computing device, receiving hardware based input when in the touch input mode, and the like.
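The configurable enter/exit conditions described above can be sketched as a small state machine. This is an illustrative sketch only, not the patent's implementation; the `InputModeManager` class, the settings keys, and the event names are assumptions chosen for clarity.

```python
# Sketch of settings-driven switching between a touch input mode and a
# hardware based input mode. Settings specify which events enter/exit
# the touch input mode (e.g., dock/undock, touch or mouse input).
TOUCH, HARDWARE = "touch", "hardware"

class InputModeManager:
    def __init__(self, settings=None):
        # Conditions upon which the touch input mode is entered/exited
        # (names are hypothetical).
        self.settings = settings or {
            "enter_on": {"undock", "touch_input"},
            "exit_on": {"dock", "mouse_input"},
        }
        self.mode = HARDWARE  # assumed default for a docked device

    def handle_event(self, event):
        if self.mode == HARDWARE and event in self.settings["enter_on"]:
            self.mode = TOUCH   # e.g., device undocked -> touch input mode
        elif self.mode == TOUCH and event in self.settings["exit_on"]:
            self.mode = HARDWARE  # e.g., device docked -> hardware based input
        return self.mode

mgr = InputModeManager()
mgr.handle_event("undock")   # undocking enters touch input mode
mgr.handle_event("dock")     # docking exits back to hardware based input
```

A user-supplied `settings` dictionary would let the conditions be reconfigured, mirroring the configurable settings the text describes.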
- the user interface elements (e.g., UI 28) that are displayed are based on the input mode. For example, a user may sometimes interact with application 24 using touch input and in other situations use hardware based input to interact with the application.
- UI manager 26 displays a user interface element optimized for touch input. For example, touch UI elements may be displayed: using formatting configured for touch input (e.g., changing a size, spacing); using a layout configured for touch input; displaying more/fewer options;
- the UI manager 26 displays UI elements for the application that are optimized for the hardware based input. For example, formatting configured for hardware based input may be used (e.g., hover based input may be used, text may be displayed smaller), more/fewer options displayed, and the like.
- UI manager 26 may be located externally from an application, e.g., a productivity application or some other application, as shown or may be a part of an application.
- UI manager 26 may be located internally/externally from an application. More details regarding the UI manager are disclosed below.
- FIGURE 2 illustrates an exemplary system for changing an input mode.
- system 200 includes service 210, UI manager 240, store 245, device 250 (e.g., desktop computer, tablet) and smart phone 230.
- service 210 is a cloud based and/or enterprise based service that may be configured to provide productivity services (e.g., MICROSOFT OFFICE 365 or some other cloud based/online service that is used to interact with items).
- service 210 is a multi-tenant service that provides resources 215 and services to any number of tenants (e.g., Tenants 1-N).
- System 200 as illustrated comprises a touch screen input device/smart phone 230 that detects when a touch input has been received (e.g., a finger touching or nearly touching the touch screen) and device 250 that may support touch input and/or hardware based input such as a mouse, keyboard, and the like.
- device 250 is a computing device that includes a touch screen that may be attached/detached to keyboard 252, mouse 254 and/or other hardware based input devices.
- touch screen may be utilized that detects a user's touch input.
- the touch screen may include one or more layers of capacitive material that detects the touch input.
- Other sensors may be used in addition to or in place of the capacitive material.
- the touch screen is configured to detect objects that are in contact with or above a touchable surface.
- Although the term “above” is used in this description, it should be understood that the orientation of the touch panel system is irrelevant; the term “above” is intended to be applicable to all such orientations.
- the touch screen may be configured to determine locations of where touch input is received (e.g., a starting point, intermediate points and an ending point).
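Determining the locations of touch input as a starting point, intermediate points, and an ending point can be illustrated with a minimal stroke recorder. The `TouchStroke` class and its method names are assumptions for illustration, not an API from the patent.

```python
# Illustrative sketch: accumulate a touch stroke as a starting point,
# intermediate points, and an ending point, as described above.
class TouchStroke:
    def __init__(self, start):
        self.points = [start]      # (x, y) starting point
        self.finished = False

    def move(self, point):
        self.points.append(point)  # intermediate points

    def end(self, point):
        self.points.append(point)  # ending point
        self.finished = True

    @property
    def start(self):
        return self.points[0]

    @property
    def endpoint(self):
        return self.points[-1]

stroke = TouchStroke((10, 10))
stroke.move((15, 12))
stroke.move((22, 18))
stroke.end((30, 25))
# stroke.start == (10, 10); stroke.endpoint == (30, 25); 4 points total
```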
- Actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, by a vibration sensor or microphone coupled to the touch panel.
- Other sensors that may be used to detect contact include pressure-based mechanisms, micro-machined accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.
- Content (e.g., documents, files, UI definitions) may be stored on a device (e.g., smart phone 230, device 250) and/or at some other location (e.g., network store 245).
- touch screen input device/smart phone 230 shows an exemplary display 232 of a menu including UI elements configured for touch input.
- Device 250 shows a display 262 of a menu including UI elements configured for hardware based input and display 232 of a menu including UI elements configured for touch input when a user is using touch input to interact with device 250.
- display 232 and display 262 are shown at the same time. In operation, one of the menus is displayed based on the input being received.
- UI manager 240 is configured to display differently configured user interface elements for an application based on whether an input mode is set to touch input or the input mode is set to a hardware based input mode.
- a user may switch between a docking mode and an undocked mode.
- hardware based input may be used to interact with device 250 since keyboard 252 and mouse 254 are coupled to computing device 250.
- touch input may be used to interact with device 250.
- a user may also switch between the touch input mode and the hardware based input mode when device 250 is in the docked mode.
- UI manager 240 is configured to determine the input mode (touch/hardware) and to display the UI elements for touch when the user is interacting in the touch mode and to display the UI elements for hardware based input when the user is interacting using the hardware based input mode.
- the UI manager 240 may be part of the application the user is interacting with and/or separate from the application.
- the input mode may be switched automatically/manually. For example, a user may select a UI element (e.g., UI 241) to enter/exit touch mode.
- the UI element 241 is not displayed.
- a setting on the device (e.g., a flag, property, ...) may be set to indicate a type of the device that may be used to determine when the device supports touch input. This setting may also be used to determine a default mode for the device (e.g., when the device is a tablet/slate device then the default mode may be set to the touch input mode).
- UI manager 240 displays the UI elements that are optimized for touch input.
- the input mode may be switched automatically in response to a type of detected input.
- UI manager 240 may switch from the hardware based input mode to touch input mode when touch input is received (e.g., a user's finger, hand) and may switch from the touch input mode to the hardware based input mode when a hardware based input, such as mouse input, docking event, is received.
- According to one embodiment, UI manager 240 disregards keyboard input and does not change the input mode from the touch input mode to a hardware based input mode in response to receiving keyboard input.
- According to another embodiment, UI manager 240 changes the input mode from the touch input mode to a hardware based input mode in response to receiving keyboard input.
- a user may disable the automatic switching of the modes. For example, a user may select a UI element to enable/disable the automatic switching of the input mode.
- UI manager may automatically switch the computing device to touch input mode since device 250 is no longer docked to the keyboard and mouse.
- UI manager 240 displays UI elements for the application that are adjusted for receiving the touch input. For example, menus (e.g., a ribbon), icons, and the like are sized larger as compared to when using hardware based input such that the UI elements are more touchable (e.g., can be selected more easily).
- UI elements may be displayed with more spacing, options in the menu may have their style changed, and some applications may adjust the layout of touch UI elements.
- the menu items displayed when using hardware based input are sized smaller and arranged horizontally as compared to touch based UI elements 232 that are sized larger and are spaced farther apart. Additional information may also be displayed next to the icon when in touch mode (e.g., 232) as compared to when receiving input using hardware based input. For example, when in hardware based input mode, hovering over an icon may display a "tooltip” that provides additional information about the UI element that is currently being hovered over. When in touch mode, the "tooltips" (e.g., "Keep Source Formatting", “Merge Formatting", and "Values Only”) are displayed along with the display of the icon.
- the user may manually turn off the touch input mode and/or touch input mode may be automatically switched to the hardware based input mode.
- the UI elements change in response to a last input method by a user.
- a last input type flag may be used to store the last input received.
- the input may be touch input or hardware based input.
- the touch input may be a user's finger(s) or hand(s) and the hardware based input is a hardware device used for input, such as a mouse, trackball, pen, and the like.
- a pen is considered a touch input instead of a hardware based input (as configured by default).
- When a user clicks with a mouse, the last input type flag is set to "hardware," and when the user taps with a finger, the last input type flag is set to "touch." While an application is running, different pieces of UI adjust as they are triggered based on the value of the last input type flag. The value of the last input type flag may also be queried by one or more different applications. The application(s) may use this information to determine when to display UI elements configured for touch and when to display UI elements configured for hardware based input.
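The last-input-type flag described above can be sketched as follows. The function names and event labels are illustrative assumptions; the sketch follows the embodiment in which keyboard input is disregarded and a pen is treated as touch input.

```python
# Sketch of the "last input type" flag: mouse/trackball input sets it to
# "hardware", finger/hand/pen input sets it to "touch", and keyboard
# input is disregarded (per the embodiment described in the text).
last_input_type = "hardware"   # assumed initial value

def on_input(event_type):
    """Update the last input type flag based on the input received."""
    global last_input_type
    if event_type in ("mouse", "trackball"):
        last_input_type = "hardware"
    elif event_type in ("finger", "hand", "pen"):  # pen is touch by default
        last_input_type = "touch"
    # keyboard input falls through: the flag is left unchanged
    return last_input_type

def query_last_input_type():
    """Applications may query the flag to choose touch- or hardware-optimized UI."""
    return last_input_type

on_input("finger")      # flag becomes "touch"
on_input("keyboard")    # disregarded, flag stays "touch"
on_input("mouse")       # flag becomes "hardware"
```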
- FIGURE 3 shows an illustrative process for switching modes between a touch input mode and a hardware based input mode.
- process 300 moves to operation 310, where a user accesses an application.
- the application may be an operating environment, a client based application, a web based application, a hybrid application that uses both client
- the application may include any functionality that may be accessed using touch input and hardware based input.
- the input may be touch input or hardware based input.
- the touch input may be a user's finger(s) or hand(s).
- touch input may be defined to include one or more hardware input devices, such as a pen.
- the input may also be a selection of a UI element to change the input mode and/or to enable/disable automatic switching of modes.
- the input mode may be selected/toggled between a touch input mode and a hardware based input mode.
- the determination may be made automatically/manually. For example, when the computing device is initially docked, the determination may be initially set to hardware based input. When a touch device is undocked, the determination may be initially set to touch input. A user may also manually set the mode to touch based and/or hardware based input by selection of one or more UI elements.
- the input mode may also be set based on a last input method. For example, if a user touches the display, the mode may be switched to touch input until a hardware based input is received.
- the type of input last received is stored. For example, when the last input is hardware based (e.g., mouse, trackball) then the last input received is set to a hardware event and when the last input is a touch input, the last input received is set to a touch event. According to an embodiment, the type of last input disregards keyboard input.
- the input mode is changed in response to the determination to change the mode.
- the input mode may be changed from the touch input mode to the hardware based input mode or from the hardware based input mode to the touch input mode.
- the UI elements that are configured for the input mode are displayed.
- the configuration of the UI elements may include adjusting one or more of: a spacing of elements, a size of the elements/text, options displayed, and associating hardware based input methods (e.g., hover) with touch based input displays.
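One way such per-mode adjustments could be expressed is shown below. This is a hypothetical sketch; the function name, the scaling factors, and the `hover_enabled` field are illustrative assumptions, not values from the patent.

```python
# Sketch of configuring a UI element for the active input mode: touch mode
# enlarges the element and increases spacing so it is more "touchable";
# hardware mode keeps it compact and associates hover-based behavior.
def configure_element(base_size_px, base_spacing_px, mode):
    if mode == "touch":
        return {
            "size": int(base_size_px * 1.5),  # larger touch target (factor assumed)
            "spacing": base_spacing_px * 2,   # elements spaced farther apart
            "hover_enabled": False,           # no hover with finger input
        }
    return {
        "size": base_size_px,
        "spacing": base_spacing_px,
        "hover_enabled": True,                # e.g., tooltips shown on hover
    }

touch_cfg = configure_element(24, 4, "touch")     # enlarged and spaced out
hw_cfg = configure_element(24, 4, "hardware")     # compact, hover-capable
```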
- FIGURE 4 illustrates a diagram showing different input that may affect a determination of an input mode.
- the input mode may be changed in response to detection of touch input (410), hardware based input (420), a docking/undocking (440) of a computing device and/or a selection of a UI element (430).
- FIGURE 5 shows a system architecture used in determining an input mode, as described herein.
- Content used and displayed by the application (e.g., application 1020) and the UI manager 26 may be stored at different locations.
- application 1020 may use/store data using directory services 1022, web portals 1024, mailbox services 1026, instant messaging stores 1028 and social networking sites 1030.
- the application 1020 may use any of these types of systems or the like.
- a server 1032 may be used to access sources and to prepare and display electronic items.
- server 1032 may access UI elements for application 1020 to display at a client (e.g., a browser or some other window).
- server 1032 may be a web server configured to provide productivity services (e.g., word processing, spreadsheet, presentation, and the like) to one or more users. Server 1032 may use the web to interact with clients through a network 1008. Server 1032 may also comprise an application program. Examples of clients that may interact with server 1032 and a spreadsheet application include computing device 1002, which may include any general purpose personal computer, a tablet computing device 1004 and/or mobile computing device 1006 which may include smart phones. Any of these devices may obtain content from the store 1016.
- FIGURES 6-7 illustrate exemplary displays showing user interface elements configured for touch and hardware based input.
- FIGURES 6-7 are for exemplary purpose and are not intended to be limiting.
- FIGURE 6 shows an exemplary UI for selecting an input mode.
- display 610 shows selection of a UI element to change the input mode. For example, a user may select UI element 615 to toggle between input modes.
- the displays may be associated with a desktop application, a mobile application and/or a web-based application (e.g., displayed by a browser).
- the display may be displayed on a limited display device (e.g., smart phone, tablet) or on a larger screen device.
- Display 620 shows menu options for configuring whether or not to display UI element 615.
- FIGURE 7 shows UI elements sized for hardware based input and UI elements sized for touch input.
- Hardware based input UI elements (e.g., 710, 720) are shown alongside corresponding touch input UI elements (e.g., 715, 725).
- Display 730 shows selection of touch based UI element 725.
- the spacing of the menu options in display 730 is farther apart as compared to a corresponding hardware based input menu.
- FIGURE 8 illustrates an exemplary sizing table that may be used in determining a size of UI elements.
- Table 800 shows exemplary selections for setting a size of UI elements that are configured for touch. According to an embodiment, a target size of 9mm is selected with a minimum size of 6.5mm. Other target sizes may be selected.
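Applying physical target sizes like those in table 800 requires converting millimeters to pixels using the display's pixel density. A minimal sketch, assuming the DPI is known (the helper name, the 96 DPI example, and rounding to the nearest pixel are assumptions for illustration):

```python
# Convert the table's physical sizes (9 mm target, 6.5 mm minimum) to pixels.
MM_PER_INCH = 25.4

def mm_to_px(mm, dpi):
    """Pixels covering `mm` millimeters on a display of the given DPI."""
    return round(mm * dpi / MM_PER_INCH)

dpi = 96  # assumed density of a typical desktop display
target_px = mm_to_px(9.0, dpi)    # 9 mm touch target
minimum_px = mm_to_px(6.5, dpi)   # 6.5 mm minimum size
```

On a higher-density display the same physical target naturally maps to more pixels, which is why sizing by millimeters rather than pixels keeps touch targets consistently selectable.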
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
According to the invention, a touch input mode may be entered and exited automatically and/or manually. When the touch input mode is entered, user interface elements are optimized for touch input. When the touch input mode is exited, the user interface elements are optimized for hardware based input. A user may enter the touch input mode by manually selecting a user interface element and/or by entering touch input. Settings may be configured that specify conditions upon which the touch input mode is entered/exited. For example, the touch input mode may be configured to be automatically entered upon undocking a computing device, or upon receiving touch input when in the hardware based input mode, and the like. Similarly, the touch input mode may be configured to be automatically exited upon docking a computing device, or upon receiving hardware based input when in the touch input mode, and the like.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/355,208 | 2012-01-20 | ||
US13/355,208 US9928562B2 (en) | 2012-01-20 | 2012-01-20 | Touch mode and input type recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013109662A1 (fr) | 2013-07-25 |
Family
ID=48796815
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2013/021793 WO2013109662A1 (fr) | 2013-01-17 | Touch mode and input type recognition |
Country Status (2)
Country | Link |
---|---|
US (3) | US9928562B2 (fr) |
WO (1) | WO2013109662A1 (fr) |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8788977B2 (en) | 2008-11-20 | 2014-07-22 | Amazon Technologies, Inc. | Movement recognition as input mechanism |
US9928562B2 (en) | 2012-01-20 | 2018-03-27 | Microsoft Technology Licensing, Llc | Touch mode and input type recognition |
US9547375B2 (en) | 2012-10-10 | 2017-01-17 | Microsoft Technology Licensing, Llc | Split virtual keyboard on a mobile computing device |
JP5875510B2 (ja) * | 2012-12-10 | 2016-03-02 | 株式会社ソニー・コンピュータエンタテインメント | Electronic device and menu display method |
US9035874B1 (en) | 2013-03-08 | 2015-05-19 | Amazon Technologies, Inc. | Providing user input to a computing device with an eye closure |
US9785240B2 (en) * | 2013-03-18 | 2017-10-10 | Fuji Xerox Co., Ltd. | Systems and methods for content-aware selection |
US9832452B1 (en) | 2013-08-12 | 2017-11-28 | Amazon Technologies, Inc. | Robust user detection and tracking |
JPWO2015029222A1 (ja) * | 2013-08-30 | 2017-03-02 | 富士通株式会社 | Information processing apparatus, display control program, and display control method |
US11199906B1 (en) * | 2013-09-04 | 2021-12-14 | Amazon Technologies, Inc. | Global user input management |
US10715611B2 (en) * | 2013-09-06 | 2020-07-14 | Adobe Inc. | Device context-based user interface |
WO2015112179A1 (fr) | 2014-01-27 | 2015-07-30 | Hewlett-Packard Development Company, L.P. | Printer interface selection and control |
KR20150101703A (ko) * | 2014-02-27 | 2015-09-04 | Samsung Electronics Co., Ltd. | Display apparatus and gesture input processing method |
JP6360390B2 (ja) * | 2014-08-26 | 2018-07-18 | Nintendo Co., Ltd. | Information processing program, information processing apparatus, information processing system, and information processing method |
US10048856B2 (en) | 2014-12-30 | 2018-08-14 | Microsoft Technology Licensing, Llc | Configuring a user interface based on an experience mode transition |
CN105159559A (zh) * | 2015-08-28 | 2015-12-16 | Xiaomi Inc. | Mobile terminal control method and mobile terminal |
US10474356B2 (en) | 2016-08-04 | 2019-11-12 | International Business Machines Corporation | Virtual keyboard improvement |
JP6517179B2 (ja) * | 2016-11-15 | 2019-05-22 | Kyocera Corporation | Electronic device, program, and control method |
US10254871B2 (en) | 2017-04-10 | 2019-04-09 | Google Llc | Using pressure sensor input to selectively route user inputs |
US10739984B1 (en) * | 2017-07-31 | 2020-08-11 | Amazon Technologies, Inc. | System for detection of input device |
US10976919B2 (en) * | 2017-09-14 | 2021-04-13 | Sap Se | Hybrid gestures for visualizations |
CN109683783A (zh) * | 2018-12-29 | 2019-04-26 | Lenovo (Beijing) Co., Ltd. | Information processing method and electronic device |
CN109969104A (zh) * | 2019-03-13 | 2019-07-05 | Baidu Online Network Technology (Beijing) Co., Ltd. | Control mode switching method and apparatus |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040150668A1 (en) * | 2003-01-31 | 2004-08-05 | Xerox Corporation | Secondary touch contextual sub-menu navigation for touch screen interface |
US6938216B1 (en) * | 1999-02-12 | 2005-08-30 | Fujitsu Limited | Menu system requiring reduced user manipulation of an input device |
US20100105443A1 (en) * | 2008-10-27 | 2010-04-29 | Nokia Corporation | Methods and apparatuses for facilitating interaction with touch screen apparatuses |
US20100251112A1 (en) * | 2009-03-24 | 2010-09-30 | Microsoft Corporation | Bimodal touch sensitive digital notebook |
US20110050594A1 (en) * | 2009-09-02 | 2011-03-03 | Kim John T | Touch-Screen User Interface |
Family Cites Families (110)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3577957A (en) | 1969-09-09 | 1971-05-11 | Henry Sandig | Animal feeder having adjustable timing means |
US5819055A (en) | 1994-12-13 | 1998-10-06 | Microsoft Corporation | Method and apparatus for docking re-sizeable interface boxes |
US5644737A (en) | 1995-06-06 | 1997-07-01 | Microsoft Corporation | Method and system for stacking toolbars in a computer display |
US6493006B1 (en) | 1996-05-10 | 2002-12-10 | Apple Computer, Inc. | Graphical user interface having contextual menus |
JP3998376B2 (ja) | 1999-09-10 | 2007-10-24 | Fujitsu Limited | Input processing method and input processing apparatus for implementing the same |
US6664991B1 (en) | 2000-01-06 | 2003-12-16 | Microsoft Corporation | Method and apparatus for providing context menus on a pen-based device |
US7450114B2 (en) | 2000-04-14 | 2008-11-11 | Picsel (Research) Limited | User interface systems and methods for manipulating and viewing digital documents |
WO2002033541A2 (fr) | 2000-10-16 | 2002-04-25 | Tangis Corporation | Dynamically determining appropriate computer user interfaces |
US7730401B2 (en) | 2001-05-16 | 2010-06-01 | Synaptics Incorporated | Touch screen with user interface enhancement |
US9098437B2 (en) | 2010-10-01 | 2015-08-04 | Z124 | Cross-environment communication framework |
US20030112585A1 (en) | 2001-12-13 | 2003-06-19 | Silvester Kelan Craig | Multiprocessor notebook computer with a tablet PC conversion capability |
US20030162523A1 (en) | 2002-02-27 | 2003-08-28 | Michael Kapolka | Vehicle telemetry system and method |
JP2003337728A (ja) | 2002-05-17 | 2003-11-28 | Hitachi Ltd | Data file history management method and apparatus therefor |
US7058902B2 (en) | 2002-07-30 | 2006-06-06 | Microsoft Corporation | Enhanced on-object context menus |
US7952569B2 (en) * | 2002-08-08 | 2011-05-31 | Hewlett-Packard Development Company, L.P. | System and method of switching between multiple viewing modes in a multi-head computer system |
SE0202664L (sv) | 2002-09-09 | 2003-11-04 | Zenterio Ab | Graphical user interface for navigation and selection among various selectable options presented on a display |
US20040255301A1 (en) | 2003-06-13 | 2004-12-16 | Andrzej Turski | Context association schema for computer system architecture |
US7210107B2 (en) | 2003-06-27 | 2007-04-24 | Microsoft Corporation | Menus whose geometry is bounded by two radii and an arc |
US7984129B2 (en) | 2003-07-11 | 2011-07-19 | Computer Associates Think, Inc. | System and method for high-performance profiling of application events |
US7418670B2 (en) | 2003-10-03 | 2008-08-26 | Microsoft Corporation | Hierarchical in-place menus |
US9202217B2 (en) | 2003-10-06 | 2015-12-01 | Yellowpages.Com Llc | Methods and apparatuses to manage multiple advertisements |
US7698654B2 (en) | 2004-01-05 | 2010-04-13 | Microsoft Corporation | Systems and methods for co-axial navigation of a user interface |
US20050179647A1 (en) * | 2004-02-18 | 2005-08-18 | Microsoft Corporation | Automatic detection and switching between input modes |
US7703036B2 (en) | 2004-08-16 | 2010-04-20 | Microsoft Corporation | User interface for displaying selectable software functionality controls that are relevant to a selected object |
US7895531B2 (en) | 2004-08-16 | 2011-02-22 | Microsoft Corporation | Floating command object |
US7557707B2 (en) | 2004-09-01 | 2009-07-07 | Microsoft Corporation | RFID enabled information systems utilizing a business application |
US20060161460A1 (en) | 2004-12-15 | 2006-07-20 | Critical Connection Inc. | System and method for a graphical user interface for healthcare data |
US7802202B2 (en) | 2005-03-17 | 2010-09-21 | Microsoft Corporation | Computer interaction based upon a currently active input device |
US7856602B2 (en) | 2005-04-20 | 2010-12-21 | Apple Inc. | Updatable menu items |
US8542196B2 (en) | 2005-07-22 | 2013-09-24 | Move Mobile Systems, Inc. | System and method for a thumb-optimized touch-screen user interface |
US7884836B2 (en) * | 2005-08-30 | 2011-02-08 | Ati Technologies Ulc | Notifying a graphics subsystem of a physical change at a display device |
US20070139386A1 (en) | 2005-12-16 | 2007-06-21 | Xerox Corporation | Touch screen user interface for digital reprographic device with pop-up menu display |
KR100792295B1 (ko) | 2005-12-29 | 2008-01-07 | Samsung Electronics Co., Ltd. | Content navigation method and content navigation apparatus |
US20070162864A1 (en) | 2006-01-10 | 2007-07-12 | International Business Machines Corp. | User-directed repartitioning of content on tab-based interfaces |
US7770126B2 (en) | 2006-02-10 | 2010-08-03 | Microsoft Corporation | Assisting user interface element use |
US20070192714A1 (en) | 2006-02-13 | 2007-08-16 | Research In Motion Limited | Method and arrangement for providing a primary actions menu on a handheld communication device having a reduced alphabetic keyboard |
US20070238489A1 (en) | 2006-03-31 | 2007-10-11 | Research In Motion Limited | Edit menu for a mobile communication device |
US8296684B2 (en) * | 2008-05-23 | 2012-10-23 | Hewlett-Packard Development Company, L.P. | Navigating among activities in a computing device |
US7418453B2 (en) | 2006-06-15 | 2008-08-26 | International Business Machines Corporation | Updating a data warehouse schema based on changes in an observation model |
US7966558B2 (en) | 2006-06-15 | 2011-06-21 | Microsoft Corporation | Snipping tool |
US7930644B2 (en) | 2006-09-13 | 2011-04-19 | Savant Systems, Llc | Programming environment and metadata management for programmable multimedia controller |
US20080163121A1 (en) | 2006-12-29 | 2008-07-03 | Research In Motion Limited | Method and arrangement for designating a menu item on a handheld electronic device |
US8125457B2 (en) * | 2007-04-27 | 2012-02-28 | Hewlett-Packard Development Company, L.P. | Switching display mode of electronic device |
US8667418B2 (en) | 2007-06-08 | 2014-03-04 | Apple Inc. | Object stack |
US9086785B2 (en) | 2007-06-08 | 2015-07-21 | Apple Inc. | Visualization object receptacle |
US8201096B2 (en) | 2007-06-09 | 2012-06-12 | Apple Inc. | Browsing or searching user interfaces and other aspects |
US7720873B2 (en) | 2007-06-21 | 2010-05-18 | International Business Machines Corporation | Dynamic data discovery of a source data schema and mapping to a target data schema |
US8869065B2 (en) | 2007-06-29 | 2014-10-21 | Microsoft Corporation | Segment ring menu |
US8645863B2 (en) | 2007-06-29 | 2014-02-04 | Microsoft Corporation | Menus with translucency and live preview |
US8547246B2 (en) | 2007-10-09 | 2013-10-01 | Halliburton Energy Services, Inc. | Telemetry system for slickline enabling real time logging |
US9241063B2 (en) | 2007-11-01 | 2016-01-19 | Google Inc. | Methods for responding to an email message by call from a mobile device |
US20090125811A1 (en) * | 2007-11-12 | 2009-05-14 | Microsoft Corporation | User interface providing auditory feedback |
US8356258B2 (en) | 2008-02-01 | 2013-01-15 | Microsoft Corporation | Arranging display areas utilizing enhanced window states |
KR101012300B1 (ko) | 2008-03-07 | 2011-02-08 | Samsung Electronics Co., Ltd. | User interface apparatus and method for a portable terminal having a touch screen |
US8577957B2 (en) | 2008-04-01 | 2013-11-05 | Litl Llc | System and method for streamlining user interaction with electronic content |
US10031549B2 (en) * | 2008-07-10 | 2018-07-24 | Apple Inc. | Transitioning between modes of input |
US8803816B2 (en) | 2008-09-08 | 2014-08-12 | Qualcomm Incorporated | Multi-fold mobile device with configurable interface |
US8321802B2 (en) | 2008-11-13 | 2012-11-27 | Qualcomm Incorporated | Method and system for context dependent pop-up menus |
US9600070B2 (en) | 2008-12-22 | 2017-03-21 | Apple Inc. | User interface having changeable topography |
US20100162165A1 (en) | 2008-12-22 | 2010-06-24 | Apple Inc. | User Interface Tools |
US20100207888A1 (en) | 2009-02-18 | 2010-08-19 | Mr. Noam Camiel | System and method for using a keyboard with a touch-sensitive display |
WO2010110550A1 (fr) * | 2009-03-23 | 2010-09-30 | Core Logic Inc. | Apparatus and method for providing a virtual keyboard |
US20100238126A1 (en) | 2009-03-23 | 2010-09-23 | Microsoft Corporation | Pressure-sensitive context menus |
US8630088B2 (en) | 2009-03-27 | 2014-01-14 | Qualcomm Incorporated | Portable docking station for a portable computing device |
CN101866257B (zh) | 2009-04-20 | 2012-11-21 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Touch-based handheld device and option display method thereof |
US8881013B2 (en) | 2009-04-30 | 2014-11-04 | Apple Inc. | Tool for tracking versions of media sections in a composite presentation |
US8477046B2 (en) | 2009-05-05 | 2013-07-02 | Advanced Technologies Group, LLC | Sports telemetry system for collecting performance metrics and data |
US8355007B2 (en) | 2009-05-11 | 2013-01-15 | Adobe Systems Incorporated | Methods for use with multi-touch displays for determining when a touch is processed as a mouse event |
US8352884B2 (en) | 2009-05-21 | 2013-01-08 | Sony Computer Entertainment Inc. | Dynamic reconfiguration of GUI display decomposition based on predictive model |
TW201042466A (en) | 2009-05-28 | 2010-12-01 | Inst Information Industry | Hybrid computer systems |
KR20100134948A (ko) | 2009-06-16 | 2010-12-24 | Samsung Electronics Co., Ltd. | Menu display method for a device having a touch screen |
GB2473000B (en) | 2009-08-25 | 2014-02-19 | Promethean Ltd | Dynamic switching of interactive whiteboard data |
US8418079B2 (en) | 2009-09-01 | 2013-04-09 | James J. Nicholas, III | System and method for cursor-based application management |
KR20110047349A (ko) | 2009-10-30 | 2011-05-09 | Pantech Co., Ltd. | User interface apparatus and method using touch and pressure in a portable terminal |
US20110173533A1 (en) | 2010-01-09 | 2011-07-14 | Au Optronics Corp. | Touch Operation Method and Operation Method of Electronic Device |
KR101726599B1 (ko) | 2010-01-22 | 2017-04-14 | JC Tech Co., Ltd. | Container stopper |
CA2731772C (fr) | 2010-02-15 | 2014-08-12 | Research In Motion Limited | Graphical context short menu |
US8866744B2 (en) * | 2010-03-30 | 2014-10-21 | Howay Corp. | Keyboard having touch input device |
US8631350B2 (en) | 2010-04-23 | 2014-01-14 | Blackberry Limited | Graphical context short menu |
KR20110121888A (ko) | 2010-05-03 | 2011-11-09 | Samsung Electronics Co., Ltd. | Apparatus and method for checking a pop-up menu in a portable terminal |
CN103026345B (zh) | 2010-06-02 | 2016-01-20 | Hewlett-Packard Development Company, L.P. | Dynamic multidimensional schemas for event monitoring priority |
US20120050183A1 (en) | 2010-08-27 | 2012-03-01 | Google Inc. | Switching display modes based on connection state |
KR101685363B1 (ko) | 2010-09-27 | 2016-12-12 | LG Electronics Inc. | Mobile terminal and operating method thereof |
US8442982B2 (en) | 2010-11-05 | 2013-05-14 | Apple Inc. | Extended database search |
US9292171B2 (en) | 2010-11-17 | 2016-03-22 | International Business Machines Corporation | Border menu for context dependent actions within a graphical user interface |
KR101932688B1 (ko) | 2010-11-29 | 2018-12-28 | Samsung Electronics Co., Ltd. | Portable device and UI mode providing method therefor |
WO2012094740A1 (fr) | 2011-01-12 | 2012-07-19 | Smart Technologies Ulc | Method of supporting multiple menus and interactive input system employing same |
US9645986B2 (en) | 2011-02-24 | 2017-05-09 | Google Inc. | Method, medium, and system for creating an electronic book with an umbrella policy |
US9285950B2 (en) | 2011-03-30 | 2016-03-15 | Google Inc. | Hover-over gesturing on mobile devices |
JP5121971B2 (ja) * | 2011-04-28 | 2013-01-16 | Toshiba Corporation | Docking station and electronic device |
US9582187B2 (en) | 2011-07-14 | 2017-02-28 | Microsoft Technology Licensing, Llc | Dynamic context based menus |
US20130019175A1 (en) | 2011-07-14 | 2013-01-17 | Microsoft Corporation | Submenus for context based menu system |
US9013510B2 (en) | 2011-07-29 | 2015-04-21 | Google Inc. | Systems and methods for rendering user interface elements in accordance with a device type |
KR101833281B1 (ko) * | 2011-08-29 | 2018-02-28 | Samsung Electronics Co., Ltd. | Method and apparatus for preventing touchpad malfunction in an electronic device |
US10684768B2 (en) | 2011-10-14 | 2020-06-16 | Autodesk, Inc. | Enhanced target selection for a touch-based input enabled user interface |
US8707211B2 (en) | 2011-10-21 | 2014-04-22 | Hewlett-Packard Development Company, L.P. | Radial graphical user interface |
US20130174033A1 (en) | 2011-12-29 | 2013-07-04 | Chegg, Inc. | HTML5 Selector for Web Page Content Selection |
US20130191779A1 (en) | 2012-01-20 | 2013-07-25 | Microsoft Corporation | Display of user interface elements based on touch or hardware input |
US9928562B2 (en) | 2012-01-20 | 2018-03-27 | Microsoft Technology Licensing, Llc | Touch mode and input type recognition |
US20130191781A1 (en) | 2012-01-20 | 2013-07-25 | Microsoft Corporation | Displaying and interacting with touch contextual user interface |
KR20130095478A (ko) | 2012-02-20 | 2013-08-28 | Samsung Electronics Co., Ltd. | Electronic apparatus, control method thereof, and computer-readable storage medium |
DE112012006246T5 (de) | 2012-04-20 | 2015-04-02 | Intel Corporation | Hybrid communication system, device and system |
KR101957173B1 (ko) | 2012-09-24 | 2019-03-12 | Samsung Electronics Co., Ltd. | Method and apparatus for providing multi-window in a touch device |
US9105178B2 (en) | 2012-12-03 | 2015-08-11 | Sony Computer Entertainment Inc. | Remote dynamic configuration of telemetry reporting through regular expressions |
US20150220151A1 (en) | 2013-03-14 | 2015-08-06 | Scott Ronald Violet | Dynamically change between input modes based on user input |
US20140354554A1 (en) | 2013-05-30 | 2014-12-04 | Microsoft Corporation | Touch Optimized UI |
US9817851B2 (en) | 2014-01-09 | 2017-11-14 | Business Objects Software Ltd. | Dynamic data-driven generation and modification of input schemas for data analysis |
US9471201B1 (en) | 2014-05-20 | 2016-10-18 | Google Inc. | Laptop-to-tablet mode adaptation |
US10048856B2 (en) | 2014-12-30 | 2018-08-14 | Microsoft Technology Licensing, Llc | Configuring a user interface based on an experience mode transition |
US20160209973A1 (en) | 2015-01-21 | 2016-07-21 | Microsoft Technology Licensing, Llc. | Application user interface reconfiguration based on an experience mode transition |
- 2012-01-20: US US13/355,208 patent/US9928562B2/en active Active
- 2013-01-17: WO PCT/US2013/021793 patent/WO2013109662A1/fr active Application Filing
- 2015-08-31: US US14/841,679 patent/US9928566B2/en active Active
- 2018-03-23: US US15/933,754 patent/US10430917B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6938216B1 (en) * | 1999-02-12 | 2005-08-30 | Fujitsu Limited | Menu system requiring reduced user manipulation of an input device |
US20040150668A1 (en) * | 2003-01-31 | 2004-08-05 | Xerox Corporation | Secondary touch contextual sub-menu navigation for touch screen interface |
US20100105443A1 (en) * | 2008-10-27 | 2010-04-29 | Nokia Corporation | Methods and apparatuses for facilitating interaction with touch screen apparatuses |
US20100251112A1 (en) * | 2009-03-24 | 2010-09-30 | Microsoft Corporation | Bimodal touch sensitive digital notebook |
US20110050594A1 (en) * | 2009-09-02 | 2011-03-03 | Kim John T | Touch-Screen User Interface |
Also Published As
Publication number | Publication date |
---|---|
US10430917B2 (en) | 2019-10-01 |
US9928566B2 (en) | 2018-03-27 |
US20180218476A1 (en) | 2018-08-02 |
US20130187855A1 (en) | 2013-07-25 |
US20150371358A1 (en) | 2015-12-24 |
US9928562B2 (en) | 2018-03-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10430917B2 (en) | Input mode recognition | |
US20140304648A1 (en) | Displaying and interacting with touch contextual user interface | |
US20130191779A1 (en) | Display of user interface elements based on touch or hardware input | |
US10324592B2 (en) | Slicer elements for filtering tabular data | |
RU2609099C2 (ru) | Adjusting content to avoid occlusion by a virtual input panel | |
US8990686B2 (en) | Visual navigation of documents by object | |
US20130191785A1 (en) | Confident item selection using direct manipulation | |
US10108330B2 (en) | Automatic highlighting of formula parameters for limited display devices | |
US20130061122A1 (en) | Multi-cell selection using touch input | |
KR20140078629A (ko) | User interface for editing values in place |
US20130111333A1 (en) | Scaling objects while maintaining object structure | |
WO2013056346A1 (fr) | Electronic device and method of controlling same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 13738499; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: PCT application non-entry in European phase | Ref document number: 13738499; Country of ref document: EP; Kind code of ref document: A1 |