WO2013112354A1 - Confident item selection using direct manipulation - Google Patents

Confident item selection using direct manipulation

Info

Publication number
WO2013112354A1
WO2013112354A1 (application PCT/US2013/022003)
Authority
WO
WIPO (PCT)
Prior art keywords
item
displaying
selected area
items
visual indicator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2013/022003
Other languages
English (en)
French (fr)
Inventor
Benjamin Edward Rampson
Karen Cheng
Su-Piao Wu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Corp
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to EP13741294.6A priority Critical patent/EP2807543A4/en
Priority to JP2014554744A priority patent/JP2015512078A/ja
Priority to CN201380006411.5A priority patent/CN104067211A/zh
Priority to KR1020147020497A priority patent/KR20140114392A/ko
Publication of WO2013112354A1 publication Critical patent/WO2013112354A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • a user interface element and a visual indicator are displayed to show both a current selected area that tracks a user's touch input and an indication of any items that are considered to be selected (the potential selection).
  • the user interface element (e.g. a border) is displayed; its size may be adjusted by a user using touch input to select more/fewer items. For example, a user may select a corner of the user interface element and drag it to adjust the currently selected area.
  • An item visual indicator is displayed for items that are considered to be a potential selection (e.g. items that would be selected if the touch input were to end at the current time).
  • the potential selection of items may be based on a determination that the current selected area encompasses more than some predetermined area of an item.
  • the item visual indicator may distinguish all or a portion of the items within the potential selection from other, non-selected items.
  • the item visual indicator is configured to show the user an indication of currently selected items without the border appearing to jump in response to another item being selected/deselected.
  • the item visual indicator helps to provide the user with a clear and confident understanding of the selection that will be made, helping the user avoid re-adjusting the selection or getting unexpected results.
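
Taken together, the bullets above describe a geometric predicate: an item joins the potential selection when the currently selected area covers more than a predetermined fraction of the item's bounds. A minimal TypeScript sketch of that predicate follows; the patent publishes no code, so the names, the Rect shape, and the 0.5 default threshold are all illustrative assumptions.

```typescript
interface Rect {
  x: number;
  y: number;
  width: number;
  height: number;
}

// Fraction of `item`'s area covered by `selection`, in [0, 1].
function coveredFraction(selection: Rect, item: Rect): number {
  const overlapW = Math.max(
    0,
    Math.min(selection.x + selection.width, item.x + item.width) -
      Math.max(selection.x, item.x),
  );
  const overlapH = Math.max(
    0,
    Math.min(selection.y + selection.height, item.y + item.height) -
      Math.max(selection.y, item.y),
  );
  return (overlapW * overlapH) / (item.width * item.height);
}

// Items whose covered fraction exceeds the threshold form the
// "potential selection" and receive the item visual indicator.
function potentialSelection(
  selection: Rect,
  items: Rect[],
  threshold = 0.5, // "more than some predetermined area of an item"
): Rect[] {
  return items.filter((item) => coveredFraction(selection, item) > threshold);
}
```
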
  • FIGURE 1 illustrates an exemplary computing environment
  • FIGURE 2 illustrates an exemplary system for selecting items using both a display of a currently selected area and an item visual indicator
  • FIGURE 3 shows a display illustrating a window that shows a user selecting cells within a spreadsheet
  • FIGURE 4 shows an illustrative process for selecting items using touch input
  • FIGURES 5-7 illustrate exemplary windows showing a user selecting items
  • FIGURE 8 illustrates a system architecture used in selecting items.
  • FIGURE 1 and the corresponding discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.
  • program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types.
  • Other computer system configurations may also be used, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
  • Distributed computing environments may also be used where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote memory storage devices.
  • Referring now to FIGURE 1, an illustrative computer environment for a computer 100 utilized in the various embodiments will be described.
  • the computer environment shown in FIGURE 1 includes computing devices that each may be configured as a mobile computing device (e.g. phone, tablet, netbook, laptop), server, a desktop, or some other type of computing device and includes a central processing unit 5 ("CPU"), a system memory 7, including a random access memory 9 ("RAM") and a read-only memory ("ROM") 10, and a system bus 12 that couples the memory to the central processing unit ("CPU") 5.
  • the computer 100 further includes a mass storage device 14 for storing an operating system 16, application(s) 24 (e.g. productivity application, spreadsheet application, Web Browser, and the like) and selection manager 26 which will be described in greater detail below.
  • the mass storage device 14 is connected to the CPU 5 through a mass storage controller (not shown) connected to the bus 12.
  • the mass storage device 14 and its associated computer-readable media provide non-volatile storage for the computer 100.
  • computer-readable media can be any available media that can be accessed by the computer 100.
  • Computer-readable media may comprise computer storage media and communication media.
  • Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, Erasable Programmable Read Only Memory (“EPROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 100.
  • Computer 100 operates in a networked environment using logical connections to remote computers through a network 18, such as the Internet.
  • the computer 100 may connect to the network 18 through a network interface unit 20 connected to the bus 12.
  • the network connection may be wireless and/or wired.
  • the network interface unit 20 may also be utilized to connect to other types of networks and remote computer systems.
  • the computer 100 may also include an input/output controller 22 for receiving and processing input from a number of other devices, including a keyboard, mouse, a touch input device, or electronic stylus (not shown in FIGURE 1). Similarly, an input/output controller 22 may provide input/output to a display screen 23, a printer, or other type of output device.
  • a touch input device may utilize any technology that allows single/multi-touch input to be recognized (touching/non-touching).
  • the technologies may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optic capture, tuned electromagnetic induction, ultrasonic receivers, transducer microphones, laser rangefinders, shadow capture, and the like.
  • the touch input device may be configured to detect near-touches (i.e. within some distance of the touch input device but not physically touching the touch input device).
  • the touch input device may also act as a display.
  • the input/output controller 22 may also provide output to one or more display screens 23, a printer, or other type of input/output device.
  • a camera and/or some other sensing device may be operative to record one or more users and capture motions and/or gestures made by users of a computing device. Sensing device may be further operative to capture spoken words, such as by a microphone and/or capture other inputs from a user such as by a keyboard and/or mouse (not pictured).
  • the sensing device may comprise any motion detection device capable of detecting the movement of a user.
  • a camera may comprise a MICROSOFT KINECT® motion capture device comprising a plurality of cameras and a plurality of microphones.
  • Embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components/processes illustrated in the FIGURES may be integrated onto a single integrated circuit.
  • a SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality, all of which are integrated (or "burned") onto the chip substrate as a single integrated circuit.
  • a number of program modules and data files may be stored in the mass storage device 14 and RAM 9 of the computer 100, including an operating system 16 suitable for controlling the operation of a computer, such as the WINDOWS PHONE 7®, WINDOWS 7®, or WINDOWS SERVER® operating system from MICROSOFT CORPORATION of Redmond, Washington.
  • the mass storage device 14 and RAM 9 may also store one or more program modules.
  • the mass storage device 14 and the RAM 9 may store one or more application programs, such as a spreadsheet application, word processing application and/or other applications.
  • the MICROSOFT OFFICE suite of applications is included.
  • the application(s) may be client based and/or web based.
  • a network service 27 may be used, such as: MICROSOFT WINDOWS LIVE, MICROSOFT OFFICE 365 or some other network based service.
  • Selection manager 26 is configured to display a user interface element (e.g. UI 28) and a visual indicator to show both a current selected area that tracks a user's touch input and an indication of any items that are considered to be selected as a result of the currently selected area.
  • selection manager 26 displays a user interface element (e.g. a border) that may be adjusted such that the size of the currently selected area changes in response to updated touch input (e.g. underneath a finger).
  • An item visual indicator is displayed that shows any item(s) within the current selected area that are potential selections. For example, when the current selected area as illustrated by the user interface element encompasses more than some predetermined area of an item, the display of the item may be changed (e.g. filled with a hash pattern) to mark the item as a potential selection.
  • the item visual indicator is configured to show the user an indication of currently selected items without the border appearing to jump in response to another item being selected/deselected.
  • Selection manager 26 may be located externally from an application, e.g. a spreadsheet application or some other application, as shown or may be a part of an application. Further, all/some of the functionality provided by selection manager 26 may be located internally/externally from an application for which the user interface element is used for editing value(s) in place. More details regarding the selection manager are disclosed below.
  • FIGURE 2 illustrates an exemplary system for selecting items using both a display of a currently selected area and an item visual indicator.
  • system 200 includes service 210, selection manager 240, store 245, touch screen input device/display 250 (e.g. slate) and smart phone 230.
  • service 210 is a cloud based and/or enterprise based service that may be configured to provide productivity services (e.g. MICROSOFT OFFICE 365 or some other cloud based/online service) that are used to interact with items (e.g. spreadsheets, documents, charts, and the like).
  • Functionality of one or more of the services/applications provided by service 210 may also be configured as a client based application.
  • a client device may include a spreadsheet application that performs operations relating to selecting items using touch input.
  • system 200 shows a productivity service, other services/applications may be configured to select items.
  • service 210 is a multi-tenant service that provides resources 215 and services to any number of tenants (e.g. Tenants 1-N).
  • System 200 as illustrated comprises a touch screen input device/display 250 (e.g. a slate/tablet device) and smart phone 230 that detect when a touch input has been received (e.g. a finger touching or nearly touching the touch screen).
  • Any type of touch screen may be utilized that detects a user's touch input.
  • the touch screen may include one or more layers of capacitive material that detects the touch input.
  • the touch screen is configured to detect objects that are in contact with or above a touchable surface.
  • the touch screen may be configured to determine locations of where touch input is received (e.g. a starting point, intermediate points and an ending point). Actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, by a vibration sensor or microphone coupled to the touch panel.
  • Sensors to detect contact include pressure-based mechanisms, micro-machined accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.
  • touch screen input device/display 250 and smart phone 230 show an exemplary display 252/232 of selectable items. Items and documents may be stored on a device (e.g. smart phone 230, slate 250) and/or at some other location (e.g. network store 245). Smart phone 230 shows a display 232 of a spreadsheet including cells arranged in rows and columns that are selectable. The items, such as the cells within a spreadsheet, may be displayed by a client based application and/or by a server based application (e.g. enterprise, cloud based).
  • Selection manager 240 is configured to perform operations relating to interacting with and selecting items. Items may be selected in response to touch input and/or other input. Generally, items that are selectable are discrete items such as cells, tables, pictures, words, and other objects that are individually selectable.
  • a user is in the process of selecting two cells using touch input.
  • the first cell selected includes the value "Chad Rothschiller” and the second cell that is partially selected includes the value "Chicken.”
  • a user selects an item.
  • the item may be selected using touch input and/or some other input method (e.g. keyboard, mouse, ...).
  • user interface element 233 is initially displayed to show the selection.
  • a border is placed around the initially selected cell whose size is adjustable using touch input.
  • the user has selected user interface element 233 and is dragging the edge of the UI element 233 over the cell containing the value "Chicken.”
  • Item visual indicator 234 (e.g. a hash fill in this example) shows the user which cells will be selected based on the current selected area as indicated by UI element 233 (the potential selection).
  • the item visual indicator 234 is displayed for any cell that is determined to be a potential selection (e.g. would be selected if the current touch input ended at the currently selected area of UI element 233).
  • an item is selected when more than a predetermined percentage of the item is selected (e.g. 0-100%).
  • item visual indicator 234 may be displayed for any item that is at least 50% enclosed by the currently selected area as indicated by UI element 233.
  • Other item visual indicators and UI elements may be displayed (See exemplary figures and discussion herein).
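
For a grid of cells like the one on display 232, a selection is a rectangular range, so the percentage rule described above can be evaluated per axis: a column (or row) is included when the selection covers more than half of its width (or height). The sketch below assumes that decomposition, along with cumulative edge arrays for column and row boundaries; none of this is specified by the patent.

```typescript
// Indices of the intervals defined by `edges` (cumulative boundaries)
// that the span [lo, hi] covers by more than half.
function coveredIndices(edges: number[], lo: number, hi: number): number[] {
  const covered: number[] = [];
  for (let i = 0; i + 1 < edges.length; i++) {
    const start = edges[i];
    const end = edges[i + 1];
    const overlap = Math.min(hi, end) - Math.max(lo, start);
    if (overlap > 0.5 * (end - start)) covered.push(i);
  }
  return covered;
}

// Cells (row, col) that are potential selections for a selection
// rectangle, given column x-boundaries and row y-boundaries.
function cellsUnderSelection(
  colEdges: number[],
  rowEdges: number[],
  sel: { x0: number; y0: number; x1: number; y1: number },
): Array<{ row: number; col: number }> {
  const cols = coveredIndices(colEdges, sel.x0, sel.x1);
  const rows = coveredIndices(rowEdges, sel.y0, sel.y1);
  return rows.flatMap((row) => cols.map((col) => ({ row, col })));
}
```

Evaluating the rule per axis keeps the potential selection a contiguous rectangular range of cells, which matches the rectangular selections shown in the figures.
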
  • UI element 260 is a border that shows the currently selected area and item visual indicator 262 shows a potential selection.
  • item visual indicator 262 shows a dimmed border around the remaining portion of the cell including the value "Chicken.”
  • FIGURE 3 shows a display illustrating a window that shows a user selecting cells within a spreadsheet.
  • window 300 includes a display of a spreadsheet 315 comprising three columns and seven rows. More or fewer areas/items may be included within window 300.
  • Window 300 may be a window that is associated with a desktop application, a mobile application and/or a web-based application (e.g. displayed by a browser). For example, a web browser may access a spreadsheet service, a spreadsheet application on a computing device may be configured to select items from one or more different services, and the like.
  • a user 330 is in the process of selecting cells A3, A4, B3 and B4 by adjusting a size of UI element 332 using touch input.
  • the UI element 332 is sized by user 330 dragging a corner/edge of the UI element.
  • Item visual indicator 334 displays the items (in this case cells) that would be selected if the user stopped adjusting the size of UI element 332 and ended the touch input (the potential selection).
  • the potential selection in this example includes cells A3, A4, B3 and B4.
  • FIGURE 4 shows an illustrative process for selecting items using touch input.
  • When reading the discussion of the routines presented herein, it should be appreciated that the logical operations of various embodiments are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention.
  • process 400 moves to operation 410, where a user interface element (e.g. a selection border) is displayed that shows the currently selected area/item.
  • a border may be initially displayed around an item (e.g. a cell, chart, object, word, ...) in response to an initial selection.
  • One or more handles may/may not be displayed with the user interface element to adjust a size of the current selected area as shown by the user interface element. For example, a user may want to change the size of the selection to include more/less items.
  • touch input is received to adjust a size of the current selected area of the user interface element.
  • the touch input may be a user's finger(s), a pen input device, and/or some other device that interacts directly with a display/screen of a computing device.
  • the touch input may be a touch input gesture that selects and drags an edge/corner of the displayed user interface element to resize the user interface element.
  • the user interface element (e.g. the selection border) is updated during the touch event and appears to stay "pinned" under the user's finger such that the user is clearly able to see the currently selected area as defined by the user.
  • An item may be a potential selection based on various criteria. For example, an item may be considered a potential selection when a predetermined percentage of the item (e.g. 10%, 20%, >50%, ...) is contained within the currently selected area. According to an embodiment, an item is considered a potential selection as soon as the currently selected area includes any part of an item (e.g. a user adjusts the currently selected area to include a portion of another cell).
  • Flowing to decision operation 440, a determination is made as to whether any items are potential selections. When no items are potential selections, the process flows to operation 460. When one or more items are potential selections, the process flows to operation 450.
  • an item visual indicator is displayed that indicates each item that is determined to be a potential selection.
  • the item visual indicator may include different types of visual indicators.
  • the item visual indicator may include any one or more of the following: changing a shading of an item, showing a different border, changing a formatting of an item, displaying a message showing the potential selection, and the like.
  • the item visual indicator provides an indication to the user of any currently selected item(s) without changing the current selection border while a user is adjusting a selection border. In this way, the item visual indicator helps to provide the user with a clear and confident understanding of the selection that will be made, helping the user avoid re-adjusting the selection or getting unexpected results.
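
Process 400 maps naturally onto a pointer-event loop: displaying the selection border on the initial touch (operation 410), recomputing the potential selection and its indicator on every move (operations 440 and 450), and committing the selection when the touch input ends. A sketch of that loop follows, reusing `Rect` and `potentialSelection` from the earlier sketch; the class and callback names are illustrative assumptions.

```typescript
// All class, method, and callback names here are assumptions.
class SelectionBorder {
  private anchor = { x: 0, y: 0 };
  private area: Rect = { x: 0, y: 0, width: 0, height: 0 };

  constructor(
    private items: Rect[],
    // Repaint the item visual indicators for the potential selection.
    private onPotentialChange: (potential: Rect[]) => void,
    // The touch input ended: the potential selection becomes the selection.
    private onCommit: (selected: Rect[]) => void,
  ) {}

  // Operation 410: display the border at the initial selection point.
  pointerDown(x: number, y: number): void {
    this.anchor = { x, y };
    this.update(x, y);
  }

  // While the touch input adjusts the selected area, the border stays
  // "pinned" under the finger and the potential selection is recomputed
  // (decision operation 440 / display operation 450) on every move.
  pointerMove(x: number, y: number): void {
    this.update(x, y);
  }

  pointerUp(): void {
    this.onCommit(potentialSelection(this.area, this.items));
  }

  private update(x: number, y: number): void {
    this.area = {
      x: Math.min(this.anchor.x, x),
      y: Math.min(this.anchor.y, y),
      width: Math.abs(x - this.anchor.x),
      height: Math.abs(y - this.anchor.y),
    };
    this.onPotentialChange(potentialSelection(this.area, this.items));
  }
}
```

Wiring pointerDown/pointerMove/pointerUp to the DOM pointerdown/pointermove/pointerup events (or equivalent touch events) keeps the border under the finger, while onPotentialChange repaints only the item visual indicator, so the border itself never appears to jump.
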
  • FIGURES 5-7 illustrate exemplary windows showing a user selecting items.
  • FIGURES 5-7 are for exemplary purposes and are not intended to be limiting.
  • FIGURE 5 shows displays for selecting cells within a spreadsheet.
  • window 510 and window 550 each display a spreadsheet 512 that shows a name column, a GPA column, and an exam date column in which a user has initially selected cell B3. More or fewer columns/areas may be included within windows 510 and 550.
  • a window may be a window that is associated with a desktop application, a mobile application and/or a web-based application (e.g. displayed by a browser). The window may be displayed on a limited display device (e.g. smart phone, tablet) or on a larger screen device.
  • selected cell B3 is displayed differently from the other cells of the spreadsheet to indicate to a user that the cell is currently selected. While cell B3 is shown as being highlighted, other display options may be used to indicate the cell is selected (e.g. border around cell, hashing, color changes, font changes and the like).
  • In response to receiving an input (e.g. touch input 530) to adjust a size of a currently selected area, UI element 520 is displayed. In the current example, UI element 520 is displayed as a highlighted rectangular region. Other methods of displaying a user interface element to show a currently selected area may be used (e.g. changing font, placing a border around the item, changing a color of the item, and the like). When the user changes the size of UI element 520, the display of the UI element changes to show the change in size and follows the movement of user 530's finger. As the user adjusts the size of the currently selected area, one or more items may be determined to be a potential selection.
  • Window 550 shows the user dragging a left edge of UI element 520 such that it encompasses over half of cell A3.
  • an item value indicator 522 is displayed to show the potential selection of the cell (in this example, cell A3).
  • a portion of the item (e.g. cell A3) is displayed using a different fill method as compared to UI element 520.
  • the item value indicator 522 may also be shown using different methods (e.g. no alpha blending, different colors, each complete item that is a potential selection displayed using the same formatting, ...).
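
The layering shown in windows 510 and 550 — item value indicator 522 painted with a different fill method than UI element 520 — can be reproduced with two passes over an HTML canvas. A minimal sketch; the colors, alpha values, and draw order here are illustrative assumptions only, not anything the patent specifies.

```typescript
function paintSelection(
  ctx: CanvasRenderingContext2D,
  selectedArea: Rect,
  potential: Rect[],
): void {
  // Item visual indicator: a distinct fill for each potential cell,
  // drawn first so the selection border stays visible on top.
  ctx.fillStyle = "rgba(90, 90, 90, 0.25)"; // assumed "different fill method"
  for (const cell of potential) {
    ctx.fillRect(cell.x, cell.y, cell.width, cell.height);
  }

  // UI element 520: the currently selected area as a highlighted rectangle.
  ctx.fillStyle = "rgba(0, 120, 215, 0.2)"; // assumed highlight color
  ctx.fillRect(
    selectedArea.x,
    selectedArea.y,
    selectedArea.width,
    selectedArea.height,
  );
  ctx.strokeStyle = "rgb(0, 120, 215)";
  ctx.lineWidth = 2;
  ctx.strokeRect(
    selectedArea.x,
    selectedArea.y,
    selectedArea.width,
    selectedArea.height,
  );
}
```
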
  • FIGURE 6 shows displays for selecting items within a spreadsheet. As illustrated, window 610 and window 650 each include a spreadsheet that currently shows a Grade column, a sex column, and a siblings column.
  • Window 610 shows a user adjusting a size of user interface element 612 (a selection box).
  • the user interface element 612 is displayed as a border around the cell that adjusts in size in response to a user's touch input (e.g. user 530).
  • an item visual selection 614 is displayed that indicates to the user that if the user were to end the current selection, any item that is indicated as a potential selection by the item visual selection 614 would be selected.
  • item visual selection 614 is displayed as a different line type as compared to the line type that is used to display the currently selected area.
  • Window 650 shows a user changing a size of UI selection element 652 to select items.
  • Items that have already been selected (e.g. cells F5 and F6) are displayed using a formatting method 654 to show that the items have already been selected.
  • Items that have not been selected yet, but are considered potential selections (e.g. cells E4, E5, E6 and F4), are illustrated as potential selections by the display of item visual selection 656 (e.g. corner brackets).
  • FIGURE 7 shows displays for selecting different items within a document.
  • window 710, window 720, window 730 and window 740 each include a display of a document that includes items that may be individually selected.
  • Window 710 shows a user selecting a social security number within the document.
  • the item visual selection 712 shows the potential selection (e.g. the entire social security number).
  • Window 720 shows UI element 722 displayed in response to the entire selection of the social security number.
  • Window 730 shows a user selecting different words in the document. As the user adjusts the size of user interface element 732, the display is adjusted to show the currently selected area and any items that would be selected if the input were to end using the currently selected area. In the current example, the last portion of "Security" is shown as a potential selection using item visual selection 734.
  • Window 740 shows a user selecting the words "My Social Security.”
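
In FIGURE 7 the selectable items are words rather than cells, so the same potential-selection test can be applied in one dimension over character offsets. A sketch under the embodiment in which any covered portion of an item counts as a potential selection; the tokenizer and offset representation are assumptions.

```typescript
interface WordSpan {
  text: string;
  start: number; // inclusive character offset
  end: number;   // exclusive character offset
}

// Split text into words with their character offsets.
function words(text: string): WordSpan[] {
  const out: WordSpan[] = [];
  const re = /\S+/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(text)) !== null) {
    out.push({ text: m[0], start: m.index, end: m.index + m[0].length });
  }
  return out;
}

// Words that would be selected if the drag ended over [lo, hi).
function potentialWords(text: string, lo: number, hi: number): WordSpan[] {
  return words(text).filter((w) => {
    const overlap = Math.min(hi, w.end) - Math.max(lo, w.start);
    return overlap > 0; // any covered portion counts in this embodiment
  });
}
```

For example, potentialWords("My Social Security", 3, 12) reports "Social" and the partially covered "Security", analogous to window 730, where a partially covered word is shown as a potential selection.
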
  • FIGURE 8 illustrates a system architecture used in selecting items, as described herein.
  • Content used and displayed by the application (e.g. application 1020) and the selection manager 26 may be stored at different locations.
  • application 1020 may use/store data using directory services 1022, web portals 1024, mailbox services 1026, instant messaging stores 1028 and social networking sites 1030.
  • the application 1020 may use any of these types of systems or the like.
  • a server 1032 may be used to access sources and to prepare and display electronic items.
  • server 1032 may access spreadsheet cells, objects, charts, and the like for application 1020 to display at a client (e.g. a browser or some other window).
  • server 1032 may be a web server configured to provide spreadsheet services to one or more users.
  • Server 1032 may use the web to interact with clients through a network 1008.
  • Server 1032 may also comprise an application program (e.g. a spreadsheet application). Examples of clients that may interact with server 1032 and a spreadsheet application include computing device 1002, which may include any general purpose personal computer, a tablet computing device 1004 and/or mobile computing device 1006 which may include smart phones. Any of these devices may obtain content from the store 1016.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/US2013/022003 2012-01-23 2013-01-18 Confident item selection using direct manipulation Ceased WO2013112354A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP13741294.6A EP2807543A4 (en) 2012-01-23 2013-01-18 EASY SELECTION OF MULTIPLE ELEMENTS THROUGH DIRECT HANDLING
JP2014554744A JP2015512078A (ja) 2012-01-23 2013-01-18 Confident item selection using direct manipulation
CN201380006411.5A CN104067211A (zh) 2012-01-23 2013-01-18 Confident item selection using direct manipulation
KR1020147020497A KR20140114392A (ko) 2012-01-23 2013-01-18 Confident item selection technique using direct manipulation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/356,502 2012-01-23
US13/356,502 US20130191785A1 (en) 2012-01-23 2012-01-23 Confident item selection using direct manipulation

Publications (1)

Publication Number Publication Date
WO2013112354A1 true WO2013112354A1 (en) 2013-08-01

Family

ID=48798299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/022003 Ceased WO2013112354A1 (en) 2012-01-23 2013-01-18 Confident item selection using direct manipulation

Country Status (6)

Country Link
US (1) US20130191785A1 (en)
EP (1) EP2807543A4 (en)
JP (1) JP2015512078A (ja)
KR (1) KR20140114392A (ko)
CN (1) CN104067211A (zh)
WO (1) WO2013112354A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015179582A1 (en) * 2014-05-23 2015-11-26 Microsoft Technology Licensing, Llc Group selection initiated from a single item

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9256349B2 (en) * 2012-05-09 2016-02-09 Microsoft Technology Licensing, Llc User-resizable icons
US20140115725A1 (en) * 2012-10-22 2014-04-24 Crucialsoft Company File using restriction method, user device and computer-readable storage
US20150052465A1 (en) * 2013-08-16 2015-02-19 Microsoft Corporation Feedback for Lasso Selection
US10366156B1 (en) * 2013-11-06 2019-07-30 Apttex Corporation Dynamically transferring data from a spreadsheet to a remote application
US9575651B2 (en) * 2013-12-30 2017-02-21 Lenovo (Singapore) Pte. Ltd. Touchscreen selection of graphical objects
EP4036685A1 (en) 2014-06-27 2022-08-03 Apple Inc. Reduced size user interface
US10135905B2 (en) 2014-07-21 2018-11-20 Apple Inc. Remote user interface
WO2016022204A1 (en) 2014-08-02 2016-02-11 Apple Inc. Context-specific user interfaces
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
CN115665320B (zh) 2014-09-02 2024-10-11 Apple Inc. Electronic device, storage medium, and method for operating an electronic device
JP2017527033A (ja) 2014-09-02 2017-09-14 Apple Inc. User interface for receiving user input
US10254948B2 (en) 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
WO2016144385A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
WO2017030646A1 (en) 2015-08-20 2017-02-23 Apple Inc. Exercise-based watch face and complications
US10359924B2 (en) * 2016-04-28 2019-07-23 Blackberry Limited Control of an electronic device including display and keyboard moveable relative to the display
US12175065B2 (en) 2016-06-10 2024-12-24 Apple Inc. Context-specific user interfaces for relocating one or more complications in a watch or clock interface
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
KR101956694B1 (ko) * 2017-09-11 2019-03-11 윤태기 Drone controller and control method therefor
US10613748B2 (en) * 2017-10-03 2020-04-07 Google Llc Stylus assist
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
EP3827323B1 (en) 2019-05-06 2023-12-13 Apple Inc. Restricted operation of an electronic device
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
DK201970598A1 (en) 2019-09-09 2021-05-17 Apple Inc Techniques for managing display usage
EP4439263A3 (en) 2020-05-11 2024-10-16 Apple Inc. User interfaces for managing user interface sharing
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11372486B1 (en) 2021-03-16 2022-06-28 Microsoft Technology Licensing, Llc Setting digital pen input mode using tilt angle
US11435893B1 (en) 2021-03-16 2022-09-06 Microsoft Technology Licensing, Llc Submitting questions using digital ink
US11526659B2 (en) 2021-03-16 2022-12-13 Microsoft Technology Licensing, Llc Converting text to digital ink
US11361153B1 (en) 2021-03-16 2022-06-14 Microsoft Technology Licensing, Llc Linking digital ink instances using connecting lines
US11875543B2 (en) 2021-03-16 2024-01-16 Microsoft Technology Licensing, Llc Duplicating and aggregating digital ink instances
US12182373B2 (en) 2021-04-27 2024-12-31 Apple Inc. Techniques for managing display usage
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time
US12493267B2 (en) 2022-01-24 2025-12-09 Apple Inc. User interfaces for indicating time
CN119364180B (zh) * 2024-12-26 2025-06-06 Honor Device Co., Ltd. Target area determination method, electronic device, chip system, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6734883B1 (en) * 2000-05-25 2004-05-11 International Business Machines Corporation Spinlist graphical user interface control with preview and postview
US20060224947A1 (en) * 2005-03-31 2006-10-05 Microsoft Corporation Scrollable and re-sizeable formula bar
KR100672605B1 (ko) * 2006-03-30 2007-01-24 LG Electronics Inc. Item selection method and terminal therefor
KR100774927B1 (ko) * 2006-09-27 2007-11-09 LG Electronics Inc. Mobile communication terminal, and menu and item selection method
KR20090085470A (ko) * 2008-02-04 2009-08-07 Samsung Electronics Co., Ltd. Method for providing a touch UI sensing plural touch types on an item or the desktop, and multimedia device applying the same

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001236464A (ja) * 2000-02-25 2001-08-31 Ricoh Co Ltd Character extraction method, character extraction device, and storage medium
US6891551B2 (en) * 2000-11-10 2005-05-10 Microsoft Corporation Selection handles in editing electronic documents
US20040055007A1 (en) * 2002-09-13 2004-03-18 David Allport Point-based system and method for interacting with electronic program guide grid
JP4387242B2 (ja) * 2004-05-10 2009-12-16 Bandai Namco Games Inc. Program, information storage medium, and game device
US7877685B2 (en) * 2005-12-29 2011-01-25 Sap Ag Persistent adjustable text selector
US7936341B2 (en) * 2007-05-30 2011-05-03 Microsoft Corporation Recognizing selection regions from multiple simultaneous inputs
US8423914B2 (en) * 2007-06-08 2013-04-16 Apple Inc. Selection user interface
US8650507B2 (en) * 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
TWI365397B (en) * 2008-03-17 2012-06-01 Acer Inc Multi-object direction touch selection method and device, electronic device, computer accessible recording media and computer program product
JP2010039606A (ja) * 2008-08-01 2010-02-18 Hitachi Ltd Information management system, information management server, and information management method
US20100235734A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
JP5428436B2 (ja) * 2009-03-25 2014-02-26 Sony Corp Electronic device, display control method, and program
US8793611B2 (en) * 2010-01-06 2014-07-29 Apple Inc. Device, method, and graphical user interface for manipulating selectable user interface objects
US8786559B2 (en) * 2010-01-06 2014-07-22 Apple Inc. Device, method, and graphical user interface for manipulating tables using multi-contact gestures
US20130169669A1 (en) * 2011-12-30 2013-07-04 Research In Motion Limited Methods And Apparatus For Presenting A Position Indication For A Selected Item In A List

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6734883B1 (en) * 2000-05-25 2004-05-11 International Business Machines Corporation Spinlist graphical user interface control with preview and postview
US20060224947A1 (en) * 2005-03-31 2006-10-05 Microsoft Corporation Scrollable and re-sizeable formula bar
KR100672605B1 (ko) * 2006-03-30 2007-01-24 LG Electronics Inc. Item selection method and terminal therefor
EP1840717A1 (en) 2006-03-30 2007-10-03 LG Electronics Inc. Terminal and method for selecting displayed items
KR100774927B1 (ko) * 2006-09-27 2007-11-09 LG Electronics Inc. Mobile communication terminal, and menu and item selection method
KR20090085470A (ko) * 2008-02-04 2009-08-07 Samsung Electronics Co., Ltd. Method for providing a touch UI sensing plural touch types on an item or the desktop, and multimedia device applying the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2807543A4

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015179582A1 (en) * 2014-05-23 2015-11-26 Microsoft Technology Licensing, Llc Group selection initiated from a single item
CN106415626A (zh) * 2014-05-23 2017-02-15 Microsoft Technology Licensing, LLC Group selection initiated from a single item
US10409453B2 (en) 2014-05-23 2019-09-10 Microsoft Technology Licensing, Llc Group selection initiated from a single item
CN106415626B (zh) * 2014-05-23 2020-04-21 Microsoft Technology Licensing, LLC Group selection initiated from a single item

Also Published As

Publication number Publication date
US20130191785A1 (en) 2013-07-25
KR20140114392A (ko) 2014-09-26
EP2807543A4 (en) 2015-09-09
JP2015512078A (ja) 2015-04-23
EP2807543A1 (en) 2014-12-03
CN104067211A (zh) 2014-09-24

Similar Documents

Publication Publication Date Title
US20130191785A1 (en) Confident item selection using direct manipulation
JP6165154B2 (ja) Adjusting content to avoid occlusion by a virtual input panel
US10705707B2 (en) User interface for editing a value in place
US10324592B2 (en) Slicer elements for filtering tabular data
US8990686B2 (en) Visual navigation of documents by object
US20130191781A1 (en) Displaying and interacting with touch contextual user interface
US20130191779A1 (en) Display of user interface elements based on touch or hardware input
US20130111333A1 (en) Scaling objects while maintaining object structure
HK1181158A (en) Adjusting content to avoid occlusion by a virtual input panel
HK1180058A (en) User interface for editing a value in place
HK1181163B (en) Visual navigation of documents by object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13741294

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2013741294

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20147020497

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2014554744

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE