US20150338939A1 - Ink Modes - Google Patents
- Publication number
- US20150338939A1 (application US 14/665,330)
- Authority
- US
- United States
- Prior art keywords
- ink
- selection
- content
- pen
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04807—Pen manipulated menu
Definitions
- a particular device may receive input from a user via a keyboard, a mouse, voice input, touch input (e.g., to a touchscreen), and so forth.
- a touch instrument e.g., a pen, a stylus, a finger, and so forth
- the freehand input may be converted to a corresponding visual representation on a display, such as for taking notes, for creating and editing an electronic document, and so forth.
- Many current techniques for digital ink, however, provide only limited ink functionality.
- Techniques for ink modes are described. For instance, implementations support ink for selection, ink for commanding, ink for recognition, and so forth.
- a visual affordance of a particular active ink mode is presented on a document with which a user is interacting. For instance, the visual affordance is presented in response to detecting a proximity of a pen to an input surface such as a touch display.
- different ink modes each are associated with different respective visual affordances.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ techniques discussed herein in accordance with one or more embodiments.
- FIG. 2 depicts an example implementation scenario for a permanent ink mode in accordance with one or more embodiments.
- FIG. 3 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 4 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 5 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 6 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 7 depicts an example implementation scenario for multiple transient ink layers in accordance with one or more embodiments.
- FIG. 8 depicts an example implementation scenario for presenting an inking menu in accordance with one or more embodiments.
- FIG. 9 is a flow diagram that describes steps in a method for processing ink according to a current ink mode in accordance with one or more embodiments.
- FIG. 10 is a flow diagram that describes steps in a method for a transient ink timer in accordance with one or more embodiments.
- FIG. 11 is a flow diagram that describes steps in a method for propagating transient ink to different transient ink layers for different users in accordance with one or more embodiments.
- FIG. 12 is a flow diagram that describes steps in a method for presenting an ink menu in accordance with one or more embodiments.
- FIG. 13 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 14 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 15 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 16 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 17 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 18 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 19 is a flow diagram that describes steps in a method for ink for selection in accordance with one or more embodiments.
- FIG. 20 depicts an example implementation scenario for ink notes in accordance with one or more embodiments.
- FIG. 21 depicts an example implementation scenario for ink notes in accordance with one or more embodiments.
- FIG. 22 is a flow diagram that describes steps in a method for generating an ink note in accordance with one or more embodiments.
- FIG. 23 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 24 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 25 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 26 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 27 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 28 is a flow diagram that describes steps in a method for ink for commanding in accordance with one or more embodiments.
- FIG. 29 is a flow diagram that describes steps in a method for ink for commanding in accordance with one or more embodiments.
- FIG. 30 depicts an example implementation scenario for ink for shape recognition in accordance with one or more embodiments.
- FIG. 31 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 32 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 33 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 34 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 35 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 36 is a flow diagram that describes steps in a method for ink for recognition in accordance with one or more embodiments.
- FIG. 37 is a flow diagram that describes steps in a method for ink for text recognition in accordance with one or more embodiments.
- FIG. 38 is a flow diagram that describes steps in a method for ink for character recognition in accordance with one or more embodiments.
- FIG. 39 illustrates an example system and computing device as described with reference to FIG. 1 , which are configured to implement embodiments of techniques described herein.
- “ink” refers to freehand input to a touch-sensing functionality such as a touchscreen, which input is interpreted as digital ink, referred to herein simply as “ink.”
- Ink may be provided in various ways, such as using a pen (e.g., an active pen, a passive pen, and so forth), a stylus, a finger, and so forth.
- implementations support ink for selection, ink for commanding, ink for recognition, and so forth.
- Ink for selection provides different ways for utilizing ink to select objects, such as visual objects on a display. For instance, different ink gestures applied via freehand input using a pen are converted into different selection shapes for selecting objects. According to various implementations, ink for selection reduces an amount of time and a number of user interactions required to select an object.
- Ink for commanding provides different ways for utilizing ink to specify various commands to be performed. For instance, different ink commands applied via freehand input using a pen are recognized and automatically executed. According to various implementations, ink for commanding reduces an amount of time and a number of user interactions required to enter and execute commands. For instance, a user may simply write a command using ink, and the command is automatically recognized and executed without requiring the user to locate and select a visual control or menu item for the command.
- Ink for recognition provides different ways for recognizing and converting characters provided via freehand ink. For instance, different ink characters applied via freehand input using a pen are recognized and converted into encoded versions of different shapes and text characters. The shapes and text characters, for instance, are added to a primary content layer of a document. According to various implementations, ink for recognition reduces an amount of time and a number of user interactions required to generate encoded characters, such as shapes, text, and so forth.
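The three ink behaviors described above can be pictured as a dispatcher that routes each freehand stroke to the handler for the currently active mode. The following sketch is an illustrative assumption, not the patent's API; the mode names, stroke representation, and stub handlers are all hypothetical.

```python
from enum import Enum, auto

class InkMode(Enum):
    SELECTION = auto()    # convert ink gestures into selection shapes
    COMMANDING = auto()   # recognize and automatically execute written commands
    RECOGNITION = auto()  # convert strokes into encoded shapes/text characters

def process_stroke(stroke, mode):
    """Route a freehand ink stroke to the handler for the active ink mode.

    `stroke` is a list of (x, y) points. The handlers below are stubs
    standing in for the selection, commanding, and recognition pipelines.
    """
    if mode is InkMode.SELECTION:
        # e.g., fit a bounding region around the gesture to select objects
        xs = [p[0] for p in stroke]
        ys = [p[1] for p in stroke]
        return ("select", (min(xs), min(ys), max(xs), max(ys)))
    if mode is InkMode.COMMANDING:
        # e.g., hand the stroke to a handwriting recognizer, then execute
        return ("command", "recognized-command")
    if mode is InkMode.RECOGNITION:
        # e.g., replace the stroke with an encoded character or shape in
        # the primary content layer
        return ("insert", "recognized-text")
    raise ValueError(f"unknown ink mode: {mode}")
```

In this framing, the time savings the text describes come from the dispatch happening implicitly: the user inks, and the active mode determines the action without a separate menu selection.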
- Example Environment is first described that is operable to employ techniques described herein.
- Example Implementation Scenarios and Procedures describes some example implementation scenarios and methods for ink modes in accordance with one or more embodiments.
- Example System and Device describes an example system and device that are operable to employ techniques discussed herein in accordance with one or more embodiments.
- FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques for ink modes discussed herein.
- Environment 100 includes a client device 102 which can be embodied as any suitable device such as, by way of example and not limitation, a smartphone, a tablet computer, a portable computer (e.g., a laptop), a desktop computer, a wearable device, and so forth.
- the client device 102 represents a smart appliance, such as an Internet of Things (“IoT”) device.
- the client device 102 may range from a system with significant processing power, to a lightweight device with minimal processing power.
- One of a variety of different examples of the client device 102 is shown and described below in FIG. 39 .
- the client device 102 includes a variety of different functionalities that enable various activities and tasks to be performed.
- the client device 102 includes an operating system 104 , applications 106 , and a communication module 108 .
- the operating system 104 is representative of functionality for abstracting various system components of the client device 102 , such as hardware, kernel-level modules and services, and so forth.
- the operating system 104 can abstract various components (e.g., hardware, software, and firmware) of the client device 102 to the applications 106 to enable interaction between the components and the applications 106 .
- the applications 106 represent functionalities for performing different tasks via the client device 102 .
- Examples of the applications 106 include a word processing application, a spreadsheet application, a web browser, a gaming application, and so forth.
- the applications 106 may be installed locally on the client device 102 to be executed via a local runtime environment, and/or may represent portals to remote functionality, such as cloud-based services, web apps, and so forth.
- the applications 106 may take a variety of forms, such as locally-executed code, portals to remotely hosted services, and so forth.
- the communication module 108 is representative of functionality for enabling the client device 102 to communicate over wired and/or wireless connections.
- the communication module 108 represents hardware and logic for communication via a variety of different wired and/or wireless technologies and protocols.
- the client device 102 further includes a display device 110 , input mechanisms 112 including a digitizer 114 and touch input devices 116 , and an ink module 118 .
- the display device 110 generally represents functionality for visual output for the client device 102 . Additionally, the display device 110 represents functionality for receiving various types of input, such as touch input, pen input, and so forth.
- the input mechanisms 112 generally represent different functionalities for receiving input to the computing device 102 . Examples of the input mechanisms 112 include gesture-sensitive sensors and devices (e.g., such as touch-based sensors and movement-tracking sensors (e.g., camera-based)), a mouse, a keyboard, a stylus, a touch pad, accelerometers, a microphone with accompanying voice recognition software, and so forth.
- the input mechanisms 112 may be separate or integral with the display device 110 ; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors.
- the digitizer 114 represents functionality for converting various types of input to the display device 110 and the touch input devices 116 into digital data that can be used by the computing device 102 in various ways, such as for generating digital ink.
- the ink module 118 represents functionality for performing various aspects of techniques for ink modes discussed herein. Various functionalities of the ink module 118 are discussed below.
- the ink module 118 includes a transient layer application programming interface (API) 120 and a permanent layer API 122 .
- the transient layer API 120 represents functionality for enabling interaction with a transient ink layer
- the permanent layer API 122 represents functionality for enabling ink interaction with a permanent object (e.g., document) layer.
- the transient layer API 120 and the permanent layer API 122 may be utilized (e.g., by the applications 106 ) to access transient ink functionality and permanent ink functionality, respectively.
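The division of labor between the transient layer API 120 and the permanent layer API 122 can be sketched as a document holding two separate stroke stores. The class and method names below are illustrative assumptions for the sake of the example, not the patent's actual interfaces.

```python
class InkDocument:
    """Toy model of a document with a permanent content layer and a
    transient ink layer, mirroring the split between the permanent
    layer API and the transient layer API."""

    def __init__(self):
        self.permanent_layer = []  # ink that becomes part of the document
        self.transient_layer = []  # temporary ink, e.g., annotations

    # analogous to functionality exposed via the permanent layer API
    def add_permanent_ink(self, stroke):
        self.permanent_layer.append(stroke)

    # analogous to functionality exposed via the transient layer API
    def add_transient_ink(self, stroke):
        self.transient_layer.append(stroke)

    def expire_transient_ink(self):
        """Called when the transient ink timer elapses: transient strokes
        are removed from view but retained with the document, as in the
        transient ink layer scenario described below."""
        saved = list(self.transient_layer)
        self.transient_layer.clear()
        return saved
```

An application (e.g., one of the applications 106) would pick the layer per stroke, so the same pen input can behave as permanent content in one context and as a temporary annotation in another.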
- the environment 100 further includes a pen 124 , which is representative of an input device for providing input to the display device 110 .
- the pen 124 is in a form factor of a traditional pen but includes functionality for interacting with the display device 110 and other functionality of the client device 102 .
- the pen 124 is an active pen that includes electronic components for interacting with the client device 102 .
- the pen 124 for instance, includes a battery that can provide power to internal components of the pen 124 .
- the pen 124 may include a magnet or other functionality that supports hover detection over the display device 110 .
- the pen 124 may be passive, e.g., a stylus without internal electronics.
- the pen 124 is representative of an input device that can provide input that can be differentiated from other types of input by the client device 102 .
- the digitizer 114 is configured to differentiate between input provided via the pen 124 , and input provided by a different input mechanism such as a user's finger, a stylus, and so forth.
- the pen 124 includes a pen mode button 126 , which represents a selectable control (e.g., a switch) for switching the pen 124 between different pen input modes.
- different pen input modes enable input from the pen 124 to be utilized and/or interpreted by the ink module 118 in different ways. Examples of different pen input modes are detailed below.
- ink can be applied in different ink modes including a transient ink mode and a permanent ink mode.
- transient ink refers to ink that is temporary and that can be used for various purposes, such as invoking particular actions, annotating a document, and so forth.
- ink can be used for annotation layers for electronic documents, temporary visual emphasis, text recognition, invoking various commands and functionalities, and so forth.
- Permanent ink generally refers to implementations where ink becomes a part of the underlying object, such as for creating a document, writing on a document (e.g., for annotation and/or editing), applying ink to graphics, and so forth.
- Permanent ink for example, can be considered as a graphics object, such as for note taking, for creating visual content, and so forth.
- a pen (e.g., the pen 124 ) applies ink whenever the pen is in contact with an input surface, such as the display device 110 and/or other input surface.
- a pen can apply ink across many different applications, platforms, and services.
- an application and/or service can specify how ink is used in relation to an underlying object, such as a word processing document, a spreadsheet, and so forth. For instance, in some scenarios ink is applied as transient ink, and in other scenarios ink is applied as permanent ink. Examples of different implementations and attributes of transient ink and permanent ink are detailed below.
- the implementation scenarios and procedures may be implemented in the environment 100 described above, the system 3900 of FIG. 39 , and/or any other suitable environment.
- the implementation scenarios and procedures describe example operations of the client device 102 and the ink module 118 . While the implementation scenarios and procedures are discussed with reference to a particular application, it is to be appreciated that techniques for ink modes discussed herein are applicable across a variety of different applications, services, and environments. In at least some embodiments, steps described for the various procedures are implemented automatically and independent of user interaction.
- FIG. 2 depicts an example implementation scenario 200 for a permanent ink mode in accordance with one or more implementations.
- the upper portion of the scenario 200 includes a graphical user interface (GUI) 202 displayed on the display 110 .
- the GUI 202 represents a GUI for a particular functionality, such as an instance of the applications 106 .
- Displayed within the GUI 202 is a document 204 , e.g., an electronic document generated via one of the applications 106 . Further depicted is a user holding the pen 124 .
- the user brings the pen 124 in proximity to the surface of the display 110 and within the GUI 202 .
- the pen 124 for instance, is placed within a particular distance of the display 110 (e.g., less than 2 centimeters) but not in contact with the display 110 . This behavior is generally referred to herein as “hovering” the pen 124 .
- a hover target 206 is displayed within the GUI 202 and at a point within the GUI 202 that is directly beneath the tip of the pen 124 .
- the hover target 206 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to the document 204 .
- the visual appearance (e.g., shape, color, shading, and so forth) of the hover target 206 provides a visual cue indicating a current ink mode that is active.
- the hover target is presented as a solid circle, which indicates that a permanent ink mode is active.
- the ink will become part of the document 204 , e.g., will be added to a primary content layer of the document 204 .
- ink applied in a permanent ink mode represents a permanent ink layer that is added to a primary content layer of the document 204 .
- an ink flag 208 is visually presented adjacent to and/or at least partially overlaying a portion of the document 204 .
- the ink flag 208 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to the document 204 .
- the ink flag 208 may be presented additionally or alternatively to the hover target 206 .
- the ink flag 208 includes a visual cue indicating a current ink mode that is active.
- the ink flag 208 includes a solid circle, which indicates that a permanent ink mode is active.
- the ink flag 208 is selectable to cause an ink menu to be displayed that includes various ink-related functionalities, options, and settings that can be applied.
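The visual-affordance behavior in this scenario (a glyph beneath the pen tip whose appearance signals the active ink mode, shown only while the pen hovers within detection range) can be sketched as follows. The function name, glyph strings, and 2-centimeter threshold default are illustrative; the 2 cm figure comes from the hover example in the text.

```python
# Map each ink mode to its hover-target glyph, as in the scenarios:
# a solid circle for permanent ink, a hollow circle for transient ink.
AFFORDANCES = {
    "permanent": "solid-circle",
    "transient": "hollow-circle",
}

def hover_affordance(active_mode, pen_distance_cm, threshold_cm=2.0):
    """Return the glyph to draw beneath the pen tip, or None when the
    pen is not hovering (i.e., farther than the detection threshold)."""
    if pen_distance_cm >= threshold_cm:
        return None  # pen not detected: hide the hover target
    return AFFORDANCES[active_mode]
```

The same mapping could drive the ink flag's cue, so the hover target and the flag always agree about which mode is active.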
- FIG. 3 depicts an example implementation scenario 300 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 300 includes a graphical user interface (GUI) 302 displayed on the display 110 .
- the GUI 302 represents a GUI for a particular functionality, such as an instance of the applications 106 .
- Displayed within the GUI 302 is a document 304 , e.g., an electronic document generated via one of the applications 106 .
- the document 304 includes primary content 306 , which represents content generated as part of a primary content layer for the document 304 .
- the document 304 is a text-based document, and thus the primary content 306 includes text that is populated to the document.
- Various other types of documents and primary content may be employed, such as for graphics, multimedia, web content, and so forth.
- a user is hovering the pen 124 within a certain proximity of the surface of the display 110 , such as discussed above with reference to the scenario 200 .
- a hover target 308 is displayed within the document 304 and beneath the tip of the pen.
- the hover target 308 is presented as a hollow circle, thus indicating that a transient ink mode is active. For instance, if the user proceeds to apply ink to the document 304 , the ink will behave according to a transient ink mode. Examples of different transient ink behaviors are detailed elsewhere herein.
- an ink flag 310 is presented.
- the ink flag 310 includes a hollow circle 312 , thus providing a visual cue that a transient ink mode is active.
- the user removes the pen 124 from proximity to the display 110 .
- the hover target 308 and the ink flag 310 are removed from the display 110 .
- a hover target and/or an ink flag are presented when the pen 124 is detected as being hovered over the display 110 , and are removed from the display 110 when the pen 124 is removed such that the pen 124 is no longer detected as being hovered over the display 110 .
- an ink flag may be persistently displayed to indicate that inking functionality is active and/or available.
- FIG. 4 depicts an example implementation scenario 400 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 400 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110 .
- the scenario 400 represents an extension of the scenario 300 , above.
- a user applies ink content 402 to the document 304 using the pen 124 .
- the ink content 402 corresponds to an annotation of the document 304 .
- a variety of different types of transient ink other than annotations may be employed.
- a hover target is not displayed. For instance, in at least some implementations when the pen 124 transitions from a hover position to contact with the display 110 , a hover target is removed.
- the ink flag 310 includes a hollow circle 312 , indicating that the ink content 402 is applied according to a transient ink mode.
- an ink timer 406 begins running.
- the ink timer 406 begins counting down from a specific time value, such as 30 seconds, 60 seconds, and so forth.
- the ink timer is representative of functionality to implement a countdown function, such as for tracking time between user interactions with the display 110 via the pen 124 .
- the ink timer 406 represents a functionality of the ink module 118 .
- the hollow circle 312 begins to unwind, e.g., begins to disappear from the ink flag 310 .
- the hollow circle 312 unwinds at a rate that corresponds to the countdown of the ink timer 406 . For instance, when the ink timer 406 is elapsed by 50%, then 50% of the hollow circle 312 is removed from the ink flag 310 .
- unwinding of the hollow circle 312 provides a visual cue that the ink timer 406 is elapsing, and how much of the ink timer has elapsed and/or remains to be elapsed.
- if the ink timer 406 is elapsing as in the lower portion of the scenario 400 and the user proceeds to place the pen 124 in proximity to the display 110 (e.g., hovered over or in contact with the display 110), the ink timer 406 resets and does not begin elapsing again until the user removes the pen 124 from the display 110 such that the pen 124 is no longer detected. In such implementations, the hollow circle 312 is restored within the ink flag 310 as in the upper portion of the scenario 400.
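The countdown-and-reset behavior described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the class name, the `time.monotonic` clock, and the 30-second default are assumptions (the description mentions 30 or 60 seconds only as examples):

```python
import time

class TransientInkTimer:
    """Countdown that starts when the pen leaves the display and resets
    whenever the pen is detected again (hovering or in contact)."""

    def __init__(self, duration=30.0):
        self.duration = duration
        self.started_at = None  # None means the timer is not running

    def start(self, now=None):
        # Pen removed from proximity: begin counting down.
        self.started_at = now if now is not None else time.monotonic()

    def reset(self):
        # Pen returned to proximity: stop elapsing; the full circle is restored.
        self.started_at = None

    def fraction_elapsed(self, now=None):
        """0.0..1.0 -- drives how much of the hollow circle is 'unwound'."""
        if self.started_at is None:
            return 0.0
        now = now if now is not None else time.monotonic()
        return min(1.0, (now - self.started_at) / self.duration)

    def expired(self, now=None):
        return self.fraction_elapsed(now) >= 1.0
```

At 50% elapsed, `fraction_elapsed` returns 0.5, matching the scenario in which half of the hollow circle 312 has been removed from the ink flag 310.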
- FIG. 5 depicts an example implementation scenario 500 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 500 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110.
- the scenario 500 represents an extension of the scenario 400 , above.
- the ink timer 406 has elapsed. For instance, notice that the hollow circle 312 has completely unwound within the ink flag 310 , e.g., is visually removed from the ink flag 310 . According to various implementations, this provides a visual cue that the ink timer 406 has completely elapsed.
- the ink content 402 is removed from the GUI 302 and saved as part of a transient ink layer 504 for the document 304 .
- the ink flag 310 is populated with a user icon 502 .
- the user icon 502 represents a user that is currently logged in to the computing device 102 , and/or a user that is interacting with the document 304 .
- the pen 124 includes user identification data that is detected by the computing device 102 and thus is leveraged to track which user is interacting with the document 304 .
- the pen 124 includes a tagging mechanism (e.g., a radio-frequency identifier (RFID) chip) embedded with a user identity for a particular user.
- the tagging mechanism is detected by the computing device 102 and utilized to attribute ink input and/or other types of input to a particular user.
- the term “user” may be used to refer to an identity for an individual person, and/or an identity for a discrete group of users that are grouped under a single user identity.
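The tag-based attribution described above can be sketched as a simple registry mapping a pen's tag identifier (e.g., an RFID value) to a user identity. The class and method names are hypothetical illustrations, not part of the patent text:

```python
class UserAttribution:
    """Attribute ink input to a user identity based on a tag reported by
    the pen's tagging mechanism. 'User' may be an individual or a group
    identity, per the description."""

    def __init__(self):
        self._tag_to_user = {}  # tag id -> user identity

    def register(self, tag_id, user_identity):
        self._tag_to_user[tag_id] = user_identity

    def attribute(self, tag_id, ink_stroke):
        # Unknown tags fall back to an unattributed placeholder.
        user = self._tag_to_user.get(tag_id, "unknown")
        return {"user": user, "stroke": ink_stroke}
```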
- population of the user icon 502 to the ink flag 310 represents a visual indication that the transient ink layer 504 exists for the document 304 , and that the transient ink layer 504 is associated with (e.g., was generated by) a particular user.
- the transient ink layer 504 represents a data layer that is not part of the primary content layer of the document 304 , but that is persisted and can be referenced for various purposes. Further attributes of transient ink layers are described elsewhere herein.
- FIG. 6 depicts an example implementation scenario 600 for a transient ink mode in accordance with one or more implementations.
- the upper portion of the scenario 600 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110 .
- the scenario 600 represents an extension of the scenario 500 , above.
- the ink flag 310 is displayed indicating that a transient ink layer (e.g., the transient ink layer 504 ) exists for the document 304 , and that the transient ink layer is linked to a particular user represented by the user icon 502 in the ink flag 310 .
- a user selects the ink flag 310 with the pen 124 , which causes the ink content 402 to be returned to display as part of the document 304 .
- the ink content 402, for example, is bound to the transient ink layer 504, along with other transient ink content generated for the transient ink layer 504.
- the transient ink layer 504 is accessible by various techniques, such as by selection of the ink flag 310 .
- transient ink content of the transient ink layer 504 is bound (e.g., anchored) to particular portions (e.g., pages, lines, text, and so forth) of the document 304 .
- the user generated the ink content 402 adjacent to a particular section of text.
- the ink content 402 is displayed adjacent to the particular section of text.
- the transient ink layer 504 is cumulative such that a user may add ink content to and remove ink content from the transient ink layer 504 over a span of time and during multiple different interactivity sessions.
- the transient ink layer 504 generally represents a record of multiple user interactions with the document 304 , such as for annotations, proofreading, commenting, and so forth.
- multiple transient layers may be created for the document 304 , such as when significant changes are made to the primary content 306 , when other users apply transient ink to the document 304 , and so forth.
- the ink timer 406 begins elapsing such as discussed above with reference to the scenarios 400 , 500 . Accordingly, the scenario 600 may return to the scenario 400 .
- FIG. 7 depicts an example implementation scenario 700 for multiple transient ink layers in accordance with one or more implementations.
- the upper portion of the scenario 700 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110.
- the scenario 700 represents an extension of the scenario 600 , above.
- Displayed as part of the GUI 302 is the ink flag 310 with the user icon 502, along with an ink flag 702 with a user icon 704, and an ink flag 706 with a user icon 708.
- the ink flags 702, 706 represent other users that have interacted with the document 304.
- the user icons 704 , 708 represent users associated with their respective ink flags.
- each individual ink flag represents a different respective user.
- the scenario 700 further includes the transient ink layer 504 associated with the ink flag 310 , along with a transient ink layer 710 linked to the ink flag 702 , and a transient ink layer 712 linked to the ink flag 706 .
- the transient ink layers 710, 712 represent individual transient ink layers that are bound to individual user identities. Each individual transient ink layer 504, 710, 712 is individually accessible and can be viewed and edited separately.
- multiple ones of the transient ink layers 504, 710, 712 can be invoked such that ink content for the multiple layers is displayed concurrently as part of the document 304.
- Example ways of invoking a transient ink layer are detailed elsewhere herein. Further, transient ink layer behaviors discussed elsewhere herein are applicable to the scenario 700 .
- FIG. 8 depicts an example implementation scenario 800 for presenting an inking menu in accordance with one or more implementations.
- the upper portion of the scenario 800 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110 .
- the scenario 800 represents an extension of the scenarios discussed above.
- the user selects the ink flag 310 while the transient layer 504 is active. For instance, the user manipulates the pen 124 to first tap the ink flag 310 , which invokes the transient ink layer 504 such that the ink content 402 is retrieved and displayed, and then taps the ink flag 310 a second time within a particular period of time, e.g., 3 seconds.
- the ink flag 310 expands to present an ink menu 802 .
- the ink menu 802 includes multiple selectable indicia that are selectable to cause different ink-related actions to be performed, such as to apply and/or change various settings, invoke various functionalities, and so forth.
- an expanded representation 802 a of the ink menu 802 is depicted. The example visual indicia included in the ink menu 802 are now discussed in turn.
- Play Control 804: when ink content (e.g., transient and/or permanent ink) is applied to a document, application of the ink content is recorded in real-time. For instance, application of ink content is recorded as an animation that shows the ink content being applied to a document as it was initially applied by a user. Accordingly, selection of the play control 804 causes a playback of ink content as it was originally applied. Further details concerning ink playback are presented below.
- Transient Ink Control 806: selection of this control causes a transition from a different ink mode (e.g., a permanent ink mode) to a transient ink mode.
- Permanent Ink Control 808: selection of this control causes a transition from a different ink mode (e.g., a transient ink mode) to a permanent ink mode.
- Text Recognition Control 810: selection of this control causes a transition to a text recognition mode. For instance, in a text recognition mode, characters applied using ink are converted into machine-encoded text.
- Shape Recognition Control 812: selection of this control causes a transition to a shape recognition mode. For instance, in a shape recognition mode, shapes applied using ink are converted into machine-encoded shapes, such as quadrilaterals, triangles, circles, and so forth.
- Selection Mode Control 814: selection of this control causes a transition to a selection mode.
- in a selection mode, input from a pen is interpreted as a selection action, such as to select text and/or other objects displayed in a document.
- Erase Mode Control 816: selection of this control causes a transition to an erase mode.
- in an erase mode, input from a pen is interpreted as an erase action, such as to erase ink, text, and/or other objects displayed in a document.
- Command Control 818: selection of this control causes a transition to a command mode. For instance, in a command mode, input from a pen is interpreted as a command to perform a particular action and/or task.
- Color Control 820: selection of this control enables a user to change an ink color that is applied to a document. For example, selection of this control causes a color menu to be presented that includes multiple different selectable colors. Selection of a color from the color menu specifies the color for ink content that is applied to a document.
- Ink Note Control 822: this control is selectable to invoke ink note functionality, such as to enable ink content to be propagated to a note. Ink note functionality is described in more detail below.
- Emphasis Control 824: selection of this control causes a transition from a different ink mode (e.g., a permanent or transient ink mode) to an emphasis ink mode.
- in an emphasis ink mode, ink is temporary and fades and disappears after a period of time.
- Emphasis ink, for example, is not saved as part of primary content or a transient ink layer, but is used for temporary purposes, such as for visually identifying content, emphasizing content, and so forth.
- Pin Control 826: this control is selectable to pin a transient ink layer to a document, and to unpin the transient ink layer from the document. For instance, selecting the pin control 826 causes transient ink of a transient ink layer to be persistently displayed as part of a document. With reference to the scenario 500, for example, selection of the pin control 826 prevents the ink timer 406 from being initiated when a user removes the pen 124 from proximity to the display 110.
- the pin control 826 is also selectable to unpin a transient ink layer from a document. For instance, with reference to the scenario 500 , selection of the pin control 826 unpins a transient ink layer such that the ink timer 406 begins to elapse when a user removes the pen 124 from proximity to the display 110 .
- when the user selects the ink flag 310 to cause the ink menu 802 to be presented, the pin control 826 is presented within the ink menu 802 in the same region of the display in which the user icon 502 is displayed in the ink flag 310.
- a user can double-tap on the same spot over the ink flag 310 to cause the ink menu to be presented, and to pin and unpin transient ink from the document 304.
- the visuals presented for the individual controls represent hover targets that are displayed when the respective modes are active.
- the example implementation scenarios above depict examples of hover targets for transient and permanent ink modes, and similar scenarios apply for other ink modes and the visuals displayed for their respective controls in the ink menu 802 .
- providing input outside of the ink menu 802 causes the ink menu 802 to collapse. For instance, if the user taps the pen 124 in the GUI 302 outside of the ink menu 802 , the ink menu 802 collapses such that the ink flag 310 is again displayed.
- FIG. 9 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for processing ink according to a current ink mode in accordance with one or more embodiments.
- Step 900 detects a pen in proximity to an input surface.
- the touch input device 116 detects that the pen 124 is hovered and/or in contact with the touch input device 116 .
- a hover operation can be associated with a particular threshold proximity to an input surface such that hovering the pen 124 at or within the threshold proximity to the input surface is interpreted as a hover operation, but placing the pen 124 farther than the threshold proximity from the input surface is not interpreted as a hover operation.
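The threshold-proximity rule can be sketched as a one-line predicate. The 10 mm threshold is an assumed value for illustration; the description leaves the actual threshold unspecified:

```python
def is_hover(pen_height_mm, threshold_mm=10.0):
    """Interpret pen proximity as a hover only at or within the threshold
    distance from the input surface. A height of zero is contact, not hover;
    beyond the threshold, the pen is not treated as hovering."""
    return 0.0 < pen_height_mm <= threshold_mm
```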
- Step 902 ascertains a current ink mode.
- the ink module 118, for example, ascertains an ink mode that is currently active on the computing device 102.
- Examples of different ink modes are detailed elsewhere herein, and include a permanent ink mode, a transient ink mode, a text recognition mode, a shape recognition mode, a selection mode, an erase mode, a command mode, and so forth.
- a current ink mode may be automatically selected by the ink module 118 , such as based on an application and/or document context that is currently in focus. For instance, an application 106 may specify a default ink mode that is to be active for the application. Further, some applications may specify ink mode permissions that indicate allowed and disallowed ink modes. A particular application 106 , for example, may specify that a permanent ink mode is not allowed for documents presented by the application, such as to protect documents from being edited.
- a current ink mode is user-selectable, such as in response to user input selecting an ink mode from the ink menu 802 .
- a user may cause a switch from a default ink mode for an application to a different ink mode.
- Step 904 causes a visual affordance identifying the current ink mode to be displayed.
- Examples of such an affordance include a hover target, a visual included as part of an ink flag and/or ink-related menu, and so forth. Examples of different visual affordances are detailed throughout this description and the accompanying drawings.
- Step 906 processes ink content applied to the input surface according to the current ink mode.
- the ink content, for instance, is processed as permanent ink, transient ink, and so forth.
- if a permanent ink mode is active, the ink content is saved as permanent ink, such as part of a primary content layer of a document.
- if the transient ink mode is active, the ink content is propagated to a transient ink layer of a document. Examples of different mode-specific ink behaviors and actions are detailed elsewhere herein.
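Steps 902-906 amount to a dispatch on the current ink mode. A minimal sketch under assumed names; the mode strings, the document dictionary layout, and the stub recognizer are illustrative, not the patent's implementation:

```python
def recognize_text(ink_content):
    # Placeholder standing in for converting ink characters
    # to machine-encoded text.
    return f"<text from {len(ink_content)} strokes>"

def process_ink(ink_content, document, current_mode):
    """Process ink content applied to the input surface according to the
    currently active ink mode (step 906)."""
    if current_mode == "permanent":
        # Permanent ink becomes part of the primary content layer.
        document["primary_content"].append(ink_content)
    elif current_mode == "transient":
        # Transient ink is propagated to a separate transient ink layer.
        document.setdefault("transient_layer", []).append(ink_content)
    elif current_mode == "text_recognition":
        document["primary_content"].append(recognize_text(ink_content))
    else:
        raise ValueError(f"unsupported ink mode: {current_mode}")
    return document
```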
- FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for a transient ink timer in accordance with one or more implementations.
- the method represents an extension of the method described above with reference to FIG. 9 .
- Step 1000 receives ink content applied to a document via input from a pen to an input surface while in a transient ink mode.
- the ink module 118 processes ink content received from the pen 124 to the display 110 as transient ink.
- Step 1002 detects that the pen is removed from proximity to the input surface. For instance, the touch input device 116 detects that the pen 124 is not in contact with and is not hovering over a surface of the touch input device 116 , e.g., the display 110 .
- Step 1004 initiates a timer.
- the timer, for example, is initiated in response to detecting that the pen is removed from proximity to the input surface.
- a visual representation of the timer is presented.
- the visual representation provides a visual cue that the timer is elapsing, and indicates a relative amount (e.g., percentage) of the timer that has elapsed.
- the visual representation, for example, is animated to visually convey that the timer is elapsing.
- a visual representation of a timer is discussed above with reference to FIGS. 4 and 5 .
- Step 1006 ascertains whether the pen is detected at the input surface before the timer expires. For instance, the ink module 118 ascertains whether the pen 124 is detected in contact with and/or hovering over the touch input device 116 prior to expiry of the timer. If the pen is detected at the input surface prior to expiry of the timer (“Yes”), step 1008 resets the timer and the process returns to step 1000.
- if the pen is not detected at the input surface prior to expiry of the timer (“No”), step 1010 removes the ink content from the document and propagates the ink content to a transient layer for the document. For instance, in response to expiry of the timer, the transient ink content is removed from display and propagated to a transient data layer for the document that is separate from a primary content layer of the document.
- a new transient ink layer is created for the document, and the transient ink content is propagated to the new transient ink layer.
- the transient ink content is propagated to an existing transient ink layer.
- the transient ink layer may represent an accumulation of transient ink provided by a user over multiple different interactions with the document and over a period of time.
- the transient ink layer may be associated with a particular user, e.g., a user that applies the transient ink content to the document.
- the transient ink is linked to the particular user and may subsequently be accessed by the user.
- FIG. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for propagating transient ink to different transient ink layers for different users in accordance with one or more implementations.
- the method represents an extension of the methods described above with reference to FIGS. 9 and 10 .
- Step 1100 receives transient ink content to a document from multiple different users.
- the transient ink content, for instance, is received during different interactivity sessions with the document that are each associated with a different user.
- the document, for example, is shared among different users, such as part of a group collaboration on the document.
- Step 1102 propagates transient ink content from each user to a different respective transient ink layer for the document.
- a different transient ink layer, for example, is generated for each user, and transient ink content applied by each user is propagated to a respective transient ink layer for each user.
- Step 1104 causes visual affordances of the different transient ink layers to be displayed.
- Each transient ink layer, for example, is represented by a visual affordance that visually identifies the transient ink layer and a user linked to the transient ink layer. Examples of such affordances are discussed above with reference to ink flags.
- Step 1106 enables each transient ink layer to be individually accessible.
- the ink module 118 enables each transient ink layer to be accessed (e.g., displayed) separately from the other transient ink layers.
- a transient ink layer is accessible by selecting a visual affordance that represents the transient ink layer. Further, multiple transient ink layers may be accessed concurrently, such as by selecting visual affordances that identify the transient ink layers.
- a transient ink layer may be accessible in various other ways and separately from a document to which the transient ink layer is bound. For instance, a transient ink layer may be printed, shared (e.g., emailed) separately from a document for which the transient ink layer is created, published to a website, and so forth.
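The per-user layer bookkeeping of steps 1100-1106 can be sketched as follows; the class and method names are illustrative assumptions:

```python
class TransientLayers:
    """Each user's transient ink accumulates in a separate, individually
    accessible layer (one visual affordance, e.g., an ink flag, per layer)."""

    def __init__(self):
        self._layers = {}  # user identity -> list of ink content

    def propagate(self, user, ink_content):
        # Step 1102: propagate each user's ink to that user's own layer.
        self._layers.setdefault(user, []).append(ink_content)

    def layer_for(self, user):
        # Step 1106: each layer is individually accessible.
        return self._layers.get(user, [])

    def users(self):
        # Step 1104: one affordance per user layer.
        return sorted(self._layers)
```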
- FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for presenting an ink menu in accordance with one or more implementations.
- the method represents an extension of the methods described above.
- Step 1200 detects an action to invoke an ink menu.
- the ink module 118 detects that a user requests an ink menu. For instance, as described in the scenario 800 , a user may select a visual control (e.g., the ink flag 310 ) to request an ink menu.
- an ink menu can be automatically invoked in response to various events, such as the pen 124 being detected in proximity to the display surface, an ink-related application and/or service being launched, an application and/or service querying a user to select an ink mode, and so forth.
- Step 1202 causes the ink menu to be presented.
- the ink module 118 causes the ink menu 802 to be displayed.
- a default set of functionalities is associated by the ink module 118 with the ink menu 802. Further, different applications may modify the default set of functionalities, such as by adding a functionality to the default set of functionalities, removing a functionality from the default set of functionalities, and so forth.
- the ink menu 802 is populated with a set of functionalities (e.g., selectable controls) based on a customized set of functionalities specified by and/or for an application that is currently in focus.
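The default-plus-overrides menu population can be sketched as below. The control keys and the override format ({"add": [...], "remove": [...]}) are assumptions for illustration, not the patent's API:

```python
# Default control set associated with the ink menu (keys mirror the
# controls described for the ink menu 802).
DEFAULT_INK_MENU = ["play", "transient", "permanent", "text_recognition",
                    "shape_recognition", "selection", "erase", "command",
                    "color", "ink_note", "emphasis", "pin"]

def menu_for_app(app_overrides=None):
    """Build the ink menu from the default control set, letting the
    application currently in focus add or remove controls."""
    controls = list(DEFAULT_INK_MENU)
    overrides = app_overrides or {}
    for control in overrides.get("remove", []):
        if control in controls:
            controls.remove(control)
    controls.extend(overrides.get("add", []))
    return controls
```

An application that disallows permanent ink (e.g., to protect documents from being edited) would pass `{"remove": ["permanent"]}`.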
- Step 1204 receives user input to the ink menu.
- the ink module 118 detects that a user manipulates the pen 124 to select a control displayed as part of the ink menu 802 .
- Step 1206 performs an action in response to the user input.
- the ink module 118, for instance, causes an action to be performed based on which control is selected by the user. Examples of such actions include changing an ink mode, initiating ink playback, applying different ink formatting, and so forth.
- Ink for selection provides ways of selecting and processing content using ink.
- FIG. 13 depicts an example implementation scenario 1300 for ink for selection in accordance with one or more implementations.
- the scenario 1300, for example, represents a continuation of the scenarios described above.
- the upper portion of the scenario 1300 includes a GUI 1302 with a document 1304 and the ink menu 802 (introduced above) displayed on the display 110 .
- the document 1304 represents a web page. It is to be appreciated, however, that techniques discussed herein may utilize a wide variety of other types of documents and content.
- a user manipulates the pen 124 to apply a selection gesture 1306 within the document 1304 , e.g., to a surface of the display 110 within the document 1304 .
- the selection gesture 1306 is applied in an ink selection mode.
- a selection mode may be activated in various ways, such as in response to selection of the selection mode control 814 , in response to selection of the pen mode button 126 , and so forth. For instance, pressing and/or holding the pen mode button 126 activates a selection mode that causes gestures (e.g., ink gestures) to be interpreted according to the ink selection mode.
- applying the selection gesture 1306 causes corresponding ink to be applied along the path of the selection gesture such that a visual indication of the selection gesture 1306 is displayed. Alternatively, no visual indication of the selection gesture 1306 is displayed.
- a selection shape 1308 is automatically generated within the document 1304 .
- the ink module 118 detects that the selection gesture 1306 is applied in the ink selection mode, and in response causes the selection shape 1308 to be automatically generated.
- the shape and size of the selection shape 1308 is based on attributes of the selection gesture 1306 .
- the selection gesture 1306 is a straight line that is horizontal relative to the visual orientation of the document 1304 .
- the selection shape 1308 is drawn as a square, with the size of the square being based on a length of the selection gesture 1306 .
- the selection shape 1308 starts at a start point 1310 of the selection gesture 1306 and expands outwardly from the start point 1310 as the selection gesture 1306 increases in length.
- the start point 1310 represents a center of the selection shape 1308 .
- the length of the selection gesture 1306 represents one-half the length of a side of the selection shape 1308 , and this size relationship is maintained as the selection gesture 1306 changes in length.
- the selection gesture 1306 increases in length, and thus the selection shape 1308 increases in size.
- the selection shape 1308 increases in size to encompass content 1312 displayed as part of the document 1304 .
- the content 1312 represents a web page object.
- the scenario 1300 then proceeds to a scenario 1400 .
- FIG. 14 depicts an example implementation scenario 1400 for ink for selection in accordance with one or more implementations.
- the scenario 1400, for example, represents a continuation of the scenario 1300 described above.
- the upper portion of the scenario 1400 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110 .
- a selection release event is detected. For instance, the user lifts the pen 124 such that the pen 124 is not detected in proximity to the surface of the display 110 . Alternatively or additionally, the user selects or releases the pen mode button 126 . Accordingly, and proceeding to the lower portion of the scenario 1400 , the selection shape 1308 is converted into a selection 1402 of content within the selection shape 1308 . In this particular example, the selection 1402 represents a selection of the content 1312 .
- the ink module 118 detects the selection release event and in response, automatically converts the selection shape 1308 into the selection 1402 .
- the content 1312 may be copied, pasted, saved, shared, populated to an ink note (discussed below), and so forth. Examples of different actions that can be performed utilizing selected content are detailed throughout this disclosure.
- FIG. 15 depicts an example implementation scenario 1500 for ink for selection in accordance with one or more implementations.
- the scenario 1500, for example, represents a continuation and/or variation of the scenarios described above.
- the upper portion of the scenario 1500 includes the GUI 1302 with the document 1304 displayed on the display 110 .
- further displayed is an ink flag 1502 that includes a plus sign (“+”) indicating that an ink selection mode is currently active.
- the plus sign, for instance, corresponds to a visual presented for the selection mode control 814, discussed above with reference to the ink menu 802.
- the ink flag 1502 provides a visual affordance indicating that the ink selection mode is active.
- the plus sign would be displayed on the display 110 as a hover target beneath the tip of the pen 124 .
- a user manipulates the pen 124 to apply a selection gesture 1504 within the document 1304 .
- the selection gesture 1504, for instance, is applied in an ink selection mode.
- a selection mode may be activated in various ways, examples of which are discussed elsewhere herein.
- applying the selection gesture 1504 causes corresponding ink to be applied along the path of the selection gesture such that a visual indication of the selection gesture 1504 is displayed. Alternatively, no visual indication of the selection gesture 1504 is displayed.
- a selection shape 1506 is automatically generated within the document 1304 .
- the ink module 118 detects that the selection gesture 1504 is applied in the ink selection mode, and in response causes the selection shape 1506 to be automatically generated.
- the shape and size of the selection shape 1506 is based on attributes of the selection gesture 1504 .
- the selection gesture 1504 is a straight line that is diagonal relative to the visual orientation of the document 1304 .
- the selection shape 1506 is drawn as a rectangle, with the size of the rectangle being based on a length of the selection gesture 1504 .
- the selection shape 1506 starts at a start point 1508 of the selection gesture 1504 and expands outwardly from the start point 1508 as the selection gesture 1504 increases in length.
- the start point 1508 represents a corner of the selection shape 1506 .
- the length of the selection gesture 1504 represents a diagonal of the selection shape 1506 (e.g., a diagonal of a rectangle), and this size relationship is maintained as the selection gesture 1504 changes in length.
- the selection gesture 1504 increases in length, and thus the selection shape 1506 increases in size. As illustrated, the selection shape 1506 increases in size to encompass the content 1312 and content 1510 displayed as part of the document 1304 .
- the scenario 1500 then proceeds to a scenario 1600 .
- FIG. 16 depicts an example implementation scenario 1600 for ink for selection in accordance with one or more implementations.
- the scenario 1600, for example, represents a continuation of the scenario 1500 described above.
- the upper portion of the scenario 1600 includes the GUI 1302 with the document 1304 displayed on the display 110 .
- a selection release event is detected. For instance, the user lifts the pen 124 such that the pen 124 is not detected in proximity to the surface of the display 110 . Alternatively or additionally, the user selects or releases the pen mode button 126 . Accordingly, and proceeding to the lower portion of the scenario 1600 , the selection shape 1506 is converted into a selection 1602 of content within the selection shape 1506 .
- the selection 1602 represents a selection of the content 1312 and the content 1510 .
- the ink module 118 detects the selection release event and in response, automatically converts the selection shape 1506 into the selection 1602 .
- various actions may be performed utilizing the selection 1602 .
- the content 1312 , 1510 may be copied, pasted, saved, shared, populated to an ink note (discussed below), and so forth. Examples of different actions that can be performed utilizing selected content are detailed throughout this disclosure.
- a user applies a vertical selection gesture (e.g., relative to the display 110 ) in the document 1304 while in an ink selection mode.
- a selection shape may be drawn as a circle.
- the center of the circle corresponds to a start point of the vertical selection gesture, and the length of the vertical selection gesture corresponds to a radius of the circle.
- the scenarios 1300 - 1600 illustrate that techniques discussed herein enable selection shapes to be drawn based on attributes of a selection gesture. For instance, selection gestures applied as lines in different directions (e.g., horizontal, diagonal, vertical, and so forth) cause different respective types of selection shapes to be drawn.
- the examples discussed herein are provided for purpose of illustration only, and it is to be appreciated that implementations discussed herein cover a wide variety of other selection shapes and relationships between selection gestures and selection shapes.
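The direction-to-shape mapping illustrated in the scenarios 1300-1600 can be sketched as follows. The 20- and 70-degree classification tolerances and the function name are assumptions, since the description does not specify how gesture direction is classified:

```python
import math

def selection_shape(start, end):
    """Map a linear selection gesture to a selection shape:
    horizontal -> square (start point is the center, gesture length is
    half a side); diagonal -> rectangle (gesture is the diagonal, start
    point is a corner); vertical -> circle (start point is the center,
    gesture length is the radius)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(abs(dy), abs(dx)))
    if angle <= 20:   # roughly horizontal
        return {"shape": "square", "center": start, "side": 2 * length}
    if angle >= 70:   # roughly vertical
        return {"shape": "circle", "center": start, "radius": length}
    # Otherwise treat the gesture as the diagonal of a rectangle.
    return {"shape": "rectangle", "corner": start,
            "width": abs(dx), "height": abs(dy)}
```

Because the shape is recomputed from the current gesture endpoints, the stated size relationships are maintained as the gesture changes in length.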
- FIG. 17 depicts an example implementation scenario 1700 for ink for selection in accordance with one or more implementations.
- the scenario 1700, for example, represents a continuation and/or variation of the scenarios described above.
- the upper portion of the scenario 1700 includes the GUI 1302 with the document 1304 displayed on the display 110. Further included is the ink flag 1502 indicating that an ink selection mode is currently active.
- a user manipulates the pen 124 to apply a selection gesture 1702 within the document 1304 .
- the selection gesture 1702, for instance, is applied in an ink selection mode.
- a selection mode may be activated in various ways, examples of which are discussed elsewhere herein. Notice that in this particular example, the selection gesture 1702 is non-linear, i.e., is not a straight line. According to various implementations, a non-linear selection gesture is associated with different selection behaviors than a linear selection gesture, such as those described above.
- applying the selection gesture 1702 causes corresponding ink to be applied along the path of the selection gesture such that a visual indication of the selection gesture 1702 is displayed. Alternatively, no visual indication of the selection gesture 1702 is displayed.
- a selection release event is detected. For instance, the user lifts the pen 124 such that the pen 124 is not detected in proximity to the surface of the display 110 . Alternatively or additionally, the user selects or releases the pen mode button 126 . Accordingly, in response to the selection release event, the ink module 118 causes an auto-complete line 1704 to be drawn between a start point 1706 and an end point 1708 of the selection gesture 1702 . The auto-complete line 1704 closes the selection gesture 1702 to generate a closed shape around the content 1312 .
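A minimal sketch of the auto-complete behavior, assuming strokes are lists of (x, y) points (the function name is hypothetical):

```python
def close_selection_gesture(stroke):
    """Close an open selection stroke by appending an auto-complete
    segment from the stroke's end point back to its start point,
    yielding a closed shape usable for hit-testing content."""
    if len(stroke) < 3:
        raise ValueError("need at least three points to form a closed shape")
    if stroke[0] != stroke[-1]:
        # The appended vertex represents the auto-complete line.
        stroke = stroke + [stroke[0]]
    return stroke
```

Whether the closing segment is actually rendered or only tracked as underlying selection data is an implementation choice, as the surrounding text notes.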
- while the selection gesture 1702 and the auto-complete line 1704 are depicted as being displayed on the display 110 , it is to be appreciated that in at least some implementations, the selection gesture 1702 and the auto-complete line 1704 are not displayed but represent depictions of underlying selection data tracked by the ink module 118 .
- the scenario 1700 then proceeds to a scenario 1800 .
- FIG. 18 depicts an example implementation scenario 1800 for ink for selection in accordance with one or more implementations.
- the scenario 1800, for example, represents a continuation of the scenario 1700 described above.
- the upper portion of the scenario 1800 includes the GUI 1302 with the document 1304 displayed on the display 110 .
- further included is the closed selection gesture 1702 , including the auto-complete line 1704 , around the content 1312 , such as depicted in the lower portion of the scenario 1700 .
- the closed selection gesture 1702 is converted into a selection 1802 of the content 1312 .
- the ink module 118 detects the release of the selection gesture 1702 and in response, automatically generates the auto-complete line 1704 and converts the closed selection gesture 1702 into the selection 1802 independent of user input.
- the content 1312 may be copied, pasted, saved, shared, populated to an ink note (discussed below), and so forth. Examples of different actions that can be performed utilizing selected content are detailed throughout this disclosure.
- the example scenarios presented above illustrate different selection gestures representing open gestures, such as a line, an open curve, and so forth. Further, the open gestures are automatically converted to corresponding closed selection shapes, such as rectangles, circles, closed curves, and so forth.
- FIG. 19 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for ink for selection in accordance with one or more implementations.
- the method represents an extension of the methods described above.
- Step 1900 detects pen input drawing a selection gesture.
- the ink module 118 detects that a user applies a selection gesture to an input surface while in an ink selection mode.
- a visual representation of the selection gesture is presented using ink.
- no visual representation of the selection gesture is presented.
- the selection gesture corresponds to an open gesture, such as a straight line, an open curve, and so forth. For instance, a start point and an end point of the selection gesture do not coincide, and the selection gesture does not intersect itself.
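The open-gesture test described here (start and end points do not coincide, and the stroke does not intersect itself) might be sketched as follows; the strict-inequality intersection check deliberately ignores collinear touching cases for brevity:

```python
def _segments_intersect(p1, p2, p3, p4):
    """Proper-crossing test for segments p1-p2 and p3-p4 (2D points)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def is_open_gesture(points):
    """True when a stroke qualifies as an open gesture: its endpoints
    do not coincide and no two non-adjacent segments cross."""
    if points[0] == points[-1]:
        return False
    segs = list(zip(points, points[1:]))
    for i in range(len(segs)):
        for j in range(i + 2, len(segs)):
            if _segments_intersect(*segs[i], *segs[j]):
                return False
    return True
```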
- Step 1902 generates a selection shape based on a direction of the selection gesture.
- the selection shape corresponds to a closed shape, such as a square, a rectangle, a closed curve (e.g., a circle), an irregular closed shape, and so forth.
- the shape of the selection shape depends on an orientation in which the selection gesture is applied relative to a document in which the selection gesture is applied.
- the size of the selection shape is determined based on the length of the selection gesture. For instance, the size of the selection shape increases with an increase in length of the selection gesture.
- generating a selection shape includes generating an auto-complete line between opposite ends of the selection gesture. For instance, if the selection gesture is an open curve (e.g., an arc), an auto-complete line is automatically drawn between opposite ends of the open curve to generate a closed curve.
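Step 1902's direction-dependent shape choice could be sketched as a simple angle classification; the tolerance value and the shape associations noted in the comments are illustrative assumptions:

```python
import math

def classify_gesture_direction(start, end, tolerance_deg=20):
    """Classify a linear selection gesture so a corresponding
    selection shape can be generated. The 20-degree tolerance is a
    hypothetical choice, not taken from the disclosure."""
    angle = math.degrees(
        math.atan2(abs(end[1] - start[1]), abs(end[0] - start[0]))
    )
    if angle <= tolerance_deg:
        return "horizontal"  # e.g., rectangle spanning a line of content
    if angle >= 90 - tolerance_deg:
        return "vertical"    # e.g., circle centered at the start point
    return "diagonal"        # e.g., rectangle with the gesture as its diagonal
```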
- Step 1904 ascertains a release event for the selection gesture.
- the release event, for instance, corresponds to a user removing the pen from proximity to the display surface.
- the release event represents the user releasing the pen mode button 126 .
- Step 1906 causes an automatic selection of content within the selection shape responsive to the release event.
- the ink module 118 detects the release event and automatically converts the selection shape into a selection of content encompassed by the selection shape.
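Step 1906's conversion of a selection shape into a selection of the content it encompasses amounts to a hit test. A ray-casting sketch, assuming content items carry (x, y) anchor positions (the item representation is hypothetical):

```python
def point_in_shape(point, polygon):
    """Ray-casting test: True when `point` lies inside the closed
    selection shape given as a list of (x, y) vertices."""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def select_content(items, polygon):
    """Return the content items whose anchors fall inside the shape."""
    return [item for item, pos in items if point_in_shape(pos, polygon)]
```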
- Step 1908 causes an action to be performed utilizing the selected content.
- the ink module 118 causes the selected content to be copied, pasted, shared, populated to an ink note, and so forth.
- the ink module 118 causes the selected content to be propagated to another functionality, such as an application 106 . Examples of other actions that can be applied to selected content are detailed elsewhere herein.
- ink notes provide ways of preserving ink as notes that can be saved, shared, and accessed in various ways.
- FIG. 20 depicts an example implementation scenario 2000 for ink notes in accordance with one or more implementations.
- the scenario 2000, for example, represents a continuation of the scenarios described above.
- the upper portion of the scenario 2000 includes the GUI 302 with the document 304 and the ink menu 802 (introduced above) displayed on the display 110 .
- a user selects the ink note control 822 . For instance, the user taps the ink note control 822 , and/or drags the ink note control 822 from the ink menu 802 into the body of the document 304 .
- an ink note 2002 is presented in the GUI 302 .
- the ink note 2002 represents an electronic canvas on which notes can be applied using ink.
- a user then applies ink content 2004 to the ink note 2002 .
- the scenario 2000 occurs while the GUI 302 is in a transient ink mode. Accordingly, ink content applied to the document 304 itself will behave according to the transient ink mode. However, ink content applied within the ink note 2002 behaves according to an ink note mode. Thus, the ink note 2002 represents a separate inking environment from the document 304 , and thus different behaviors apply to the ink content 2004 than to ink content within the document 304 .
- the ink note 2002 includes a save control 2006 and a share control 2008 .
- selecting the save control 2006 causes the ink content 2004 to be saved to a particular location, such as a pre-specified data storage location.
- a single selection of the save control 2006 causes the ink content 2004 to be saved and the ink note 2002 to be removed from display such that a user may return to interacting with the document 304 .
- the share control 2008 is selectable to share the ink content 2004 , such as with another user. For instance, selecting the share control 2008 causes the ink content 2004 to be automatically propagated to a message, such as the body of an email message, an instant message, a text message, and so forth. A user may then address and send the message to one or more users. Alternatively or additionally, selecting the share control 2008 may cause the ink content 2004 to be posted to a web-based venue, such as a social networking service, a blog, a website, and so forth. According to various implementations, functionality of the share control 2008 is user configurable such that a user may specify behaviors caused by selection of the share control 2008 .
- FIG. 21 depicts an example implementation scenario 2100 for ink notes in accordance with one or more implementations.
- the scenario 2100, for example, represents a continuation and/or variation of the scenarios described above.
- the upper portion of the scenario 2100 includes the GUI 302 with the document 304 and the ink menu 802 (introduced above) displayed on the display 110 .
- a user applies ink content 2102 to the document 304 and then applies a selection action 2104 to the ink content 2102 .
- the selection action 2104 is implemented as inking a closed loop around the ink content 2102 .
- Other selection actions may be utilized, however, such as techniques for ink for selection described above.
- the user selects the ink note control 822 , such as by tapping on the ink note control 822 and/or dragging the ink note control 822 out of the ink menu 802 .
- an ink note 2106 is automatically generated and populated with the ink content 2102 .
- the ink module 118 detects that the ink content 2102 is selected via the selection action 2104 , and thus populates the ink content 2102 to the ink note 2106 .
- the selection action 2104 followed by the selection of the ink note control 822 is interpreted as a command to generate the ink note 2106 and populate the ink note 2106 with the ink content 2102 .
- the ink content is moved (e.g., cut and paste) from the body of the document 304 into the ink note 2106 .
- the ink content is copied into the ink note 2106 such that the ink content 2102 remains in the body of the document 304 .
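The copy-versus-move population of an ink note could be sketched as follows; the dictionary note structure and function name are assumptions for illustration:

```python
def populate_ink_note(selected_content, mode="copy"):
    """Generate an ink note and populate it with the selected content
    in a single operation. mode="copy" leaves the content in the
    document body; mode="move" cuts it (cut and paste)."""
    note = {"content": list(selected_content), "saved": False, "shared": False}
    remaining = [] if mode == "move" else list(selected_content)
    return note, remaining
```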
- the user may then save the ink note 2106 by selecting the save control 2006 , and may share the ink content 2102 by selecting the share control 2008 .
- Example attributes and actions of the save control 2006 and the share control 2008 are described above.
- while the scenario 2100 is discussed with reference to populating the ink content 2102 to the ink note 2106 , it is to be appreciated that a wide variety of other content may be populated to the ink note 2106 .
- a user may select a portion of the primary content 306 from the document 304 (e.g., text content), and a subsequent selection of the ink note control 822 would cause the ink note 2106 to be generated and populated with the selected primary content.
- a combination of ink content and primary content can be selected, and a subsequent selection of the ink note control would cause the ink note 2106 to be generated and populated with both the selected ink content and primary content.
- selection of content to be populated to an ink note is performed utilizing techniques for ink for selection described above.
- populating selected content to an ink note as described in this section represents an action that can be performed utilizing selected content, as described above with reference to step 1908 of FIG. 19 .
- ink note functionality is invocable via selection of the ink note control 822 .
- ink note functionality is invocable in other ways, such as in response to a dragging gesture from anywhere within the ink menu 802 into the body of the document 304 , a custom gesture applied anywhere within the GUI 302 , a gesture involving a combination of finger touch input and pen input, a voice command, a touchless gesture, and so forth.
- the scenario 2100 illustrates that techniques discussed herein reduce a number of user interactions required to propagate content to a note (e.g., an ink note), since a user may simply select existing content and invoke an ink note functionality to populate the existing content to an ink note.
- FIG. 22 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for generating an ink note in accordance with one or more implementations.
- the method represents an extension of the methods described above.
- Step 2200 detects a user selection of content.
- the ink module 118, for example, ascertains that a user selects a portion of ink content, primary content, combinations thereof, and so forth.
- content may be selected via techniques for ink for selection, described above.
- Step 2202 ascertains that an ink note functionality is invoked.
- Various ways of invoking ink note functionality are detailed above.
- Step 2204 populates the selected content to an ink note.
- the ink module 118 generates an ink note and populates (e.g., copies or moves) the selected content to the ink note.
- the ink note is generated and the selected content is populated to the ink note automatically and in response to a single user invocation of ink note functionality, e.g., a single user action.
- Step 2206 performs an action in relation to the ink note in response to user input. For instance, a user provides input that causes the ink note to be saved, to be shared, to be deleted, and so forth. Examples of user input include user selection of a selectable control, a user applying an ink and/or touch gesture, input via an input mechanism 112 , and so forth.
- ink for commanding provides ways of causing various commands to be performed in response to ink input.
- FIG. 23 depicts an example implementation scenario 2300 for ink for commanding in accordance with one or more implementations.
- the scenario 2300, for example, represents a continuation of the scenarios described above.
- the upper portion of the scenario 2300 includes the GUI 1302 with the document 1304 and the ink menu 802 (introduced above) displayed on the display 110 .
- the content 1312 is selected as a selection 2302 , such as using techniques for ink for selection described above.
- the pen 124 is hovered above the surface of the display 110 , and a hover target 2304 is displayed beneath the tip of the pen 124 .
- the hover target 2304 includes the visual icon presented for the command control 818 of the ink menu 802 , thus providing a visual affordance that a command mode is currently active.
- the command mode can be activated in various ways, such as in response to a user selection of the command control 818 , a user selection of the pen mode button 126 , and so forth.
- a user applies ink within the selection 2302 to write a command 2306 .
- the command 2306 includes the term “Reminder,” which is interpreted in the command mode as a command to generate a reminder based on the content 1312 included in the selection 2302 .
- the scenario 2300 then proceeds to a scenario 2400 .
- FIG. 24 depicts an example implementation scenario 2400 for ink for commanding in accordance with one or more implementations.
- the scenario 2400, for example, represents a continuation of the scenario 2300 described above.
- the upper portion of the scenario 2400 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110 . Further included is the content 1312 selected via the selection 2302 , and the command 2306 inked within the selection 2302 .
- the command 2306 is recognized (e.g., by the ink module 118 ) as a command to generate a reminder based on information from the selected content 1312 .
- a release event is detected, examples of which are discussed above.
- data from the content 1312 is propagated to a calendar 2402 to generate a calendar event 2404 .
- the calendar 2402, for instance, represents a GUI for a calendar application that represents an instance of the applications 106 .
- the content 1312 presents information about an upcoming event.
- information about the upcoming event is ascertained from the content 1312 , and utilized to generate the calendar event 2404 .
- the ink module 118 recognizes that characters displayed as part of the content 1312 are particular words and phrases that have a particular meaning (e.g., date, time, location, etc.), and thus are used to populate relevant fields of the calendar event 2404 .
- metadata for the content 1312 is accessed to ascertain information about the content 1312 .
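Recognizing date, time, and location phrases in selected content might be approximated with simple patterns; the regular expressions below are illustrative stand-ins for the richer recognition the ink module 118 is described as performing, and the field names are hypothetical:

```python
import re

def extract_event_fields(text):
    """Pull date, time, and location phrases out of selected content
    so they can populate relevant fields of a calendar event."""
    date = re.search(
        r"\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*\.?\s+\d{1,2}\b",
        text,
    )
    time = re.search(r"\b\d{1,2}(?::\d{2})?\s*(?:am|pm)\b", text, re.IGNORECASE)
    place = re.search(r"\bat\s+([A-Z][\w' ]+)", text)
    return {
        "date": date.group(0) if date else None,
        "time": time.group(0) if time else None,
        "location": place.group(1).strip() if place else None,
    }
```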
- Adjacent to and/or overlaid on the calendar 2402 is an ink flag 2406 including a commanding icon.
- the ink flag 2406 with the commanding icon presents a visual affordance that a commanding mode is active such that ink input will be interpreted according to the commanding mode.
- the calendar event 2404 is generated automatically and in response to the user selecting the content 1312 and writing the command 2306 . For instance, no further user input is required after writing the command 2306 for the calendar event 2404 to be generated. In response to the command 2306 being written, for example, the calendar 2402 is automatically launched and the calendar event 2404 is generated independent of user input.
- FIG. 25 depicts an example implementation scenario 2500 for ink for commanding in accordance with one or more implementations.
- the scenario 2500, for example, represents a continuation and/or variation of the scenarios described above.
- the upper portion of the scenario 2500 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110 .
- the content 1312 is selected as a selection 2502 , such as using techniques for ink for selection described above. Further, a user applies ink within the selection 2502 to enter a command 2504 while in a command mode.
- the command 2504 includes the phrase “email to John Smith.”
- an email message 2506 is generated and populated with information from the content 1312 .
- the ink module 118 parses the command 2504 and recognizes that the term “email” represents a command to generate an email message with selected content of the content 1312 .
- the ink module 118 further recognizes that the term “to John Smith” represents a recipient of the email.
- the email message 2506 is generated by an email application that represents an instance of the applications 106 . For instance, an email address for John Smith is included in contact information for the user, and is thus retrieved by the email application and used to address the email message.
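The parsing of inked commands such as "email to John Smith" or "Reminder" could be sketched as below; the returned action names and the search fallback (per the scenarios that follow) are hypothetical modeling choices:

```python
import re

def parse_ink_command(command_text):
    """Interpret an inked command phrase: 'email to <name>' yields an
    email action with a recipient, 'reminder' yields a calendar
    action, and anything else is treated as search terms."""
    text = command_text.strip().lower()
    match = re.match(r"email\s+to\s+(.+)", text)
    if match:
        return {"action": "email", "recipient": match.group(1).title()}
    if text.startswith("reminder"):
        return {"action": "calendar_event"}
    return {"action": "search", "terms": command_text.strip()}
```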
- the content 1312 presents information about an upcoming event.
- information about the upcoming event is ascertained from the content 1312 , and utilized to populate the email message 2506 .
- Example ways of ascertaining information from the content 1312 are discussed above.
- the email message 2506 is automatically generated, populated with information from the content 1312 , and sent to the recipient without any further user input after entering the command 2504 .
- the email message 2506 is automatically generated and populated with information from the content 1312 .
- the user is then given the opportunity to view and edit the email message 2506 prior to sending the email message 2506 .
- while the scenarios 2300 - 2500 are discussed from the perspective of a selection occurring before a command being entered, it is to be appreciated that the temporal relationship between object selection and commanding may be arranged in various other ways. For instance, a user may first apply ink to write a command, and may subsequently select an object on which the command is to be performed. With reference to the scenario 2500 , for example, the user may first write the command 2504 and then subsequently select the content 1312 to cause the command 2504 to be performed as depicted in the scenario 2500 .
- further, it is not required that a command be inked within a selected object.
- the command 2504 may be written anywhere within the document 1304 and outside of the selection of the content 1312 .
- the ink module 118 would recognize that the content 1312 is selected and that the command 2504 is applied, and would cause the command 2504 to be performed utilizing the selected content 1312 .
- generally, a command and a selection may be linked regardless of the visual and/or temporal relationship between the command and the selection.
- FIG. 26 depicts an example implementation scenario 2600 for ink for commanding in accordance with one or more implementations.
- the scenario 2600, for example, represents a continuation and/or variation of the scenarios described above.
- the upper portion of the scenario 2600 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110 .
- Adjacent to and/or overlaid on the document 1304 is an ink flag 2602 including a commanding icon, indicating that a commanding mode is active.
- the user brings the pen 124 into proximity to a command region 2604 . The command region 2604 represents a pre-specified portion of the GUI 1302 that is associated with commanding mode functionality.
- in response, a command field 2606 is presented, which includes a prompt 2608 for user input.
- the prompt 2608, for instance, prompts the user to input a command into the command field 2606 .
- the user enters a command 2610 into the command field 2606 .
- the command 2610 includes an instruction to search for weather on a particular date.
- the scenario 2600 then proceeds to a scenario 2700 .
- FIG. 27 depicts an example implementation scenario 2700 for ink for commanding in accordance with one or more implementations.
- the scenario 2700, for example, represents a continuation of the scenario 2600 described above.
- the upper portion of the scenario 2700 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110 . Further included is the command field 2606 with the command 2610 .
- the user removes the pen 124 from proximity to the display 110 such that a release event is generated.
- the command 2610 is executed such that command results 2702 are presented on the display 110 .
- the command 2610, for instance, is submitted as a set of search terms to a web search engine, which performs a search using the search terms and returns the command results 2702 .
- the command results 2702 include weather information for the particular date that is retrieved and displayed on the display 110 .
- the document 1304 for instance, is replaced in the display 110 with the weather information.
- the command results 2702 are retrieved and displayed automatically and in response to the user entering the command 2610 and removing the pen 124 from proximity to the display 110 . For example, the command results 2702 are retrieved and displayed without any further user input after entering the command 2610 .
- the scenarios 2600 , 2700 illustrate that techniques can be employed to provide a designated region in which commands can be entered.
- commands can be entered in a natural language form that can be parsed (e.g., by the ink module 118 ), recognized, and performed by various functionalities.
- the ink module 118 can forward commands to appropriate applications 106 to be recognized and/or performed.
- the scenarios 2300 - 2700 illustrate that techniques discussed herein can be utilized to recognize various commands and to perform various actions based on the commands.
- the commands discussed in these scenarios are presented for purpose of example only, and it is to be appreciated that a wide variety of other commands not expressly discussed herein may be employed in accordance with techniques discussed herein.
- FIG. 28 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method describes an example procedure for ink for commanding in accordance with one or more implementations.
- the method describes an example way of performing one or more of the implementation scenarios described above.
- the method represents an extension of the methods described above.
- Step 2800 detects a selection of content.
- the ink module 118 detects that the content is selected via input from the pen 124 .
- the content is selected via techniques for ink for selection, examples of which are described above.
- Step 2802 ascertains input of a command via freehand input from a pen.
- the ink module 118 detects that a command is applied via freehand ink input from the pen 124 .
- Example ways of providing and detecting a command are described above.
- Step 2804 causes the command to be executed using the content.
- the ink module 118, for instance, causes an action to be performed utilizing the content and based on the command.
- the ink module 118 communicates the command and the content and/or attributes of the content to an application 106 to cause the application 106 to perform the command.
- Examples of different actions that can be performed based on a command are described above with reference to the scenarios 2300 - 2700 , such as generating a calendar event and/or an email based on content, performing a search (e.g., a web search) based on a command, and so forth.
- Examples of other actions that can be performed based on a command include sharing selected content to a website (e.g., a social networking site), propagating selected content to a content editing application for editing, saving selected content as a content file, and so forth.
- causing a command to be executed includes causing a visual output of the command to be displayed.
- FIG. 29 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method, for instance, describes an example procedure for ink for commanding in accordance with one or more implementations.
- the method describes an example way of performing one or more of the implementation scenarios described above.
- the method represents an extension of the methods described above.
- Step 2900 detects a pen in proximity to a designated command region of an input surface.
- the ink module 118 detects that the pen 124 is in proximity to a command region of the display 110 .
- the command region represents a pre-specified region of a display and/or a GUI that is associated with ink for commanding functionality.
- Step 2902 causes a command field to be presented.
- the command field generally represents a GUI region in which commands can be entered.
- the ink module 118 causes the command field to be displayed automatically and in response to detecting the pen 124 in proximity to the command region.
- the command field is presented in response to a hover of the pen 124 over the input surface and prior to the pen 124 touching the input surface.
- the command field may be presented independent of a user selection to invoke the command field.
- Step 2904 ascertains user input of a command to the command field.
- the ink module 118 detects a command term and/or phrase entered into the command field.
- natural language processing may be employed to parse a command and correlate command terminology to particular machine-based commands.
- Step 2906 causes the command to be performed.
- the ink module 118 performs one or more aspects of the command.
- the ink module 118 forwards the command to an application 106 to cause the application 106 to perform (e.g., execute) the command.
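Forwarding a parsed command to a functionality that performs it can be modeled as a handler registry; the registry shape, the parsed-command dictionary, and the handler signature are assumptions for illustration:

```python
def execute_command(parsed, handlers):
    """Dispatch a parsed command to a registered handler, mirroring how
    the ink module forwards commands to applications for execution."""
    handler = handlers.get(parsed["action"])
    if handler is None:
        raise KeyError(f"no handler registered for {parsed['action']!r}")
    return handler(parsed)
```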
- causing the command to be performed causes a visual display of command results.
- causing a command to be performed causes a reconfiguration of a device state (e.g., of the client device 102 ), such as to change device settings, to change application state, to mitigate errors and/or device malfunctioning, and so forth.
- a command is performed utilizing selected content, such as content selected via techniques for ink for selection described above.
- ink for recognition provides ways of converting ink input into machine-coded characters and shapes.
- FIG. 30 depicts an example implementation scenario 3000 for ink for shape recognition in accordance with one or more implementations.
- the scenario 3000 represents a continuation of the scenarios described above.
- the upper portion of the scenario 3000 includes a GUI 3002 with a document 3004 and the ink menu 802 (introduced above) displayed on the display 110 .
- Displayed within the document 3004 is primary content 3006 , which in this example includes a map of geographic regions.
- a user manipulates the pen 124 to apply ink to draw a freehand shape 3008 a and a freehand shape 3008 b , which in this example are circles. Further, the freehand shapes 3008 a , 3008 b are drawn while in a shape recognition mode. For instance, the user selects the shape recognition control 812 (e.g., before or after drawing the freehand shapes 3008 a , 3008 b ), which causes a transition to a shape recognition mode. Notice further that a hover target 3010 is displayed beneath the tip of the pen 124 , providing a visual affordance that the shape recognition mode is currently active.
- the freehand shapes 3008 a , 3008 b are recognized as circles (e.g., by the ink module 118 ), and thus the freehand shapes 3008 a , 3008 b are converted to respective machine-encoded (“encoded”) shapes 3012 a , 3012 b , i.e., encoded circles. Further, the encoded shapes 3012 a , 3012 b are added to primary content (i.e., a primary content layer) of the document 3004 . According to various implementations, the encoded shapes 3012 a , 3012 b may be edited and manipulated in various ways, such as shaded, filled, resized, moved, and so forth.
- the scenario 3000 illustrates that in a shape recognition mode, shapes drawn in freehand using ink input can be recognized and converted to corresponding machine-encoded shapes. While the scenario 3000 is discussed with reference to recognition of freehand circles, it is to be appreciated that a wide variety of other freehand shapes may be recognized and converted to machine-encoded shapes, such as triangles, rectangles, parallelograms, irregular shapes, and so forth.
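One simple way to recognize a freehand stroke as a circle, as in the scenario above, is to test how evenly the stroke's points sit around their centroid; the tolerance value is an illustrative assumption, and production recognizers are considerably richer:

```python
import math

def looks_like_circle(points, tolerance=0.2):
    """Decide whether a freehand stroke should be recognized as a
    circle: every point lies near a common distance from the centroid
    of the stroke's (x, y) points."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    deviation = max(abs(r - mean_r) for r in radii) / mean_r
    return deviation <= tolerance
```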
- FIG. 31 depicts an example implementation scenario 3100 for ink for text recognition in accordance with one or more implementations.
- the scenario 3100, for example, represents a continuation of the scenarios described above.
- the upper portion of the scenario 3100 includes a GUI 3102 and the ink menu 802 (introduced above) displayed on the display 110 .
- the GUI 3102 represents a GUI for an email application. Depicted within the GUI 3102 is an email inbox 3104 and an email message 3106 .
- a user taps the pen 124 within the body of the email message 3106 , which causes a text prompt 3108 to be displayed.
- the text prompt 3108 represents a visual affordance that a text recognition mode is active, and that text applied using ink will be recognized and positioned starting at the text prompt 3108 .
- the user activates a text recognition mode by selecting the text recognition control 810 from the ink menu 802 .
- the user begins writing freehand text 3110 with the pen 124 within the body of the email message 3106 but in a different region than where the text prompt 3108 is displayed.
- a text guide 3112 is presented as a straight line adjacent to the tip of the pen 124 .
- the text guide 3112 provides visual assistance to enable a user to orient the characters of the freehand text 3110 .
- proper orientation of freehand text increases the accuracy of ink to text recognition by creating a spacing construct that aids in recognition.
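The spacing construct mentioned above can be illustrated by grouping ink strokes into word clusters by the horizontal gap between them; representing strokes by their (min_x, max_x) extents and the gap threshold are both assumptions for this sketch:

```python
def group_strokes_into_words(strokes, gap_threshold):
    """Group stroke extents into word clusters: a horizontal gap wider
    than the threshold starts a new word, the kind of spacing cue a
    text guide helps a writer produce consistently."""
    words, current = [], []
    last_right = None
    for left, right in sorted(strokes):
        if last_right is not None and left - last_right > gap_threshold:
            words.append(current)
            current = []
        current.append((left, right))
        last_right = right if last_right is None else max(last_right, right)
    if current:
        words.append(current)
    return words
```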
- FIG. 32 depicts an example implementation scenario 3200 for ink for text recognition in accordance with one or more implementations.
- the scenario 3200, for example, represents a continuation of the scenario 3100 described above.
- the upper portion of the scenario 3200 includes the GUI 3102 (introduced above) displayed on the display 110 .
- portions of the freehand text 3110 are recognized (e.g., via OCR) and converted to machine-encoded (“encoded”) text 3202 .
- the encoded text 3202 is placed starting at the original position of the text prompt 3108 in the email message 3106 as depicted in the scenario 3100 . Further, the text prompt 3108 moves to indicate where newly recognized text will be presented.
- FIG. 33 depicts an example implementation scenario 3300 for ink for text recognition in accordance with one or more implementations.
- The scenario 3300, for example, represents a continuation of the scenarios 3100, 3200 described above.
- the scenario 3300 includes the GUI 3102 (introduced above) displayed on the display 110 .
- the freehand text 3110 , 3204 from the scenarios 3100 , 3200 is converted into the encoded text 3202 . Further, the user removes the pen 124 from proximity to the surface of the display 110 . Accordingly, the text guide 3112 presented in the scenarios 3100 , 3200 is removed from display. Should the user begin writing additional freehand text into the GUI 3102 , the text guide 3112 would be redisplayed and the additional freehand text would be recognized and appended to the encoded text 3202 .
- the scenarios 3100 - 3300 illustrate that ink may be applied for text recognition in any portion of a display.
- a user is not constrained to entering freehand text in a predefined region, but may simply enter the freehand text in any portion of a display and the freehand text will be converted to encoded text that is populated to a different region of the display. Accordingly, entry of freehand ink text is not constrained to a region in which a recognized and encoded version of the text will be displayed.
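The decoupling described above can be illustrated with a minimal, hypothetical sketch: the recognizer does not care where on screen the strokes were drawn, and the recognized text is always inserted at the text prompt's position, which then advances. The `TextPromptBuffer` class, the stroke format, and the stand-in recognizer are all illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch: ink entry decoupled from the display region.
# Strokes may be drawn anywhere; recognized text lands at the text
# prompt's position, mirroring scenarios 3100-3300.

class TextPromptBuffer:
    """Holds encoded text plus an insertion point (the 'text prompt')."""

    def __init__(self):
        self.text = ""
        self.prompt_index = 0  # where newly recognized text lands

    def insert_recognized(self, recognized: str) -> None:
        # Insert at the prompt, then advance the prompt past the new
        # text, as the moving prompt in scenario 3200 suggests.
        self.text = (self.text[:self.prompt_index]
                     + recognized
                     + self.text[self.prompt_index:])
        self.prompt_index += len(recognized)

def recognize_strokes(strokes):
    """Stand-in recognizer: each 'stroke' already carries its label."""
    return "".join(s["label"] for s in strokes)

buf = TextPromptBuffer()
# Stroke coordinates are irrelevant to where the text is placed.
strokes = [{"label": "H", "region": (400, 300)},
           {"label": "i", "region": (430, 305)}]
buf.insert_recognized(recognize_strokes(strokes))
print(buf.text)          # -> Hi
print(buf.prompt_index)  # -> 2
```

The key point of the sketch is that the stroke regions never influence `prompt_index`, so entry location and display location stay independent.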
- FIG. 34 depicts an example implementation scenario 3400 for ink for text recognition in accordance with one or more implementations.
- The scenario 3400, for example, represents a continuation of the scenarios described above.
- the scenario 3400 includes the GUI 3102 (introduced above) displayed on the display 110 .
- a user taps the pen 124 within the inbox 3104 , which causes a selection 3402 of the inbox 3104 . Further, notice that a hover target 3404 is presented as a letter “T,” indicating that a text recognition mode is currently active.
- the user enters freehand text 3406 within the selected inbox 3104 .
- the freehand text 3406 includes the letter “W.”
- a text guide 3408 is presented adjacent to the tip of the pen 124 .
- the scenario 3400 then proceeds to a scenario 3500 .
- FIG. 35 depicts an example implementation scenario 3500 for ink for text recognition in accordance with one or more implementations.
- The scenario 3500, for example, represents a continuation of the scenarios described above.
- the scenario 3500 includes the GUI 3102 (introduced above) displayed on the display 110 .
- the user removes the pen 124 from proximity to the display 110 .
- the freehand text 3406 is recognized as an encoded letter “W.”
- email messages listed in the inbox 3104 are processed (e.g., sorted, filtered, and so forth) based on the letter “W.”
- the email messages are rearranged in descending alphabetical order of sender (“from”) starting with the letter “W.”
- the scenarios 3400 , 3500 illustrate that freehand ink input to different regions of a display can be recognized as encoded text and utilized for different purposes, such as to invoke different functionalities.
- freehand ink input is recognized as shapes and/or text, and converted to encoded shapes and text that is populated to a primary content layer of a document.
- the scenarios 3400 , 3500 illustrate implementations where freehand ink input is recognized and utilized to perform an action on existing data, such as searching, filtering, sorting, and so forth.
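One way to picture the inbox behavior of scenarios 3400-3500 is a small sketch in which a single recognized character drives a sort rather than being inserted as text. The function name, message format, and the exact wrap-around ordering are assumptions for illustration; the source only states that messages are rearranged in descending alphabetical order of sender starting with the recognized letter.

```python
# Hypothetical sketch of scenario 3500: a recognized character acts as a
# sort command against the selected inbox. Senders at or below the
# letter are listed first (descending), and the remainder wrap after.

def sort_inbox_by_letter(messages, letter):
    key = letter.upper()
    at_or_below = [m for m in messages if m["from"][:1].upper() <= key]
    above = [m for m in messages if m["from"][:1].upper() > key]
    ordered = sorted(at_or_below, key=lambda m: m["from"], reverse=True)
    ordered += sorted(above, key=lambda m: m["from"], reverse=True)
    return ordered

inbox = [{"from": "Alice"}, {"from": "Zoe"},
         {"from": "Walt"}, {"from": "Vera"}]
print([m["from"] for m in sort_inbox_by_letter(inbox, "W")])
# -> ['Walt', 'Vera', 'Alice', 'Zoe']
```

Other processing types named in the source (searching, filtering) would slot in the same way: the recognized character parameterizes the operation applied to the selected content.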
- FIG. 36 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- The method, for instance, describes an example procedure for ink for recognition in accordance with one or more implementations.
- the method describes an example way of performing one or more of the implementation scenarios described above.
- the method represents an extension of the methods described above.
- Step 3600 detects a pen in proximity to an input surface while the system is in an ink recognition mode.
- the ink module 118 receives a notification that the pen 124 is in proximity to the surface of the display 110 , such as from the display 110 itself.
- Step 3602 causes a visual affordance identifying the ink recognition mode to be displayed responsive to said detecting.
- a hover target that identifies a shape recognition mode and/or a text recognition mode is displayed below and/or adjacent to the pen 124 .
- an icon that represents a shape recognition mode and/or a text recognition mode is displayed as part of an ink flag and/or an ink menu. Examples of different hover targets and icons are discussed above.
- the visual affordance is removed from display in response to detecting that the pen is removed from proximity to the input surface.
- Step 3604 processes ink content applied to the input surface according to the ink recognition mode. For instance, shapes applied using ink are converted to encoded shapes, and text applied using ink is converted to encoded text. Example ways of processing ink are discussed above and depicted in the accompanying figures.
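The three steps of FIG. 36 can be sketched as a small controller: pen proximity toggles the visual affordance, and ink applied while a mode is active is converted accordingly. The class name, mode strings, and the string-tagged "encoding" are illustrative placeholders, not the patented implementation.

```python
# Hypothetical sketch of the FIG. 36 flow: proximity -> affordance,
# then mode-dependent processing of applied ink.

class InkModeController:
    def __init__(self, mode="text"):
        self.mode = mode              # "text" or "shape" recognition mode
        self.affordance_visible = False

    def on_pen_proximity(self, in_range: bool) -> None:
        # Steps 3600/3602: show the affordance while the pen hovers,
        # remove it when the pen leaves proximity.
        self.affordance_visible = in_range

    def process_ink(self, ink_content: str) -> str:
        # Step 3604: convert applied ink according to the active mode.
        if self.mode == "text":
            return f"encoded-text({ink_content})"
        return f"encoded-shape({ink_content})"

ctl = InkModeController(mode="text")
ctl.on_pen_proximity(True)        # hover target / ink flag appears
result = ctl.process_ink("hello")
ctl.on_pen_proximity(False)       # affordance removed on pen lift
```

In a real system the proximity events would come from the digitizer (as the notification to the ink module 118 suggests), and the processing step would invoke a handwriting or shape recognizer.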
- FIG. 37 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- The method, for instance, describes an example procedure for ink for text recognition in accordance with one or more implementations.
- the method describes an example way of performing one or more of the implementation scenarios described above.
- the method represents an extension of the methods described above.
- Step 3700 receives freehand pen input to a region of a display.
- The ink module 118, for example, ascertains that freehand pen input is applied within a region of a display.
- the region of the display is not visually identified as a designated region for receiving pen input, e.g., is not visually distinguished from other regions of the display. For instance, a user may randomly select the region.
- Step 3702 converts the freehand pen input to encoded text.
- the freehand pen input includes characters that are recognized as particular text characters, and that are converted into encoded text characters.
- Step 3704 populates the encoded text to a different region of the display.
- The encoded text, for instance, is displayed in a different region of the display than the region in which the freehand pen input was provided.
- FIG. 38 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- The method, for instance, describes an example procedure for ink for character recognition in accordance with one or more implementations.
- the method describes an example way of performing one or more of the implementation scenarios described above.
- the method represents an extension of the methods described above.
- Step 3800 detects a user selection of content.
- the ink module 118 detects that a user selects a portion of content. With reference to the example scenarios above, the ink module 118 detects that a user selects the inbox 3104 .
- Step 3802 processes the content based on a character recognized from freehand pen input.
- freehand ink input is recognized as a portion of text, e.g., one or more text characters, and selected content is processed based on the portion of text.
- the freehand input may include non-textual characters (e.g., shapes) that are recognized as having particular meanings. For instance, a square may correspond to a particular type of processing, a circle to a different type of processing, a triangle to yet another different type of processing, and so forth.
- textual and non-textual characters may be recognized to perform associated processing. Examples of processing the content using the portion of text include searching, filtering, sorting, and so forth, using the portion of text.
- Different portions of content are associated with different types of processing based on text. For instance, consider the email scenarios described with reference to FIGS. 31-35. In these scenarios, freehand pen input of a text character within the inbox 3104 is interpreted (e.g., by the ink module 118 and/or an application 106) as a command to utilize the text character to perform a specific type of processing on messages stored in the inbox 3104, such as searching, filtering, sorting, and so forth. Further, freehand pen input within the email message 3106 is interpreted as a command to convert the freehand input into encoded text characters, and to populate the encoded text characters into the email message 3106.
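The region-dependent interpretation described above amounts to a dispatch on the input region: the same recognized character either becomes a processing command or becomes inserted text. The following is a minimal sketch under assumed names (`handle_recognized_character`, the region strings, and the filter-by-sender action are all illustrative).

```python
# Hypothetical sketch of region-dependent interpretation (FIGS. 31-35):
# the region that received the freehand input decides how a recognized
# character is used.

def handle_recognized_character(region, char, inbox, body):
    if region == "inbox":
        # Character acts as a processing command, e.g. filter by sender.
        return [m for m in inbox if m["from"].upper().startswith(char.upper())]
    if region == "message_body":
        # Character is converted to encoded text and inserted.
        body.append(char)
        return body
    raise ValueError(f"unknown region: {region}")

inbox = [{"from": "Walt"}, {"from": "Alice"}]
print(handle_recognized_character("inbox", "w", inbox, []))
# -> [{'from': 'Walt'}]
```

The dispatch could equally select sorting or searching for the inbox region; the structural point is that region, not character, picks the handler.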
- FIG. 39 illustrates an example system generally at 3900 that includes an example computing device 3902 that is representative of one or more computing systems and/or devices that may implement various techniques described herein.
- the client device 102 discussed above with reference to FIG. 1 can be embodied as the computing device 3902 .
- the computing device 3902 may be, for example, a server of a service provider, a device associated with the client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.
- the example computing device 3902 as illustrated includes a processing system 3904 , one or more computer-readable media 3906 , and one or more Input/Output (I/O) Interfaces 3908 that are communicatively coupled, one to another.
- the computing device 3902 may further include a system bus or other data and command transfer system that couples the various components, one to another.
- a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
- a variety of other examples are also contemplated, such as control and data lines.
- The processing system 3904 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 3904 is illustrated as including hardware elements 3910 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application-specific integrated circuit or other logic device formed using one or more semiconductors.
- the hardware elements 3910 are not limited by the materials from which they are formed or the processing mechanisms employed therein.
- processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)).
- processor-executable instructions may be electronically-executable instructions.
- the computer-readable media 3906 is illustrated as including memory/storage 3912 .
- the memory/storage 3912 represents memory/storage capacity associated with one or more computer-readable media.
- the memory/storage 3912 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth).
- the memory/storage 3912 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth).
- the computer-readable media 3906 may be configured in a variety of other ways as further described below.
- Input/output interface(s) 3908 are representative of functionality to allow a user to enter commands and information to computing device 3902 , and also allow information to be presented to the user and/or other components or devices using various input/output devices.
- input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice recognition and/or spoken input), a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to detect movement that does not involve touch as gestures), and so forth.
- Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth.
- the computing device 3902 may be configured in a variety of ways as further described below to support user interaction.
- modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types.
- Modules generally represent software, firmware, hardware, or a combination thereof.
- the features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- Computer-readable media may include a variety of media that may be accessed by the computing device 3902 .
- computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”
- Computer-readable storage media may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Computer-readable storage media do not include signals per se.
- Computer-readable storage media include hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer-readable instructions, data structures, program modules, logic elements/circuits, or other data.
- Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- Computer-readable signal media may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 3902 , such as via a network.
- Signal media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism.
- Signal media also include any information delivery media.
- The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
- hardware elements 3910 and computer-readable media 3906 are representative of instructions, modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein.
- Hardware elements may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware devices.
- a hardware element may operate as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element as well as a hardware device utilized to store instructions for execution, e.g., the computer-readable storage media described previously.
- Software, hardware, or program modules, and other program modules, may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 3910.
- the computing device 3902 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of modules that are executable by the computing device 3902 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 3910 of the processing system.
- the instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 3902 and/or processing systems 3904 ) to implement techniques, modules, and examples described herein.
- the example system 3900 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similar in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on.
- multiple devices are interconnected through a central computing device.
- the central computing device may be local to the multiple devices or may be located remotely from the multiple devices.
- the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link.
- this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices.
- Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices.
- a class of target devices is created and experiences are tailored to the generic class of devices.
- a class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.
- The computing device 3902 may assume a variety of different configurations, such as for computer 3914, mobile 3916, and television 3918 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 3902 may be configured according to one or more of the different device classes. For instance, the computing device 3902 may be implemented as the computer 3914 class of device that includes a personal computer, desktop computer, a multi-screen computer, laptop computer, netbook, and so on.
- the computing device 3902 may also be implemented as the mobile 3916 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a wearable device, a multi-screen computer, and so on.
- the computing device 3902 may also be implemented as the television 3918 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on.
- the techniques described herein may be supported by these various configurations of the computing device 3902 and are not limited to the specific examples of the techniques described herein.
- functionalities discussed with reference to the client device 102 and/or ink module 118 may be implemented all or in part through use of a distributed system, such as over a “cloud” 3920 via a platform 3922 as described below.
- the cloud 3920 includes and/or is representative of a platform 3922 for resources 3924 .
- the platform 3922 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 3920 .
- the resources 3924 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 3902 .
- Resources 3924 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.
- the platform 3922 may abstract resources and functions to connect the computing device 3902 with other computing devices.
- the platform 3922 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 3924 that are implemented via the platform 3922 .
- implementation of functionality described herein may be distributed throughout the system 3900 .
- the functionality may be implemented in part on the computing device 3902 as well as via the platform 3922 that abstracts the functionality of the cloud 3920 .
- aspects of the methods may be implemented in hardware, firmware, or software, or a combination thereof.
- the methods are shown as a set of steps that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. Further, an operation shown with respect to a particular method may be combined and/or interchanged with an operation of a different method in accordance with one or more implementations.
- aspects of the methods can be implemented via interaction between various entities discussed above with reference to the environment 100 .
- a system for selecting content based on pen input and causing an action to be performed on the selected content including: an input surface; one or more processors; and one or more computer-readable storage media storing computer-executable instructions that, responsive to execution by the one or more processors, cause the system to perform operations including: detecting pen input from a pen to the input surface drawing an open selection gesture; generating a closed selection shape based on a particular direction of the selection gesture relative to the input surface; ascertaining a release event for the selection gesture; causing an automatic selection of content within the selection shape responsive to the release event; and causing an action to be performed utilizing the selected content.
- release event includes one or more of the pen being removed from proximity to the input surface, or a selection of a pen button on the pen.
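The selection operations recited above — closing an open gesture into a selection shape and selecting content within it on release — can be sketched with a standard ray-casting point-in-polygon test. The function names, the anchor-point representation of objects, and the choice of ray casting are assumptions for illustration; the claims do not specify a containment algorithm.

```python
# Hypothetical sketch of the claimed selection flow: close the open
# stroke into a polygon, then on the release event select objects whose
# anchor points fall inside it (ray-casting containment test).

def close_gesture(points):
    """Close an open stroke by joining its last point back to its first."""
    return points if points[0] == points[-1] else points + [points[0]]

def point_in_polygon(pt, polygon):
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this polygon edge crosses the scan ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_content(gesture, objects):
    """Run on the release event (pen lifted or pen button pressed)."""
    shape = close_gesture(gesture)
    return [o for o in objects if point_in_polygon(o["anchor"], shape)]

gesture = [(0, 0), (10, 0), (10, 10), (0, 10)]   # open square stroke
objs = [{"id": 1, "anchor": (5, 5)}, {"id": 2, "anchor": (20, 20)}]
print([o["id"] for o in select_content(gesture, objs)])  # -> [1]
```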
- A computer-implemented method for causing a command to be performed based on freehand input of the command, including: detecting a selection of content; ascertaining, by logic executed via a computing system, input of a command via freehand input of characters from a pen to an input surface of the computing system; and causing the command to be executed by the computing system using the content.
- A computer-implemented method for processing ink input in an ink recognition mode, including: detecting a pen in proximity to an input surface of a computing system while the computing system is in an ink recognition mode; causing, by the computing system, a visual affordance identifying the ink recognition mode to be displayed responsive to said detecting; and processing, by the computing system, ink content applied to the input surface according to the ink recognition mode.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Techniques for ink modes are described. According to various embodiments, different ink modes are supported. For instance, implementations support ink for selection, ink for commanding, ink for recognition, and so forth. According to various embodiments, a visual affordance of a particular active ink mode is presented on a document with which a user is interacting. For instance, the visual affordance is presented in response to detecting a proximity of a pen to an input surface such as a touch display. Further, different ink modes each are associated with different respective visual affordances.
Description
- This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application No. 62/002,648, Attorney Docket Number 355121.01, filed May 23, 2014 and titled “Ink,” the entire disclosure of which is incorporated herein by reference.
- Devices today (e.g., computing devices) typically support a variety of different input techniques. For instance, a particular device may receive input from a user via a keyboard, a mouse, voice input, touch input (e.g., to a touchscreen), and so forth. One particularly intuitive input technique enables a user to utilize a touch instrument (e.g., a pen, a stylus, a finger, and so forth) to provide freehand input to a touch-sensing functionality such as a touchscreen, which is interpreted as digital ink. The freehand input may be converted to a corresponding visual representation on a display, such as for taking notes, for creating and editing an electronic document, and so forth. Many current techniques for digital ink, however, typically provide limited ink functionality.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- Techniques for ink modes are described. According to various embodiments, different ink modes are supported. For instance, implementations support ink for selection, ink for commanding, ink for recognition, and so forth. According to various embodiments, a visual affordance of a particular active ink mode is presented on a document with which a user is interacting. For instance, the visual affordance is presented in response to detecting a proximity of a pen to an input surface such as a touch display. Further, different ink modes each are associated with different respective visual affordances.
- The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ techniques discussed herein in accordance with one or more embodiments.
- FIG. 2 depicts an example implementation scenario for a permanent ink mode in accordance with one or more embodiments.
- FIG. 3 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 4 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 5 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 6 depicts an example implementation scenario for a transient ink mode in accordance with one or more embodiments.
- FIG. 7 depicts an example implementation scenario for multiple transient ink layers in accordance with one or more embodiments.
- FIG. 8 depicts an example implementation scenario for presenting an inking menu in accordance with one or more embodiments.
- FIG. 9 is a flow diagram that describes steps in a method for processing ink according to a current ink mode in accordance with one or more embodiments.
- FIG. 10 is a flow diagram that describes steps in a method for a transient ink timer in accordance with one or more embodiments.
- FIG. 11 is a flow diagram that describes steps in a method for propagating transient ink to different transient ink layers for different users in accordance with one or more embodiments.
- FIG. 12 is a flow diagram that describes steps in a method for presenting an ink menu in accordance with one or more embodiments.
- FIG. 13 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 14 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 15 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 16 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 17 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 18 depicts an example implementation scenario for ink for selection in accordance with one or more embodiments.
- FIG. 19 is a flow diagram that describes steps in a method for ink for selection in accordance with one or more embodiments.
- FIG. 20 depicts an example implementation scenario for ink notes in accordance with one or more embodiments.
- FIG. 21 depicts an example implementation scenario for ink notes in accordance with one or more embodiments.
- FIG. 22 is a flow diagram that describes steps in a method for generating an ink note in accordance with one or more embodiments.
- FIG. 23 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 24 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 25 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 26 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 27 depicts an example implementation scenario for ink for commanding in accordance with one or more embodiments.
- FIG. 28 is a flow diagram that describes steps in a method for ink for commanding in accordance with one or more embodiments.
- FIG. 29 is a flow diagram that describes steps in a method for ink for commanding in accordance with one or more embodiments.
- FIG. 30 depicts an example implementation scenario for ink for shape recognition in accordance with one or more embodiments.
- FIG. 31 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 32 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 33 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 34 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 35 depicts an example implementation scenario for ink for text recognition in accordance with one or more embodiments.
- FIG. 36 is a flow diagram that describes steps in a method for ink for recognition in accordance with one or more embodiments.
- FIG. 37 is a flow diagram that describes steps in a method for ink for text recognition in accordance with one or more embodiments.
- FIG. 38 is a flow diagram that describes steps in a method for ink for character recognition in accordance with one or more embodiments.
- FIG. 39 illustrates an example system and computing device as described with reference to FIG. 1, which are configured to implement embodiments of techniques described herein.
- Overview
- Techniques for ink modes are described. Generally, ink refers to freehand input to a touch-sensing functionality such as a touchscreen, which is interpreted as digital ink, referred to herein as “ink.” Ink may be provided in various ways, such as using a pen (e.g., an active pen, a passive pen, and so forth), a stylus, a finger, and so forth.
- According to various implementations, different ink modes are supported. For instance, implementations support ink for selection, ink for commanding, ink for recognition, and so forth.
- Ink for selection provides different ways for utilizing ink to select objects, such as visual objects on a display. For instance, different ink gestures applied via freehand input using a pen are converted into different selection shapes for selecting objects. According to various implementations, ink for selection reduces an amount of time and a number of user interactions required to select an object.
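The ink-for-selection behavior described above can be sketched in code. The following is an illustrative Python sketch, not the patented implementation: a freehand lasso stroke is treated as a closed polygon, and displayed objects whose centers fall inside it are selected. The names (`point_in_polygon`, `select_with_lasso`, the object dictionaries) are assumptions for illustration.

```python
# Illustrative sketch: convert a freehand lasso stroke into a selection
# region and hit-test on-screen objects against it.

def point_in_polygon(pt, polygon):
    """Ray-casting point-in-polygon test."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge straddles the horizontal ray through pt?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_with_lasso(stroke_points, objects):
    """Treat the stroke as a closed lasso; return the objects inside it."""
    return [obj for obj in objects
            if point_in_polygon(obj["center"], stroke_points)]
```

In such a sketch, a single lasso gesture replaces the multiple taps or rubber-band drags otherwise needed to select scattered objects.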
- Ink for commanding provides different ways for utilizing ink to specify various commands to be performed. For instance, different ink commands applied via freehand input using a pen are recognized and automatically executed. According to various implementations, ink for commanding reduces an amount of time and a number of user interactions required to enter and execute commands. For instance, a user may simply write a command using ink, and the command is automatically recognized and executed without requiring the user to locate and select a visual control or menu item for the command.
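The ink-for-commanding flow above (write a command, have it recognized and executed) can be sketched as a lookup from recognized handwriting text to an action. This is a hedged illustration only; the command names and the `doc` dictionary are assumptions, not elements of the described embodiments.

```python
# Illustrative sketch: recognized ink text is matched against a command
# table and executed directly, without locating a menu item.

COMMANDS = {
    "bold":   lambda doc: doc.update(weight="bold"),
    "delete": lambda doc: doc.update(deleted=True),
}

def execute_ink_command(recognized_text, doc):
    """Execute a recognized ink command; return True if it was handled."""
    action = COMMANDS.get(recognized_text.strip().lower())
    if action is None:
        return False  # not a known command; could fall back to plain ink
    action(doc)
    return True
```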
- Ink for recognition provides different ways for recognizing and converting characters provided via freehand ink. For instance, different ink characters applied via freehand input using a pen are recognized and converted into encoded versions of different shapes and text characters. The shapes and text characters, for instance, are added to a primary content layer of a document. According to various implementations, ink for recognition reduces an amount of time and a number of user interactions required to generate encoded characters, such as shapes, text, and so forth.
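One way to picture ink-for-shape-recognition is a heuristic classifier that replaces a closed freehand stroke with an encoded shape. The sketch below compares the stroke's length to the perimeters of ideal shapes fitted to its bounding box; this particular heuristic is an illustrative assumption, not the recognizer the application describes.

```python
import math

# Illustrative sketch: classify a closed freehand stroke as an encoded
# rectangle or ellipse by comparing stroke length to ideal perimeters.

def recognize_shape(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    # Total stroke length, closing the loop back to the first point.
    length = sum(math.dist(points[i], points[(i + 1) % len(points)])
                 for i in range(len(points)))
    rect_perimeter = 2 * (w + h)
    ellipse_perimeter = math.pi * (w + h) / 2  # rough approximation
    # Pick whichever ideal perimeter the stroke length is closer to.
    if abs(length - rect_perimeter) < abs(length - ellipse_perimeter):
        return "rectangle"
    return "ellipse"
```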
- According to various implementations, a visual affordance of a particular active ink mode is presented on a document with which a user is interacting. For instance, the visual affordance is presented in response to detecting a proximity of a pen to an input surface such as a touch display. Further, different ink modes each are associated with different respective visual affordances. Thus, a user is informed of which ink mode is currently active without having to access a settings menu or other interface separate from a document with which the user is interacting, thus reducing user interactions required to ascertain an active ink mode.
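The per-mode visual affordance described above amounts to a mapping from the active ink mode to a distinct hover-target glyph. A minimal sketch follows; the glyph names for the selection and command modes are hypothetical (the solid and hollow circles for permanent and transient ink are taken from the scenarios below).

```python
# Illustrative sketch: each ink mode maps to a distinct hover-target
# glyph drawn under the pen tip when the pen is detected in proximity.

AFFORDANCES = {
    "permanent": "solid_circle",
    "transient": "hollow_circle",
    "selection": "dashed_circle",  # hypothetical glyph
    "command":   "chevron",        # hypothetical glyph
}

def hover_affordance(active_mode):
    """Return the glyph to display for the currently active ink mode."""
    return AFFORDANCES.get(active_mode, "solid_circle")
```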
- In the following discussion, an example environment is first described that is operable to employ techniques described herein. Next, a section entitled “Example Implementation Scenarios and Procedures” describes some example implementation scenarios and methods for ink modes in accordance with one or more embodiments. Finally, a section entitled “Example System and Device” describes an example system and device that are operable to employ techniques discussed herein in accordance with one or more embodiments.
- Example Environment
-
FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques for ink modes discussed herein. Environment 100 includes a client device 102 which can be embodied as any suitable device such as, by way of example and not limitation, a smartphone, a tablet computer, a portable computer (e.g., a laptop), a desktop computer, a wearable device, and so forth. In at least some implementations, the client device 102 represents a smart appliance, such as an Internet of Things ("IoT") device. Thus, the client device 102 may range from a system with significant processing power to a lightweight device with minimal processing power. One of a variety of different examples of a client device 102 is shown and described below in FIG. 39. - The
client device 102 includes a variety of different functionalities that enable various activities and tasks to be performed. For instance, the client device 102 includes an operating system 104, applications 106, and a communication module 108. Generally, the operating system 104 is representative of functionality for abstracting various system components of the client device 102, such as hardware, kernel-level modules and services, and so forth. The operating system 104, for instance, can abstract various components (e.g., hardware, software, and firmware) of the client device 102 to the applications 106 to enable interaction between the components and the applications 106. - The
applications 106 represent functionalities for performing different tasks via the client device 102. Examples of the applications 106 include a word processing application, a spreadsheet application, a web browser, a gaming application, and so forth. The applications 106 may be installed locally on the client device 102 to be executed via a local runtime environment, and/or may represent portals to remote functionality, such as cloud-based services, web apps, and so forth. Thus, the applications 106 may take a variety of forms, such as locally-executed code, portals to remotely hosted services, and so forth. - The
communication module 108 is representative of functionality for enabling the client device 102 to communicate over wired and/or wireless connections. For instance, the communication module 108 represents hardware and logic for communication via a variety of different wired and/or wireless technologies and protocols. - The
client device 102 further includes a display device 110, input mechanisms 112 including a digitizer 114 and touch input devices 116, and an ink module 118. The display device 110 generally represents functionality for visual output for the client device 102. Additionally, the display device 110 represents functionality for receiving various types of input, such as touch input, pen input, and so forth. The input mechanisms 112 generally represent different functionalities for receiving input to the computing device 102. Examples of the input mechanisms 112 include gesture-sensitive sensors and devices (e.g., such as touch-based sensors and movement-tracking sensors (e.g., camera-based)), a mouse, a keyboard, a stylus, a touch pad, accelerometers, a microphone with accompanying voice recognition software, and so forth. The input mechanisms 112 may be separate from or integral with the display device 110; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors. The digitizer 114 represents functionality for converting various types of input to the display device 110 and the touch input devices 116 into digital data that can be used by the computing device 102 in various ways, such as for generating digital ink. - According to various implementations, the
ink module 118 represents functionality for performing various aspects of techniques for ink modes discussed herein. Various functionalities of the ink module 118 are discussed below. The ink module 118 includes a transient layer application programming interface (API) 120 and a permanent layer API 122. The transient layer API 120 represents functionality for enabling interaction with a transient ink layer, and the permanent layer API 122 represents functionality for enabling ink interaction with a permanent object (e.g., document) layer. In at least some implementations, the transient layer API 120 and the permanent layer API 122 may be utilized (e.g., by the applications 106) to access transient ink functionality and permanent ink functionality, respectively. - The
environment 100 further includes a pen 124, which is representative of an input device for providing input to the display device 110. Generally, the pen 124 is in a form factor of a traditional pen but includes functionality for interacting with the display device 110 and other functionality of the client device 102. In at least some implementations, the pen 124 is an active pen that includes electronic components for interacting with the client device 102. The pen 124, for instance, includes a battery that can provide power to internal components of the pen 124. - Alternatively or additionally, the
pen 124 may include a magnet or other functionality that supports hover detection over the display device 110. This is not intended to be limiting, however, and in at least some implementations the pen 124 may be passive, e.g., a stylus without internal electronics. Generally, the pen 124 is representative of an input device that can provide input that can be differentiated from other types of input by the client device 102. For instance, the digitizer 114 is configured to differentiate between input provided via the pen 124, and input provided by a different input mechanism such as a user's finger, a stylus, and so forth. - The
pen 124 includes a pen mode button 126, which represents a selectable control (e.g., a switch) for switching the pen 124 between different pen input modes. Generally, different pen input modes enable input from the pen 124 to be utilized and/or interpreted by the ink module 118 in different ways. Examples of different pen input modes are detailed below. - Having described an example environment in which the techniques described herein may operate, consider now a discussion of an example implementation scenario in accordance with one or more embodiments.
- Transient Ink and Permanent Ink
- According to various implementations, ink can be applied in different ink modes including a transient ink mode and a permanent ink mode. Generally, transient ink refers to ink that is temporary and that can be used for various purposes, such as invoking particular actions, annotating a document, and so forth. For instance, in transient implementations, ink can be used for annotation layers for electronic documents, temporary visual emphasis, text recognition, invoking various commands and functionalities, and so forth.
- Permanent ink generally refers to implementations where ink becomes a part of the underlying object, such as for creating a document, writing on a document (e.g., for annotation and/or editing), applying ink to graphics, and so forth. Permanent ink, for example, can be considered as a graphics object, such as for note taking, for creating visual content, and so forth.
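The transient/permanent distinction above can be sketched as routing a stroke to one of two layers: permanent ink is committed to the document's primary content layer, while transient ink goes to a separate layer that can be hidden or discarded. The dictionary-based layer structure below is an assumption for illustration, not the described implementation.

```python
# Illustrative sketch: apply an ink stroke according to the active mode.

def apply_ink(document, stroke, mode):
    if mode == "permanent":
        # Permanent ink becomes part of the underlying object.
        document["primary_content"].append(stroke)
    elif mode == "transient":
        # Transient ink is kept apart from primary content.
        document["transient_layer"].append(stroke)
    else:
        raise ValueError(f"unknown ink mode: {mode}")
    return document
```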
- In at least some implementations, a pen (e.g., the pen 124) applies ink whenever the pen is in contact with an input surface, such as the
display device 110 and/or other input surface. Further, a pen can apply ink across many different applications, platforms, and services. In one or more implementations, an application and/or service can specify how ink is used in relation to an underlying object, such as a word processing document, a spreadsheet, and so forth. For instance, in some scenarios ink is applied as transient ink, and in other scenarios ink is applied as permanent ink. Examples of different implementations and attributes of transient ink and permanent ink are detailed below. - Example Implementation Scenarios and Procedures
- This section describes some example implementation scenarios and example procedures for ink modes in accordance with one or more implementations. The implementation scenarios and procedures may be implemented in the
environment 100 described above, the system 3900 of FIG. 39, and/or any other suitable environment. The implementation scenarios and procedures, for example, describe example operations of the client device 102 and the ink module 118. While the implementation scenarios and procedures are discussed with reference to a particular application, it is to be appreciated that techniques for ink modes discussed herein are applicable across a variety of different applications, services, and environments. In at least some embodiments, steps described for the various procedures are implemented automatically and independent of user interaction. -
FIG. 2 depicts an example implementation scenario 200 for a permanent ink mode in accordance with one or more implementations. The upper portion of the scenario 200 includes a graphical user interface (GUI) 202 displayed on the display 110. Generally, the GUI 202 represents a GUI for a particular functionality, such as an instance of the applications 106. Also depicted is a user holding the pen 124. Displayed within the GUI 202 is a document 204, e.g., an electronic document generated via one of the applications 106. - Proceeding to the lower portion of the
scenario 200, the user brings the pen 124 in proximity to the surface of the display 110 and within the GUI 202. The pen 124, for instance, is placed within a particular distance of the display 110 (e.g., less than 2 centimeters) but not in contact with the display 110. This behavior is generally referred to herein as "hovering" the pen 124. In response to detecting proximity of the pen 124, a hover target 206 is displayed within the GUI 202 and at a point within the GUI 202 that is directly beneath the tip of the pen 124. Generally, the hover target 206 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to the document 204. - According to various implementations, the visual appearance (e.g., shape, color, shading, and so forth) of the hover
target 206 provides a visual cue indicating a current ink mode that is active. In the scenario 200, the hover target is presented as a solid circle, which indicates that a permanent ink mode is active. For instance, if the user proceeds to put the pen 124 in contact with the display 110 to apply ink to the document 204 in a permanent ink mode, the ink will become part of the document 204, e.g., will be added to a primary content layer of the document 204. Consider, for example, that the text (e.g., primary content) displayed in the document 204 was created via ink input in a permanent ink mode. Thus, ink applied in a permanent ink mode represents a permanent ink layer that is added to a primary content layer of the document 204. - In further response to detecting hovering of the
pen 124, an ink flag 208 is visually presented adjacent to and/or at least partially overlaying a portion of the document 204. Generally, the ink flag 208 represents a visual affordance that indicates that ink functionality is active such that a user may apply ink to the document 204. In at least some implementations, the ink flag 208 may be presented additionally or alternatively to the hover target 206. In this particular example, the ink flag 208 includes a visual cue indicating a current ink mode that is active. In the scenario 200, the ink flag 208 includes a solid circle, which indicates that a permanent ink mode is active. As further detailed below, the ink flag 208 is selectable to cause an ink menu to be displayed that includes various ink-related functionalities, options, and settings that can be applied. -
FIG. 3 depicts an example implementation scenario 300 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 300 includes a graphical user interface (GUI) 302 displayed on the display 110. Generally, the GUI 302 represents a GUI for a particular functionality, such as an instance of the applications 106. Displayed within the GUI 302 is a document 304, e.g., an electronic document generated via one of the applications 106. The document 304 includes primary content 306, which represents content generated as part of a primary content layer for the document 304. For instance, in this particular example the document 304 is a text-based document, and thus the primary content 306 includes text that is populated to the document. Various other types of documents and primary content may be employed, such as for graphics, multimedia, web content, and so forth. - As further illustrated, a user is hovering the
pen 124 within a certain proximity of the surface of the display 110, such as discussed above with reference to the scenario 200. In response, a hover target 308 is displayed within the document 304 and beneath the tip of the pen. In this particular example, the hover target 308 is presented as a hollow circle, thus indicating that a transient ink mode is active. For instance, if the user proceeds to apply ink to the document 304, the ink will behave according to a transient ink mode. Examples of different transient ink behaviors are detailed elsewhere herein. - Further in response to the user hovering the
pen 124 over the display 110, an ink flag 310 is presented. In this particular example, the ink flag 310 includes a hollow circle 312, thus providing a visual cue that a transient ink mode is active. - Proceeding to the lower portion of the
scenario 300, the user removes the pen 124 from proximity to the display 110. In response, the hover target 308 and the ink flag 310 are removed from the display 110. For instance, in at least some implementations, a hover target and/or an ink flag are presented when the pen 124 is detected as being hovered over the display 110, and are removed from the display 110 when the pen 124 is removed such that the pen 124 is no longer detected as being hovered over the display 110. This is not intended to be limiting, however, and in at least some implementations, an ink flag may be persistently displayed to indicate that inking functionality is active and/or available. -
FIG. 4 depicts an example implementation scenario 400 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 400 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110. In at least some implementations, the scenario 400 represents an extension of the scenario 300, above. - In the upper portion of the
scenario 400, a user applies ink content 402 to the document 304 using the pen 124. In this particular scenario, the ink content 402 corresponds to an annotation of the document 304. It is to be appreciated, however, that a variety of different types of transient ink other than annotations may be employed. Notice that as the user is applying the ink content 402, a hover target is not displayed. For instance, in at least some implementations when the pen 124 transitions from a hover position to contact with the display 110, a hover target is removed. Notice also that the ink flag 310 includes a hollow circle 312, indicating that the ink content 402 is applied according to a transient ink mode. - Proceeding to the lower portion of the
scenario 400, the user lifts the pen 124 from the display 110 such that the pen 124 is not detected, e.g., the pen 124 is not in contact with the display 110 and is not in close enough proximity to the display 110 to be detected as hovering. In response to the pen 124 no longer being detected in contact with or in proximity to the display 110, an ink timer 406 begins running. For instance, the ink timer 406 begins counting down from a specific time value, such as 30 seconds, 60 seconds, and so forth. Generally, the ink timer is representative of functionality to implement a countdown function, such as for tracking time between user interactions with the display 110 via the pen 124. The ink timer 406, for example, represents a functionality of the ink module 118. - As a visual cue that the
ink timer 406 is elapsing, the hollow circle 312 begins to unwind, e.g., begins to disappear from the ink flag 310. In at least some implementations, the hollow circle 312 unwinds at a rate that corresponds to the countdown of the ink timer 406. For instance, when the ink timer 406 is elapsed by 50%, then 50% of the hollow circle 312 is removed from the ink flag 310. Thus, unwinding of the hollow circle 312 provides a visual cue that the ink timer 406 is elapsing, and how much of the ink timer has elapsed and/or remains to be elapsed. - In at least some implementations, if the
ink timer 406 is elapsing as in the lower portion of the scenario 400 and the user proceeds to place the pen 124 in proximity to the display 110 (e.g., hovered or in contact with the display 110), the ink timer 406 will reset and will not begin elapsing again until the user removes the pen 124 from the display 110 such that the pen 124 is not detected. In such implementations, the hollow circle 312 will be restored within the ink flag 310 as in the upper portion of the scenario 400. -
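The ink timer behavior in this scenario can be sketched as a small countdown object: the timer starts when the pen leaves the display, the hollow-circle affordance unwinds in proportion to elapsed time, and detecting the pen again resets it. A minimal sketch follows, assuming a 60-second duration (one of the example values above); the class and property names are illustrative.

```python
# Illustrative sketch: countdown timer driving the unwinding hollow
# circle in the ink flag.

class InkTimer:
    def __init__(self, duration_s=60.0):
        self.duration_s = duration_s
        self.elapsed_s = 0.0

    def tick(self, dt_s):
        """Advance the countdown while the pen is away from the display."""
        self.elapsed_s = min(self.duration_s, self.elapsed_s + dt_s)

    def reset(self):
        """Pen detected in proximity again: restore the full circle."""
        self.elapsed_s = 0.0

    @property
    def circle_remaining(self):
        """Fraction of the hollow circle still drawn (1.0 = full)."""
        return 1.0 - self.elapsed_s / self.duration_s

    @property
    def expired(self):
        return self.elapsed_s >= self.duration_s
```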
FIG. 5 depicts an example implementation scenario 500 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 500 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110. In at least some implementations, the scenario 500 represents an extension of the scenario 400, above. - In the upper portion of the
scenario 500, the ink timer 406 has elapsed. For instance, notice that the hollow circle 312 has completely unwound within the ink flag 310, e.g., is visually removed from the ink flag 310. According to various implementations, this provides a visual cue that the ink timer 406 has completely elapsed. - Proceeding to the lower portion of the
scenario 500, and in response to expiry of the ink timer 406, the ink content 402 is removed from the GUI 302 and saved as part of a transient ink layer 504 for the document 304. Further, the ink flag 310 is populated with a user icon 502. The user icon 502, for example, represents a user that is currently logged in to the computing device 102, and/or a user that is interacting with the document 304. Alternatively or additionally, the pen 124 includes user identification data that is detected by the computing device 102 and thus is leveraged to track which user is interacting with the document 304. For example, the pen 124 includes a tagging mechanism (e.g., a radio-frequency identifier (RFID) chip) embedded with a user identity for a particular user. Thus, when the pen 124 is placed in proximity to the display 110, the tagging mechanism is detected by the computing device 102 and utilized to attribute ink input and/or other types of input to a particular user. As used herein, the term "user" may be used to refer to an identity for an individual person, and/or an identity for a discrete group of users that are grouped under a single user identity. - According to various implementations, population of the
user icon 502 to the ink flag 310 represents a visual indication that the transient ink layer 504 exists for the document 304, and that the transient ink layer 504 is associated with (e.g., was generated by) a particular user. Generally, the transient ink layer 504 represents a data layer that is not part of the primary content layer of the document 304, but that is persisted and can be referenced for various purposes. Further attributes of transient ink layers are described elsewhere herein. -
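The timer-expiry behavior in this scenario (ink content moved off-screen into a per-user transient layer, with the ink flag showing that user's icon) can be sketched as follows. The dictionary shapes and field names are assumptions for illustration; the user identity stands in for whatever the pen's tagging mechanism reports.

```python
# Illustrative sketch: on timer expiry, persist transient ink to a
# per-user layer and update the ink flag to show the user's icon.

def commit_transient_ink(document, user_id, ink_content):
    """Persist transient ink to the user's layer and hide it from view."""
    layers = document.setdefault("transient_layers", {})
    layers.setdefault(user_id, []).extend(ink_content)
    document["visible_ink"] = []          # removed from the display
    document["ink_flag_icon"] = user_id   # flag now shows the user icon
    return document
```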
FIG. 6 depicts an example implementation scenario 600 for a transient ink mode in accordance with one or more implementations. The upper portion of the scenario 600 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110. In at least some implementations, the scenario 600 represents an extension of the scenario 500, above. - In the upper portion of the
scenario 600, the ink flag 310 is displayed indicating that a transient ink layer (e.g., the transient ink layer 504) exists for the document 304, and that the transient ink layer is linked to a particular user represented by the user icon 502 in the ink flag 310. - Proceeding to the lower portion of the
scenario 600, a user selects the ink flag 310 with the pen 124, which causes the ink content 402 to be returned to display as part of the document 304. The ink content 402, for example, is bound to the transient ink layer 504, along with other transient ink content generated for the transient ink layer 504. Thus, in at least some implementations, the transient ink layer 504 is accessible by various techniques, such as by selection of the ink flag 310. - Additionally or alternatively to selection of the
ink flag 310, if the user proceeds to apply further ink content to the document 304 while in the transient ink mode, the transient ink layer 504 is retrieved and transient ink content included as part of the transient ink layer 504 is displayed as part of the document 304. In at least some implementations, transient ink content of the transient ink layer 504 is bound (e.g., anchored) to particular portions (e.g., pages, lines, text, and so forth) of the document 304. For instance, the user generated the ink content 402 adjacent to a particular section of text. Thus, when the transient ink layer 504 is recalled as depicted in the scenario 600, the ink content 402 is displayed adjacent to the particular section of text. - According to various implementations, the
transient ink layer 504 is cumulative such that a user may add ink content to and remove ink content from the transient ink layer 504 over a span of time and during multiple different interactivity sessions. Thus, the transient ink layer 504 generally represents a record of multiple user interactions with the document 304, such as for annotations, proofreading, commenting, and so forth. Alternatively or additionally, multiple transient layers may be created for the document 304, such as when significant changes are made to the primary content 306, when other users apply transient ink to the document 304, and so forth. - In at least some implementations, when the user pauses interaction with the
document 304, the ink timer 406 begins elapsing such as discussed above with reference to the scenarios 400, 500. Thus, the scenario 600 may return to the scenario 400. -
FIG. 7 depicts an example implementation scenario 700 for multiple transient ink layers in accordance with one or more implementations. The upper portion of the scenario 700 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110. In at least some implementations, the scenario 700 represents an extension of the scenario 600, above. - Displayed as part of the
GUI 302 is the ink flag 310 with the user icon 502, along with an ink flag 702 with a user icon 704, and an ink flag 706 with a user icon 708. Generally, the ink flags 702, 706 represent other users that have interacted with the document 304, and the user icons 704, 708 represent those respective users. - The
scenario 700 further includes the transient ink layer 504 associated with the ink flag 310, along with a transient ink layer 710 linked to the ink flag 702, and a transient ink layer 712 linked to the ink flag 706. Generally, the transient ink layers 710, 712 represent individual transient ink layers that are bound to individual user identities. Each individual transient ink layer 504, 710, 712 can be separately invoked for display on the document 304. Example ways of invoking a transient ink layer are detailed elsewhere herein. Further, transient ink layer behaviors discussed elsewhere herein are applicable to the scenario 700. -
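Recalling one of several per-user transient ink layers, as in this multi-user scenario, can be sketched as a keyed lookup: tapping a user's ink flag brings that user's transient ink back into view without touching the other layers. The function and field names below are illustrative assumptions.

```python
# Illustrative sketch: display the transient ink stored for one user.

def recall_layer(document, user_id):
    """Return and display the transient ink layer for one user, if any."""
    layer = document.get("transient_layers", {}).get(user_id, [])
    document["visible_ink"] = list(layer)  # other users' layers untouched
    return document["visible_ink"]
```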
FIG. 8 depicts an example implementation scenario 800 for presenting an inking menu in accordance with one or more implementations. The upper portion of the scenario 800 includes the GUI 302 with the document 304 (introduced above) displayed on the display 110. In at least some implementations, the scenario 800 represents an extension of the scenarios discussed above. Further to the scenario 800, the user selects the ink flag 310 while the transient ink layer 504 is active. For instance, the user manipulates the pen 124 to first tap the ink flag 310, which invokes the transient ink layer 504 such that the ink content 402 is retrieved and displayed, and then taps the ink flag 310 a second time within a particular period of time, e.g., 3 seconds. - Proceeding to the lower portion of the
scenario 800 and in response to the second selection of the ink flag 310, the ink flag 310 expands to present an ink menu 802. Generally, the ink menu 802 includes multiple selectable indicia that are selectable to cause different ink-related actions to be performed, such as to apply and/or change various settings, invoke various functionalities, and so forth. To aid in visual understanding, an expanded representation 802 a of the ink menu 802 is depicted. The example visual indicia included in the ink menu 802 are now discussed in turn. -
Play control 804—according to various implementations, when ink content (e.g., transient and/or permanent ink) is applied to a document, application of the ink content is recorded in real-time. For instance, application of ink content is recorded as an animation that shows the ink content being applied to a document as it was initially applied by a user. Accordingly, selection of the play control 804 causes a playback of ink content as it was originally applied. Further details concerning ink playback are presented below. -
Transient Ink Control 806—selection of this control causes a transition from a different ink mode (e.g., a permanent ink mode) to a transient ink mode. -
Permanent Ink Control 808—selection of this control causes a transition from a different ink mode (e.g., a transient ink mode) to a permanent ink mode. -
Text Recognition Control 810—selection of this control causes a transition to a text recognition mode. For instance, in a text recognition mode, characters applied using ink are converted into machine-encoded text. -
Shape Recognition Control 812—selection of this control causes a transition to a shape recognition mode. For instance, in a shape recognition mode, shapes applied using ink are converted into machine-encoded shapes, such as quadrilaterals, triangles, circles, and so forth. -
Selection Mode Control 814—selection of this control causes a transition to a selection mode. Generally, in a selection mode, input from a pen is interpreted as a selection action, such as to select text and/or other objects displayed in a document. - Erase
Mode Control 816—selection of this control causes a transition to an erase mode. Generally, in an erase mode, input from a pen is interpreted as an erase action, such as to erase ink, text, and/or other objects displayed in a document. -
Command Control 818—selection of this control causes a transition to a command mode. For instance, in a command mode, input from a pen is interpreted as a command to perform a particular action and/or task. -
Color Control 820—selection of this control enables a user to change an ink color that is applied to a document. For example, selection of this control causes a color menu to be presented that includes multiple different selectable colors. Selection of a color from the color menu specifies the color for ink content that is applied to a document. -
Ink Note Control 822—this control is selectable to invoke ink note functionality, such as to enable ink content to be propagated to a note. Ink note functionality is described in more detail below. -
Emphasis Control 824—selection of this control causes a transition from a different ink mode (e.g., a permanent or transient ink mode) to an emphasis ink mode. Generally, in an emphasis ink mode, ink is temporary and fades and disappears after a period of time. Emphasis ink, for example, is not saved as part of primary content or a transient ink layer, but is used for temporary purposes, such as for visually identifying content, emphasizing content, and so forth. -
Pin Control 826—this control is selectable to pin a transient ink layer to a document, and to unpin the transient ink layer from the document. For instance, selecting the pin control 826 causes transient ink of a transient ink layer to be persistently displayed as part of a document. With reference to the scenario 500, for example, selection of the pin control 826 prevents the ink timer 406 from being initiated when a user removes the pen 124 from proximity to the display 110. - The
pin control 826 is also selectable to unpin a transient ink layer from a document. For instance, with reference to the scenario 500, selection of the pin control 826 unpins a transient ink layer such that the ink timer 406 begins to elapse when a user removes the pen 124 from proximity to the display 110. In at least some implementations, when the user selects the ink flag 310 to cause the ink menu 802 to be presented, the pin control 826 is presented within the ink menu 802 in the same region of the display in which the user icon 502 is displayed in the ink flag 310. Thus, a user can double-tap on the same spot over the ink flag 310 to cause the ink menu to be presented, and to pin and unpin transient ink from the document 304. - In at least some implementations, the visuals presented for the individual controls represent hover targets that are displayed when the respective modes are active. The example implementation scenarios above depict examples of hover targets for transient and permanent ink modes, and similar scenarios apply for other ink modes and the visuals displayed for their respective controls in the
ink menu 802. - According to one or more implementations, providing input outside of the
ink menu 802 causes theink menu 802 to collapse. For instance, if the user taps thepen 124 in theGUI 302 outside of theink menu 802, theink menu 802 collapses such that theink flag 310 is again displayed. -
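Emphasis ink, as described for the emphasis control above, fades and disappears after a period of time rather than being saved. A minimal sketch of one way such a fade could be computed; the timings and function name here are illustrative assumptions, not taken from the disclosure:

```python
def emphasis_opacity(age_s, fade_start_s=2.0, fade_duration_s=1.0):
    """Opacity of an emphasis-ink stroke as it ages: fully visible until
    fade_start_s, then fading linearly to zero over fade_duration_s.
    The timing constants are assumed defaults for illustration."""
    if age_s <= fade_start_s:
        return 1.0
    t = (age_s - fade_start_s) / fade_duration_s
    return max(0.0, 1.0 - t)
```

Once the opacity reaches zero, the stroke would simply be removed from display, consistent with emphasis ink not being saved to any layer.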
FIG. 9 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for processing ink according to a current ink mode in accordance with one or more embodiments. - Step 900 detects a pen in proximity to an input surface. The
touch input device 116, for instance, detects that thepen 124 is hovered and/or in contact with thetouch input device 116. As referenced above, a hover operation can be associated with a particular threshold proximity to an input surface such that hovering thepen 124 at or within the threshold proximity to the input surface is interpreted as a hover operation, but placing thepen 124 farther than the threshold proximity from the input surface is not interpreted as a hover operation. - Step 902 ascertains a current ink mode. The
ink module 118, for example, ascertains an ink mode that is currently active on thecomputing device 102. Examples of different ink modes are detailed elsewhere herein, and include a permanent ink mode, a transient ink mode, a text recognition mode, a shape recognition mode, a selection mode, an erase mode, a command mode, and so forth. - In at least some implementations, a current ink mode may be automatically selected by the
ink module 118, such as based on an application and/or document context that is currently in focus. For instance, anapplication 106 may specify a default ink mode that is to be active for the application. Further, some applications may specify ink mode permissions that indicate allowed and disallowed ink modes. Aparticular application 106, for example, may specify that a permanent ink mode is not allowed for documents presented by the application, such as to protect documents from being edited. - Alternatively or additionally, a current ink mode is user-selectable, such as in response to user input selecting an ink mode from the
ink menu 802. For instance, a user may cause a switch from a default ink mode for an application to a different ink mode. - Step 904 causes a visual affordance identifying the current ink mode to be displayed. Examples of such an affordance include a hover target, a visual included as part of an ink flag and/or ink-related menu, and so forth. Examples of different visual affordances are detailed throughout this description and the accompanying drawings.
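The mode-resolution behavior of step 902 (an application-specified default, application-specified permissions, and an optional user override from the ink menu) could be sketched roughly as follows; the function and mode names are illustrative assumptions:

```python
def resolve_ink_mode(app_default, allowed_modes, user_choice=None):
    """Resolve the current ink mode: a user selection wins if the
    application permits it; otherwise the application default applies."""
    if user_choice is not None and user_choice in allowed_modes:
        return user_choice
    return app_default
```

For example, an application that disallows a permanent ink mode to protect its documents would simply omit "permanent" from its allowed modes, so a user request for that mode falls back to the default.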
- Step 906 processes ink content applied to the input surface according to the current ink mode. The ink content, for instance, is processed as permanent ink, transient ink, and so forth. For example, if a permanent ink mode is active, the ink content is saved as permanent ink, such as part of a primary content layer of a document. If the transient ink mode is active, the ink content is propagated to a transient ink layer of a document. Examples of different mode-specific ink behaviors and actions are detailed elsewhere herein.
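Step 906's mode-specific processing amounts to a dispatch on the current ink mode. A hedged sketch, using an assumed document structure with separate primary and transient layers:

```python
def process_ink(ink_content, mode, document):
    """Route ink content according to the current ink mode (step 906).
    The document dict with per-layer lists is an illustrative assumption."""
    if mode == "permanent":
        document["primary_layer"].append(ink_content)    # saved as primary content
    elif mode == "transient":
        document["transient_layer"].append(ink_content)  # propagated to transient layer
    elif mode == "emphasis":
        pass  # emphasis ink fades and is not saved to any layer
    else:
        raise ValueError("unknown ink mode: " + mode)
```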
-
FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for a transient ink timer in accordance with one or more implementations. In at least some implementations, the method represents an extension of the method described above with reference toFIG. 9 . -
Step 1000 receives ink content applied to a document via input from a pen to an input surface while in a transient ink mode. Theink module 118, for example, processes ink content received from thepen 124 to thedisplay 110 as transient ink. -
Step 1002 detects that the pen is removed from proximity to the input surface. For instance, thetouch input device 116 detects that thepen 124 is not in contact with and is not hovering over a surface of thetouch input device 116, e.g., thedisplay 110. -
Step 1004 initiates a timer. The timer, for example, is initiated in response to detecting that the pen is removed from proximity to the input surface. In at least some implementations, a visual representation of the timer is presented. For instance, the visual representation provides a visual cue that the timer is elapsing, and indicates a relative amount (e.g., percentage) of the timer that has elapsed. The visual representation, for example, is animated to visually convey that the timer is elapsing. One example of a visual representation of a timer is discussed above with reference toFIGS. 4 and 5 . -
Step 1006 ascertains whether the pen is detected at the input surface before the timer expires. For instance, the ink module 118 ascertains whether the pen 124 is detected in contact with and/or hovering over the touch input device 116 prior to expiry of the timer. If the pen is detected at the input surface prior to expiry of the timer (“Yes”), step 1008 resets the timer and the process returns to step 1000. - If the pen is not detected at the input surface prior to expiry of the timer (“No”),
step 1010 removes the ink content from the document and propagates the ink content to a transient layer for the document. For instance, in response to expiry of the timer, the transient ink content is removed from display and propagated to a transient data layer for the document that is separate from a primary content layer of the document. In at least some implementations, a new transient ink layer is created for the document, and the transient ink content is propagated to the new transient ink layer. Alternatively or additionally, the transient ink content is propagated to an existing transient ink layer. For example, the transient ink layer may represent an accumulation of transient ink provided by a user over multiple different interactions with the document and over a period of time. - As discussed above, the transient ink layer may be associated with a particular user, e.g., a user that applies the transient ink content to the document. Thus, the transient ink is linked to the particular user and may subsequently be accessed by the user.
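The FIG. 10 flow (steps 1000-1010) behaves like a small state machine: removing the pen starts a timer, returning the pen resets it, and expiry moves the visible transient ink into a transient layer. A simplified sketch under assumed names and an assumed timeout value:

```python
class TransientInkSession:
    """Sketch of the transient ink timer flow. The timeout value and
    class/attribute names are illustrative assumptions."""

    def __init__(self, timeout_s=3.0):
        self.timeout_s = timeout_s   # assumed fade delay
        self.deadline = None
        self.visible_ink = []
        self.transient_layer = []

    def apply_ink(self, stroke):     # step 1000: receive transient ink
        self.visible_ink.append(stroke)

    def pen_removed(self, now):      # steps 1002-1004: initiate the timer
        self.deadline = now + self.timeout_s

    def pen_detected(self):          # step 1008: pen returned, reset the timer
        self.deadline = None

    def tick(self, now):
        # Step 1010: on expiry, remove the ink from display and
        # propagate it to the transient layer.
        if self.deadline is not None and now >= self.deadline:
            self.transient_layer.extend(self.visible_ink)
            self.visible_ink.clear()
            self.deadline = None
```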
-
FIG. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for propagating transient ink to different transient ink layers for different users in accordance with one or more implementations. In at least some implementations, the method represents an extension of the methods described above with reference toFIGS. 9 and 10 . -
Step 1100 receives transient ink content to a document from multiple different users. The transient ink content, for instance, is received during different interactivity sessions with the document that are individually associated with a different user. The document, for example, is shared among different users, such as part of a group collaboration on the document. -
Step 1102 propagates transient ink content from each user to a different respective transient ink layer for the document. A different transient ink layer, for example, is generated for each user, and transient ink content applied by each user is propagated to a respective transient ink layer for each user. -
Step 1104 causes visual affordances of the different transient ink layers to be displayed. Each transient ink layer, for example, is represented by a visual affordance that visually identifies the transient ink layer and a user linked to the transient ink layer. Examples of such affordances are discussed above with reference to ink flags. -
Step 1106 enables each transient ink layer to be individually accessible. Theink module 118, for example, enables each transient ink layer to be accessed (e.g., displayed) separately from the other transient ink layers. In at least some implementations, a transient ink layer is accessible by selecting a visual affordance that represents the transient ink layer. Further, multiple transient ink layers may be accessed concurrently, such as by selecting visual affordances that identify the transient ink layers. - While implementations are discussed herein with reference to display of transient ink layers, it is to be appreciated that a transient ink layer may be accessible in various other ways and separately from a document to which the transient ink layer is bound. For instance, a transient ink layer may be printed, shared (e.g., emailed) separately from a document for which the transient ink layer is created, published to a website, and so forth.
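Step 1102's propagation of each user's transient ink to a respective layer is essentially a grouping operation; a minimal illustrative sketch (the data shapes are assumptions):

```python
def propagate_by_user(strokes):
    """Group transient ink strokes into per-user layers (step 1102).
    strokes: iterable of (user, ink) pairs; returns user -> list of ink."""
    layers = {}
    for user, ink in strokes:
        layers.setdefault(user, []).append(ink)
    return layers
```

Each resulting per-user list then stands in for a transient ink layer that can be displayed, printed, or shared individually, as described above.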
-
FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for presenting an ink menu in accordance with one or more implementations. In at least some implementations, the method represents an extension of the methods described above. -
Step 1200 detects an action to invoke an ink menu. The ink module 118, for example, detects that a user requests an ink menu. For instance, as described in the scenario 800, a user may select a visual control (e.g., the ink flag 310) to request an ink menu. Alternatively or additionally, an ink menu can be automatically invoked in response to various events, such as the pen 124 being detected in proximity to the display surface, an ink-related application and/or service being launched, an application and/or service querying a user to select an ink mode, and so forth. -
Step 1202 causes the ink menu to be presented. Theink module 118, for example, causes theink menu 802 to be displayed. In at least some implementations, a default set of functionalities is associated by theink module 118 with theink menu 802. Accordingly, different applications may modify the default set of functionalities, such as by adding a functionality to the default set of functionalities, removing a functionality from the default set of functionalities, and so forth. Thus, according to one or more implementations, theink menu 802 is populated with a set of functionalities (e.g., selectable controls) based on a customized set of functionalities specified by and/or for an application that is currently in focus. -
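The population of the ink menu from a default set of functionalities, modified by application customizations, might be sketched as follows; the default control set shown is an assumption for illustration:

```python
# Assumed default control set; the actual defaults are implementation-specific.
DEFAULT_MENU = ["permanent", "transient", "selection", "erase", "ink_note"]

def build_ink_menu(add=(), remove=()):
    """Populate the ink menu from the default functionalities, applying an
    application's customizations (additions and removals) per step 1202."""
    menu = [c for c in DEFAULT_MENU if c not in set(remove)]
    menu.extend(c for c in add if c not in menu)
    return menu
```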
Step 1204 receives user input to the ink menu. For example, theink module 118 detects that a user manipulates thepen 124 to select a control displayed as part of theink menu 802. -
Step 1206 performs an action in response to the user input. Theink module 118, for instance, causes an action to be performed based on which control is selected by the user. Examples of such actions include changing an ink mode, initiating ink playback, applying different ink formatting, and so forth. - The next portion of this discussion presents example implementation scenarios and an example procedure for ink for selection in accordance with various implementations. Generally, ink for selection provides ways of selecting and processing content using ink in various ways.
-
FIG. 13 depicts anexample implementation scenario 1300 for ink for selection in accordance with one or more implementations. Thescenario 1300, for example, represents a continuation of the scenarios described above. The upper portion of thescenario 1300 includes aGUI 1302 with adocument 1304 and the ink menu 802 (introduced above) displayed on thedisplay 110. In this particular example, thedocument 1304 represents a web page. It is to be appreciated, however, that techniques discussed herein may utilize a wide variety of other types of documents and content. - In the upper portion of the
scenario 1300, a user manipulates thepen 124 to apply aselection gesture 1306 within thedocument 1304, e.g., to a surface of thedisplay 110 within thedocument 1304. Theselection gesture 1306, for instance, is applied in an ink selection mode. A selection mode may be activated in various ways, such as in response to selection of theselection mode control 814, in response to selection of thepen mode button 126, and so forth. For instance, pressing and/or holding thepen mode button 126 activates a selection mode that causes gestures (e.g., ink gestures) to be interpreted according to the ink selection mode. - In at least some implementations, applying the
selection gesture 1306 causes corresponding ink to be applied along the path of the selection gesture such that a visual indication of theselection gesture 1306 is displayed. Alternatively, no visual indication of theselection gesture 1306 is displayed. - Notice that as the user applies the
selection gesture 1306, aselection shape 1308 is automatically generated within thedocument 1304. Theink module 118, for example, detects that theselection gesture 1306 is applied in the ink selection mode, and in response causes theselection shape 1308 to be automatically generated. - According to various implementations, the shape and size of the
selection shape 1308 is based on attributes of the selection gesture 1306. In this particular example, the selection gesture 1306 is a straight line that is horizontal relative to the visual orientation of the document 1304. Accordingly, the selection shape 1308 is drawn as a square, with the size of the square being based on a length of the selection gesture 1306. For instance, the selection shape 1308 starts at a start point 1310 of the selection gesture 1306 and expands outwardly from the start point 1310 as the selection gesture 1306 increases in length. Thus, in this particular example the start point 1310 represents a center of the selection shape 1308. Further, the length of the selection gesture 1306 represents one-half the length of a side of the selection shape 1308, and this size relationship is maintained as the selection gesture 1306 changes in length. - Proceeding to the lower portion of the
scenario 1300, theselection gesture 1306 increases in length, and thus theselection shape 1308 increases in size. As illustrated, theselection shape 1308 increases in size to encompasscontent 1312 displayed as part of thedocument 1304. In this particular example, thecontent 1312 represents a web page object. Thescenario 1300 then proceeds to ascenario 1400. -
FIG. 14 depicts an example implementation scenario 1400 for ink for selection in accordance with one or more implementations. The scenario 1400, for example, represents a continuation of the scenario 1300 described above. The upper portion of the scenario 1400 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110. - In the upper portion of the
scenario 1400, a selection release event is detected. For instance, the user lifts the pen 124 such that the pen 124 is not detected in proximity to the surface of the display 110. Alternatively or additionally, the user selects or releases the pen mode button 126. Accordingly, and proceeding to the lower portion of the scenario 1400, the selection shape 1308 is converted into a selection 1402 of content within the selection shape 1308. In this particular example, the selection 1402 represents a selection of the content 1312. The ink module 118, for instance, detects the selection release event and in response, automatically converts the selection shape 1308 into the selection 1402. - According to implementations discussed herein, various actions may be performed utilizing the
selection 1402. For instance, thecontent 1312 may be copied, pasted, saved, shared, populated to an ink note (discussed below), and so forth. Examples of different actions that can be performed utilizing selected content are detailed throughout this disclosure. -
FIG. 15 depicts anexample implementation scenario 1500 for ink for selection in accordance with one or more implementations. Thescenario 1500, for example, represents a continuation and/or variation of the scenarios described above. The upper portion of thescenario 1500 includes theGUI 1302 with thedocument 1304 displayed on thedisplay 110. Further included is anink flag 1502 that includes a plus sign (“+”) indicating that an ink selection mode is currently active. The plus sign, for instance, corresponds to a visual presented for theselection mode control 814, discussed above with reference to theink menu 802. Thus, theink flag 1502 provides a visual affordance indicating that the ink selection mode is active. Although not expressly illustrated here, if a user were to hover thepen 124 over the surface of thedisplay 110, the plus sign would be displayed on thedisplay 110 as a hover target beneath the tip of thepen 124. - In the upper portion of the
scenario 1500, a user manipulates thepen 124 to apply aselection gesture 1504 within thedocument 1304. Theselection gesture 1504, for instance, is applied in an ink selection mode. A selection mode may be activated in various ways, examples of which are discussed elsewhere herein. In at least some implementations, applying theselection gesture 1504 causes corresponding ink to be applied along the path of the selection gesture such that a visual indication of theselection gesture 1504 is displayed. Alternatively, no visual indication of theselection gesture 1504 is displayed. - Notice that as the user applies the
selection gesture 1504, aselection shape 1506 is automatically generated within thedocument 1304. Theink module 118, for example, detects that theselection gesture 1504 is applied in the ink selection mode, and in response causes theselection shape 1506 to be automatically generated. - According to various implementations, the shape and size of the
selection shape 1506 is based on attributes of theselection gesture 1504. In this particular example, theselection gesture 1504 is a straight line that is diagonal relative to the visual orientation of thedocument 1304. Accordingly, theselection shape 1506 is drawn as a rectangle, with the size of the rectangle being based on a length of theselection gesture 1504. For instance, theselection shape 1506 starts at astart point 1508 of theselection gesture 1504 and expands outwardly from thestart point 1508 as theselection gesture 1504 increases in length. Thus, in this particular example thestart point 1508 represents a corner of theselection shape 1506. Further, the length of theselection gesture 1504 represents a diagonal of the selection shape 1506 (e.g., a diagonal of a rectangle), and this size relationship is maintained as theselection gesture 1504 changes in length. - Proceeding to the lower portion of the
scenario 1500, theselection gesture 1504 increases in length, and thus theselection shape 1506 increases in size. As illustrated, theselection shape 1506 increases in size to encompass thecontent 1312 andcontent 1510 displayed as part of thedocument 1304. Thescenario 1500 then proceeds to ascenario 1600. -
FIG. 16 depicts anexample implementation scenario 1600 for ink for selection in accordance with one or more implementations. Thescenario 1600, for example, represents a continuation of thescenario 1500 described above. The upper portion of thescenario 1600 includes theGUI 1302 with thedocument 1304 displayed on thedisplay 110. - In the upper portion of the
scenario 1600, a selection release event is detected. For instance, the user lifts the pen 124 such that the pen 124 is not detected in proximity to the surface of the display 110. Alternatively or additionally, the user selects or releases the pen mode button 126. Accordingly, and proceeding to the lower portion of the scenario 1600, the selection shape 1506 is converted into a selection 1602 of content within the selection shape 1506. In this particular example, the selection 1602 represents a selection of the content 1312 and the content 1510. The ink module 118, for instance, detects the selection release event and in response, automatically converts the selection shape 1506 into the selection 1602. - According to implementations discussed herein, various actions may be performed utilizing the
selection 1602. For instance, the content 1312 and the content 1510 may be copied, pasted, saved, shared, populated to an ink note (discussed below), and so forth. - As an example in addition to the scenarios described above, consider a further scenario where a user applies a vertical selection gesture (e.g., relative to the display 110) in the
document 1304 while in an ink selection mode. In response to the vertical selection gesture, a selection shape may be drawn as a circle. For instance, the center of the circle corresponds to a start point of the vertical selection gesture, and the length of the vertical selection gesture corresponds to a radius of the circle. - Thus, the scenarios 1300-1600 illustrate that techniques discussed herein enable selection shapes to be drawn based on attributes of a selection gesture. For instance, selection gestures applied as lines in different directions (e.g., horizontal, diagonal, vertical, and so forth) cause different respective types of selection shapes to be drawn. The examples discussed herein are provided for purpose of illustration only, and it is to be appreciated that implementations discussed herein cover a wide variety of other selection shapes and relationships between selection gestures and selection shapes.
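The direction-dependent mapping from an open selection gesture to a closed selection shape, as illustrated in the scenarios 1300-1600, could be sketched as below. The angle bands used to classify a stroke as horizontal, vertical, or diagonal are illustrative assumptions; the size relationships (square side equal to twice the gesture length, gesture as rectangle diagonal, gesture as circle radius) follow the scenarios above:

```python
import math

def selection_shape(start, end):
    """Map a straight selection gesture to a closed selection shape:
    roughly horizontal -> square centered on the start point,
    roughly vertical -> circle with the gesture as its radius,
    otherwise -> rectangle with the gesture as its diagonal.
    The 20-degree classification bands are assumed for illustration."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    angle = abs(math.degrees(math.atan2(dy, dx))) % 180
    if angle <= 20 or angle >= 160:           # roughly horizontal
        return ("square", start, 2 * length)  # side = 2 x gesture length
    if 70 <= angle <= 110:                    # roughly vertical
        return ("circle", start, length)      # radius = gesture length
    return ("rect", start, end)               # gesture is the rectangle diagonal
```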
-
FIG. 17 depicts an example implementation scenario 1700 for ink for selection in accordance with one or more implementations. The scenario 1700, for example, represents a continuation and/or variation of the scenarios described above. The upper portion of the scenario 1700 includes the GUI 1302 with the document 1304 displayed on the display 110. Further included is the ink flag 1502 indicating that an ink selection mode is currently active. - In the upper portion of the
scenario 1700, a user manipulates the pen 124 to apply a selection gesture 1702 within the document 1304. The selection gesture 1702, for instance, is applied in an ink selection mode. A selection mode may be activated in various ways, examples of which are discussed elsewhere herein. Notice that in this particular example, the selection gesture 1702 is non-linear, i.e., it is not a straight line. According to various implementations, a non-linear selection gesture is associated with different selection behaviors than a linear selection gesture, such as those described above. - In at least some implementations, applying the
selection gesture 1702 causes corresponding ink to be applied along the path of the selection gesture such that a visual indication of theselection gesture 1702 is displayed. Alternatively, no visual indication of theselection gesture 1702 is displayed. - Proceeding to the lower portion of the
scenario 1700, a selection release event is detected. For instance, the user lifts thepen 124 such that thepen 124 is not detected in proximity to the surface of thedisplay 110. Alternatively or additionally, the user selects or releases thepen mode button 126. Accordingly, in response to the selection release event, theink module 118 causes an auto-complete line 1704 to be drawn between astart point 1706 and anend point 1708 of theselection gesture 1702. The auto-complete line 1704 closes theselection gesture 1702 to generate a closed shape around thecontent 1312. - While the
selection gesture 1702 and the auto-complete line 1704 are depicted as being displayed on thedisplay 110, it is to be appreciated that in at least some implementations, theselection gesture 1702 and the auto-complete line 1704 are not displayed but represent depictions of underlying selection data tracked by theink module 118. Thescenario 1700 then proceeds to ascenario 1800. -
FIG. 18 depicts anexample implementation scenario 1800 for ink for selection in accordance with one or more implementations. Thescenario 1800, for example, represents a continuation of thescenario 1700 described above. The upper portion of thescenario 1800 includes theGUI 1302 with thedocument 1304 displayed on thedisplay 110. Further illustrated is theclosed selection gesture 1702 including the auto-complete line 1704 around thecontent 1312, such as depicted in the lower portion of thescenario 1700. - Proceeding to the lower portion of the
scenario 1800, the closed selection gesture 1702 is converted into a selection 1802 of the content 1312. According to various implementations, the ink module 118 detects the release of the selection gesture 1702 and in response, automatically generates the auto-complete line 1704 and converts the closed selection gesture 1702 into the selection 1802 independent of user input. - According to implementations discussed herein, various actions may be performed utilizing the
selection 1802. For instance, the content 1312 may be copied, pasted, saved, shared, populated to an ink note (discussed below), and so forth. Examples of different actions that can be performed utilizing selected content are detailed throughout this disclosure. - Thus, the example scenarios presented above illustrate different selection gestures representing open gestures, such as a line, an open curve, and so forth. Further, the open gestures are automatically converted to corresponding closed selection shapes, such as rectangles, circles, closed curves, and so forth.
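The auto-completion of an open, non-linear gesture and the subsequent test for content falling inside the resulting closed shape might be sketched as follows; the ray-casting containment test is an assumed implementation detail, not specified by the disclosure:

```python
def close_gesture(points):
    """Close an open selection gesture by appending an auto-complete
    segment from its end point back to its start point."""
    if len(points) >= 2 and points[0] != points[-1]:
        return points + [points[0]]
    return points

def point_in_polygon(pt, polygon):
    """Ray-casting test: does pt fall inside the closed polygon?
    Used here to decide whether content lies within the closed gesture."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside
```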
-
FIG. 19 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for ink for selection in accordance with one or more implementations. In at least some implementations, the method represents an extension of the methods described above. -
Step 1900 detects pen input drawing a selection gesture. Theink module 118, for example, detects that a user applies a selection gesture to an input surface while in an ink selection mode. In at least some implementations, a visual representation of the selection gesture is presented using ink. Alternatively, no visual representation of the selection gesture is presented. Generally, the selection gesture corresponds to an open gesture, such as a straight line, an open curve, and so forth. For instance, a start point and an end point of the selection gesture do not coincide, and the selection gesture does not intersect itself. -
Step 1902 generates a selection shape based on a direction of the selection gesture. Generally, the selection shape corresponds to a closed shape, such as a square, a rectangle, a closed curve (e.g., a circle), an irregular closed shape, and so forth. As illustrated in the scenarios described above, the shape of the selection shape depends on an orientation in which the selection gesture is applied relative to a document in which the selection gesture is applied. Further, the size of the selection shape is determined based on the length of the selection gesture. For instance, the size of the selection shape increases with an increase in length of the selection gesture. - In at least some implementations, generating a selection shape includes generating an auto-complete line between opposite ends of the selection gesture. For instance, if the selection gesture is an open curve (e.g., an arc), an auto-complete line is automatically drawn between opposite ends of the open curve to generate a closed curve.
-
Step 1904 ascertains a release event for the selection gesture. The release event, for instance, corresponds to a user removing the pen from proximity to the display surface. Alternatively or additionally, the release event represents the user releasing the pen mode button 126. -
Step 1906 causes an automatic selection of content within the selection shape responsive to the release event. For example, theink module 118 detects the release event and automatically converts the selection shape into a selection of content encompassed by the selection shape. -
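Step 1906's automatic conversion of the selection shape into a selection of encompassed content could be sketched, for an axis-aligned rectangular shape and illustrative bounding boxes, as:

```python
def select_content(items, rect):
    """Select the content items whose bounding boxes fall entirely inside
    the selection shape (an axis-aligned rectangle in this sketch).
    items: {name: (x1, y1, x2, y2)}; rect: (x1, y1, x2, y2)."""
    rx1, ry1, rx2, ry2 = rect
    return [name for name, (x1, y1, x2, y2) in items.items()
            if x1 >= rx1 and y1 >= ry1 and x2 <= rx2 and y2 <= ry2]
```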
Step 1908 causes an action to be performed utilizing the selected content. Theink module 118, for example, causes the selected content to be copied, pasted, shared, populated to an ink note, and so forth. In at least some implementations, theink module 118 causes the selected content to be propagated to another functionality, such as anapplication 106. Examples of other actions that can be applied to selected content are detailed elsewhere herein. - The next portion of this discussion presents example implementation scenarios and an example procedure for ink notes in accordance with various implementations. Generally, ink notes provide ways of preserving ink as notes that can be saved, shared, and accessed in various ways.
-
FIG. 20 depicts anexample implementation scenario 2000 for ink notes in accordance with one or more implementations. Thescenario 2000, for example, represents a continuation of the scenarios described above. The upper portion of thescenario 2000 includes theGUI 302 with thedocument 304 and the ink menu 802 (introduced above) displayed on thedisplay 110. - In the upper portion of the
scenario 2000, a user selects theink note control 822. For instance, the user taps theink note control 822, and/or drags theink note control 822 from theink menu 802 into the body of thedocument 304. - Proceeding to the lower portion of the
scenario 2000, and responsive to the user selection of theink note control 822, anink note 2002 is presented in theGUI 302. Generally, theink note 2002 represents an electronic canvas on which notes can be applied using ink. A user then appliesink content 2004 to theink note 2002. - In at least some implementations, the
scenario 2000 occurs while theGUI 302 is in a transient ink mode. Accordingly, ink content applied to thedocument 304 itself will behave according to the transient ink mode. However, ink content applied within theink note 2002 behaves according to an ink note mode. Thus, theink note 2002 represents a separate inking environment from thedocument 304, and thus different behaviors apply to theink content 2004 than to ink content within thedocument 304. - The
ink note 2002 includes asave control 2006 and ashare control 2008. According to various implementations, selecting thesave control 2006 causes theink content 2004 to be saved to a particular location, such as a pre-specified data storage location. In at least some implementations, a single selection of thesave control 2006 causes theink content 2004 to be saved and theink note 2002 to be removed from display such that a user may return to interacting with thedocument 304. - The
share control 2008 is selectable to share theink content 2004, such as with another user. For instance, selecting theshare control 2008 causes theink content 2004 to be automatically propagated to a message, such as the body of an email message, an instant message, a text message, and so forth. A user may then address and send the message to one or more users. Alternatively or additionally, selecting theshare control 2008 may cause theink content 2004 to be posted to a web-based venue, such as a social networking service, a blog, a website, and so forth. According to various implementations, functionality of theshare control 2008 is user configurable such that a user may specify behaviors caused by selection of theshare control 2008. -
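The save and share behaviors of the ink note controls described above might be sketched as follows; the storage and message abstractions are illustrative assumptions, not the disclosure's API:

```python
def handle_ink_note_control(control, ink_content, storage, outbox):
    """Sketch of the save control 2006 and share control 2008 behaviors.
    Saving writes the note's ink to a pre-specified storage location and
    dismisses the note; sharing drafts a message containing the ink."""
    if control == "save":
        storage.append(ink_content)           # save to the pre-specified location
        return "note_dismissed"               # a single tap saves and removes the note
    if control == "share":
        outbox.append({"body": ink_content})  # e.g., the body of an email draft
        return "message_drafted"
    raise ValueError("unknown control: " + control)
```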
FIG. 21 depicts anexample implementation scenario 2100 for ink notes in accordance with one or more implementations. Thescenario 2100, for example, represents a continuation and/or variation of the scenarios described above. The upper portion of thescenario 2100 includes theGUI 302 with thedocument 304 and the ink menu 802 (introduced above) displayed on thedisplay 110. - In the upper portion of the
scenario 2100, a user applies ink content 2102 to the document 304 and then applies a selection action 2104 to the ink content 2102. In this particular example, the selection action 2104 is implemented as inking a closed loop around the ink content 2102. Other selection actions may be utilized, however, such as techniques for ink for selection described above. -
scenario 2100, the user selects theink note control 822, such as by tapping on theink note control 822 and/or dragging theink note control 822 out of theink menu 802. In response, anink note 2106 is automatically generated and populated with theink content 2102. Theink module 118, for example, detects that theink content 2102 is selected via theselection action 2104, and thus populates theink content 2102 to theink note 2106. Thus, theselection action 2104 followed by the selection of theink note control 822 is interpreted as a command to generate theink note 2106 and populate theink note 2106 with theink content 2102. - In this particular example, the ink content is moved (e.g., cut and paste) from the body of the
document 304 into the ink note 2106. In some alternative implementations, the ink content is copied into the ink note 2106 such that the ink content 2102 remains in the body of the document 304. The user may then save the ink note 2106 by selecting the save control 2006, and may share the ink content 2102 by selecting the share control 2008. Example attributes and actions of the save control 2006 and the share control 2008 are described above. - While the
scenario 2100 is discussed with reference to populating the ink content 2102 to the ink note 2106, it is to be appreciated that a wide variety of other content may be populated to the ink note 2106. For instance, a user may select a portion of the primary content 306 from the document 304 (e.g., text content), and a subsequent selection of the ink note control 822 would cause an ink note to be generated and populated with the selected primary content. As another example implementation, a combination of ink content and primary content can be selected, and a subsequent selection of the ink note control would cause an ink note to be generated and populated with both the selected ink content and primary content. - In at least some implementations, selection of content to be populated to an ink note is performed utilizing techniques for ink for selection described above. Thus, populating selected content to an ink note as described in this section represents an action that can be performed utilizing selected content, as described above with reference to step 1908 of
FIG. 19. - The scenarios described above generally describe that ink note functionality is invocable via selection of the
ink note control 822. In additional or alternative implementations, ink note functionality is invocable in other ways, such as in response to a dragging gesture from anywhere within the ink menu 802 into the body of the document 304, a custom gesture applied anywhere within the GUI 302, a gesture involving a combination of finger touch input and pen input, a voice command, a touchless gesture, and so forth. - Thus, the
scenario 2100 illustrates that techniques discussed herein reduce the number of user interactions required to propagate content to a note (e.g., an ink note), since a user may simply select existing content and invoke ink note functionality to populate the existing content to an ink note. -
FIG. 22 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for generating an ink note in accordance with one or more implementations. In at least some implementations, the method represents an extension of the methods described above. -
Step 2200 detects a user selection of content. The ink module 118, for example, ascertains that a user selects a portion of ink content, primary content, combinations thereof, and so forth. In at least some implementations, content may be selected via techniques for ink for selection, described above. -
Step 2202 ascertains that an ink note functionality is invoked. Various ways of invoking ink note functionality are detailed above. -
Step 2204 populates the selected content to an ink note. For example, the ink module 118 generates an ink note and populates (e.g., copies or moves) the selected content to the ink note. In at least some implementations, the ink note is generated and the selected content is populated to the ink note automatically and in response to a single user invocation of ink note functionality, e.g., a single user action. -
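By way of illustration, steps 2200-2204 can be sketched as follows. This is a minimal, hypothetical sketch; the class and function names are illustrative only and not part of the implementations described herein.

```python
# Hypothetical sketch of steps 2200-2204: detect a selection of content,
# then generate an ink note and populate the selection into it.

class Document:
    def __init__(self, content):
        self.content = list(content)  # ink strokes and/or primary content
        self.selection = []

    def select(self, items):
        # Step 2200: a user selection of content is detected.
        self.selection = [item for item in items if item in self.content]

def invoke_ink_note(document, move=True):
    """Steps 2202-2204: ink note functionality is invoked and the selected
    content is populated (moved or copied) into a newly generated note."""
    note = {"items": list(document.selection)}
    if move:
        # Default in the scenario 2100: cut the content out of the document body.
        document.content = [c for c in document.content if c not in note["items"]]
    return note
```

The `move` flag mirrors the move-versus-copy distinction discussed in the scenario 2100: `move=False` leaves the original content in the document body.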
Step 2206 performs an action in relation to the ink note in response to user input. For instance, a user provides input that causes the ink note to be saved, to be shared, to be deleted, and so forth. Examples of user input include user selection of a selectable control, a user applying an ink and/or touch gesture, input via an input mechanism 112, and so forth. - The next portion of this discussion presents example implementation scenarios and example procedures for ink for commanding in accordance with various implementations. Generally, ink for commanding provides ways of causing various commands to be performed in response to ink input.
-
FIG. 23 depicts an example implementation scenario 2300 for ink for commanding in accordance with one or more implementations. The scenario 2300, for example, represents a continuation of the scenarios described above. The upper portion of the scenario 2300 includes the GUI 1302 with the document 1304 and the ink menu 802 (introduced above) displayed on the display 110. - In the upper portion of the
scenario 2300, the content 1312 is selected as a selection 2302, such as using techniques for ink for selection described above. Further, notice that the pen 124 is hovered above the surface of the display 110 and that a hover target 2304 is displayed beneath the tip of the pen 124. The hover target 2304 includes the visual icon presented for the command control 818 of the ink menu 802, thus providing a visual affordance that a command mode is currently active. According to various implementations, the command mode can be activated in various ways, such as in response to a user selection of the command control 818, a user selection of the pen mode button 126, and so forth. - Proceeding to the lower portion of the
scenario 2300, a user applies ink within the selection 2302 to write a command 2306. In this particular example, the command 2306 includes the term “Reminder,” which is interpreted in the command mode as a command to generate a reminder based on the content 1312 included in the selection 2302. The scenario 2300 then proceeds to a scenario 2400. -
FIG. 24 depicts an example implementation scenario 2400 for ink for commanding in accordance with one or more implementations. The scenario 2400, for example, represents a continuation of the scenario 2300 described above. The upper portion of the scenario 2400 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110. Further included is the content 1312 selected via the selection 2302, and the command 2306 inked within the selection 2302. As referenced above, the command 2306 is recognized (e.g., by the ink module 118) as a command to generate a reminder based on information from the selected content 1312. - In the upper portion of the
scenario 2400, a release event is detected, examples of which are discussed above. In response to the release event and recognition of the command 2306, and proceeding to the lower portion of the scenario 2400, data from the content 1312 is propagated to a calendar 2402 to generate a calendar event 2404. The calendar 2402, for instance, represents a GUI for a calendar application that represents an instance of the applications 106. - As illustrated, the
content 1312 presents information about an upcoming event. Thus, information about the upcoming event is ascertained from the content 1312 and utilized to generate the calendar event 2404. For instance, the ink module 118 recognizes that characters displayed as part of the content 1312 are particular words and phrases that have particular meanings (e.g., date, time, location, etc.), and thus are used to populate relevant fields of the calendar event 2404. Alternatively or additionally, metadata for the content 1312 is accessed to ascertain information about the content 1312. These implementations are presented for purpose of example only, and information about the visual object may be ascertained in a variety of different ways. - Adjacent to and/or overlaid on the
calendar 2402 is an ink flag 2406 including a commanding icon. Generally, the ink flag 2406 with the commanding icon presents a visual affordance that a commanding mode is active such that ink input will be interpreted according to the commanding mode. - According to implementations discussed herein, the
calendar event 2404 is generated automatically and in response to the user selecting the content 1312 and writing the command 2306. For instance, no further user input is required after writing the command 2306 for the calendar event 2404 to be generated. In response to the command 2306 being written, for example, the calendar 2402 is automatically launched and the calendar event 2404 is generated independent of user input. -
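One way the word-and-phrase recognition described in the scenario 2400 might map selected content to calendar-event fields is sketched below with simple pattern matching. The regular expressions and field names are assumptions for illustration only, not the recognizer described herein.

```python
import re

# Illustrative sketch: recognized words/phrases in selected content (date,
# time, location) populate corresponding calendar-event fields.

DATE_RE = re.compile(
    r"\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*\.?\s+\d{1,2}\b")
TIME_RE = re.compile(r"\b\d{1,2}(?::\d{2})?\s*(?:am|pm)\b", re.IGNORECASE)
LOC_RE = re.compile(r"\bat\s+([A-Z][\w ]+)")  # assumed "at <Place>" convention

def extract_event_fields(text):
    """Map recognized words and phrases to relevant event fields."""
    event = {}
    if (m := DATE_RE.search(text)):
        event["date"] = m.group(0)
    if (m := TIME_RE.search(text)):
        event["time"] = m.group(0)
    if (m := LOC_RE.search(text)):
        event["location"] = m.group(1).strip()
    return event
```

A real implementation could equally consult metadata for the content, as noted above, rather than parsing displayed characters.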
FIG. 25 depicts an example implementation scenario 2500 for ink for commanding in accordance with one or more implementations. The scenario 2500, for example, represents a continuation and/or variation of the scenarios described above. The upper portion of the scenario 2500 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110. - In the upper portion of the
scenario 2500, the content 1312 is selected as a selection 2502, such as using techniques for ink for selection described above. Further, a user applies ink within the selection 2502 to enter a command 2504 while in a command mode. In this particular example, the command 2504 includes the phrase “email to John Smith.” - In the upper portion of the
scenario 2500, a release event is detected, examples of which are discussed above. In response to the release event and recognition of the command 2504, and proceeding to the lower portion of the scenario 2500, an email message 2506 is generated and populated with information from the content 1312. For example, the ink module 118 parses the command 2504 and recognizes that the term “email” represents a command to generate an email message with selected content of the content 1312. The ink module 118 further recognizes that the phrase “to John Smith” identifies a recipient of the email. Thus, the ink module 118 communicates this information to an email functionality. The email message 2506, for instance, is generated by an email application that represents an instance of the applications 106. For instance, an email address for John Smith is included in contact information for the user, and is thus retrieved by the email application and used to address the email message. - As illustrated, the
content 1312 presents information about an upcoming event. Thus, information about the upcoming event is ascertained from the content 1312 and utilized to populate the email message 2506. Example ways of ascertaining information from the content 1312 are discussed above. - In at least some implementations, the
email message 2506 is automatically generated, populated with information from the content 1312, and sent to the recipient without any further user input after entering the command 2504. - Alternatively, after entry of the
command 2504, the email message 2506 is automatically generated and populated with information from the content 1312. The user is then given the opportunity to view and edit the email message 2506 prior to sending the email message 2506. - While the scenarios 2300-2500 are discussed from the perspective of a selection occurring before a command is entered, it is to be appreciated that the temporal relationship between object selection and commanding may be arranged in various other ways. For instance, a user may first apply ink to write a command, and may subsequently select an object on which the command is to be performed. With reference to the
scenario 2500, for example, the user may first write the command 2504 and then subsequently select the content 1312 to cause the command 2504 to be performed as depicted in the scenario 2500. - Further, it is not required that a command be inked within a selected object. With reference to the
scenario 2500, for example, the command 2504 may be written anywhere within the document 1304 and outside of the selection of the content 1312. In such a scenario, the ink module 118 would recognize that the content 1312 is selected and that the command 2504 is applied, and would cause the command 2504 to be performed utilizing the selected content 1312. In a commanding mode, for instance, a command and a selection are linked regardless of the visual and/or temporal relationship between the command and the selection. -
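The pairing behavior just described, in which a selection and an inked command are linked in whichever order they arrive, can be sketched as a small event handler. The class and handler names below are hypothetical.

```python
# Sketch of commanding-mode pairing: a command and a selection are linked
# regardless of arrival order; execution occurs once both are present.

class CommandingMode:
    def __init__(self):
        self.pending_selection = None
        self.pending_command = None
        self.executed = []

    def on_selection(self, content):
        self.pending_selection = content
        self._maybe_execute()

    def on_command(self, command):
        self.pending_command = command
        self._maybe_execute()

    def _maybe_execute(self):
        # Execute once both halves are present, in either arrival order.
        if self.pending_selection is not None and self.pending_command is not None:
            self.executed.append((self.pending_command, self.pending_selection))
            self.pending_selection = self.pending_command = None
```

Calling `on_selection` then `on_command`, or the reverse, yields the same linked pair.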
FIG. 26 depicts an example implementation scenario 2600 for ink for commanding in accordance with one or more implementations. The scenario 2600, for example, represents a continuation and/or variation of the scenarios described above. The upper portion of the scenario 2600 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110. Adjacent to and/or overlaid on the document 1304 is an ink flag 2602 including a commanding icon, indicating that a commanding mode is active. - Further depicted in the upper portion of the
scenario 2600 is that the user brings the pen 124 in proximity to a command region 2604 of the GUI 1302. The command region 2604, for instance, represents a pre-specified portion of the GUI 1302 that is associated with commanding mode functionality. - Proceeding to the lower portion of the
scenario 2600 and in response to detecting the pen 124 in proximity to the command region 2604, a command field 2606 is presented which includes a prompt 2608 for user input. The prompt 2608, for instance, prompts the user to input a command into the command field 2606. Accordingly, the user enters a command 2610 into the command field 2606. In this particular example, the command 2610 includes an instruction to search for weather on a particular date. The scenario 2600 then proceeds to a scenario 2700. -
FIG. 27 depicts an example implementation scenario 2700 for ink for commanding in accordance with one or more implementations. The scenario 2700, for example, represents a continuation of the scenario 2600 described above. The upper portion of the scenario 2700 includes the GUI 1302 with the document 1304 (introduced above) displayed on the display 110. Further included is the command field 2606 with the command 2610. In the upper portion of the scenario 2700, the user removes the pen 124 from proximity to the display 110 such that a release event is generated. - Proceeding to the lower portion of the
scenario 2700 and in response to detection of the release event, the command 2610 is executed such that command results 2702 are presented on the display 110. The command 2610, for instance, is submitted as a set of search terms to a web search engine, which performs a search using the search terms and returns the command results 2702. The command results 2702 include weather information for the particular date that is retrieved and displayed on the display 110. The document 1304, for instance, is replaced in the display 110 with the weather information. According to various implementations, the command results 2702 are retrieved and displayed automatically and in response to the user entering the command 2610 and removing the pen 124 from proximity to the display 110. For example, the command results 2702 are retrieved and displayed without any further user input after entering the command 2610. - Accordingly, the
scenarios 2600 and 2700 illustrate that the ink module 118 can forward commands to appropriate applications 106 to be recognized and/or performed. - Thus, the scenarios 2300-2700 illustrate that techniques discussed herein can be utilized to recognize various commands and to perform various actions based on the commands. The commands discussed in these scenarios are presented for purpose of example only, and it is to be appreciated that a wide variety of other commands not expressly discussed herein may be employed in accordance with techniques discussed herein.
-
FIG. 28 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for ink for commanding in accordance with one or more implementations. For example, the method describes an example way of performing one or more of the implementation scenarios described above. In at least some implementations, the method represents an extension of the methods described above. -
Step 2800 detects a selection of content. The ink module 118, for instance, detects that the content is selected via input from the pen 124. In at least some implementations, the content is selected via techniques for ink for selection, examples of which are described above. -
Step 2802 ascertains input of a command via freehand input from a pen. For example, the ink module 118 detects that a command is applied via freehand ink input from the pen 124. Example ways of providing and detecting a command are described above. -
Step 2804 causes the command to be executed using the content. The ink module 118, for instance, causes an action to be performed utilizing the content and based on the command. In at least some implementations, the ink module 118 communicates the command and the content and/or attributes of the content to an application 106 to cause the application 106 to perform the command. - Examples of different actions that can be performed based on a command are described above with reference to the scenarios 2300-2700, such as generating a calendar event and/or an email based on content, performing a search (e.g., a web search) based on a command, and so forth. Examples of other actions that can be performed based on a command include sharing selected content to a website (e.g., a social networking site), propagating selected content to a content editing application for editing, saving selected content as a content file, and so forth. Thus, a wide variety of different commands can be interpreted to enable a wide variety of different actions to be performed. In at least some implementations, causing a command to be executed includes causing a visual output of the command to be displayed.
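As a concrete illustration of ascertaining a command (step 2802), a recognized ink phrase such as “Reminder” or “email to John Smith” might be parsed into an action verb plus an optional recipient. The grammar and the contact store below are assumptions for illustration, not the described parser.

```python
import re

# Hypothetical parse of an inked command phrase: the leading verb selects
# the action; an optional "to ..." phrase names a recipient.

COMMAND_RE = re.compile(r"^(?P<verb>\w+)(?:\s+to\s+(?P<recipient>.+))?$",
                        re.IGNORECASE)
CONTACTS = {"John Smith": "jsmith@example.com"}  # assumed contact store

def parse_command(ink_text):
    match = COMMAND_RE.match(ink_text.strip())
    if not match:
        return None
    recipient = match.group("recipient")
    return {
        "verb": match.group("verb").lower(),
        "recipient": recipient,
        # Resolve the recipient against contact information, as in the
        # email scenario above.
        "address": CONTACTS.get(recipient) if recipient else None,
    }
```

The parsed structure could then be forwarded to an appropriate application (step 2804), e.g., an email application for the “email” verb.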
-
FIG. 29 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for ink for commanding in accordance with one or more implementations. For example, the method describes an example way of performing one or more of the implementation scenarios described above. In at least some implementations, the method represents an extension of the methods described above. -
Step 2900 detects a pen in proximity to a designated command region of an input surface. The ink module 118, for example, detects that the pen 124 is in proximity to a command region of the display 110. Generally, the command region represents a pre-specified region of a display and/or a GUI that is associated with ink for commanding functionality. -
Step 2902 causes a command field to be presented. The command field generally represents a GUI region in which commands can be entered. For instance, the ink module 118 causes the command field to be displayed automatically and in response to detecting the pen 124 in proximity to the command region. In at least some implementations, the command field is presented in response to a hover of the pen 124 over the input surface and prior to the pen 124 touching the input surface. Thus, the command field may be presented independent of a user selection to invoke the command field. -
Step 2904 ascertains user input of a command to the command field. The ink module 118, for example, detects a command term and/or phrase entered into the command field. In at least some implementations, natural language processing may be employed to parse a command and correlate command terminology to particular machine-based commands. - Step 2906 causes the command to be performed. For instance, the
ink module 118 performs one or more aspects of the command. Alternatively or additionally, the ink module 118 forwards the command to an application 106 to cause the application 106 to perform (e.g., execute) the command. In at least some implementations, causing the command to be performed causes a visual display of command results. Alternatively or additionally, causing a command to be performed causes a reconfiguration of a device state (e.g., of the client device 102), such as to change device settings, to change application state, to mitigate errors and/or device malfunctioning, and so forth. - According to one or more implementations, a command is performed utilizing selected content, such as content selected via techniques for ink for selection described above.
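The event flow of steps 2900-2906 can be sketched as a small state machine: hovering over the command region presents the field before the pen touches the surface, and a release event triggers execution. The names below are illustrative only.

```python
# Hypothetical sketch of steps 2900-2906: proximity shows the command field,
# ink accumulates as command text, and a release event executes the command.

class CommandRegion:
    def __init__(self, executor):
        self.executor = executor  # e.g., forwards the command to an application
        self.field_visible = False
        self.buffer = ""

    def on_hover(self):
        # Steps 2900-2902: show the field on proximity alone, with no tap.
        self.field_visible = True

    def on_ink(self, recognized_text):
        # Step 2904: recognized ink accumulates as the command text.
        self.buffer += recognized_text

    def on_release(self):
        # Step 2906: execute the command on release, then reset the field.
        result = self.executor(self.buffer)
        self.field_visible = False
        self.buffer = ""
        return result
```

For example, `CommandRegion(lambda cmd: submit_web_search(cmd))` would model the weather-search behavior of the scenarios 2600-2700, with `submit_web_search` standing in for a search functionality.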
- The next portion of this discussion presents example implementation scenarios and example procedures for ink for recognition in accordance with various implementations. Generally, ink for recognition provides ways of converting ink input into machine-coded characters and shapes.
-
FIG. 30 depicts an example implementation scenario 3000 for ink for shape recognition in accordance with one or more implementations. The scenario 3000, for example, represents a continuation of the scenarios described above. The upper portion of the scenario 3000 includes a GUI 3002 with a document 3004 and the ink menu 802 (introduced above) displayed on the display 110. Displayed within the document 3004 is primary content 3006, which in this example includes a map of geographic regions. - In the upper portion of the
scenario 3000, a user manipulates the pen 124 to apply ink to draw a freehand shape 3008a and a freehand shape 3008b, which in this example are circles. Further, while the freehand shapes 3008a, 3008b are drawn, a hover target 3010 is displayed beneath the tip of the pen 124, providing a visual affordance that a shape recognition mode is currently active. - Proceeding to the lower portion of the
scenario 3000, the freehand shapes 3008a, 3008b are recognized and converted to corresponding machine-encoded (“encoded”) shapes. According to various implementations, the encoded shapes are displayed in place of the freehand shapes 3008a, 3008b in the document 3004. - Thus, the
scenario 3000 illustrates that in a shape recognition mode, shapes drawn in freehand using ink input can be recognized and converted to corresponding machine-encoded shapes. While the scenario 3000 is discussed with reference to recognition of freehand circles, it is to be appreciated that a wide variety of other freehand shapes may be recognized and converted to machine-encoded shapes, such as triangles, rectangles, parallelograms, irregular shapes, and so forth. -
FIG. 31 depicts an example implementation scenario 3100 for ink for text recognition in accordance with one or more implementations. The scenario 3100, for example, represents a continuation of the scenarios described above. The upper portion of the scenario 3100 includes a GUI 3102 and the ink menu 802 (introduced above) displayed on the display 110. In this particular example, the GUI 3102 represents a GUI for an email application. Depicted within the GUI 3102 is an email inbox 3104 and an email message 3106. - In the upper portion of the
scenario 3100, a user taps the pen 124 within the body of the email message 3106, which causes a text prompt 3108 to be displayed. The text prompt 3108, for instance, represents a visual affordance that a text recognition mode is active, and that text applied using ink will be recognized and positioned starting at the text prompt 3108. For example, prior to tapping the pen 124 within the body of the email message 3106, the user activates a text recognition mode by selecting the text recognition control 810 from the ink menu 802. - Proceeding to the lower portion of the
scenario 3100, the user begins writing freehand text 3110 with the pen 124 within the body of the email message 3106, but in a different region than where the text prompt 3108 is displayed. In response to detecting the pen 124 in proximity to the surface of the display 110, a text guide 3112 is presented as a straight line adjacent to the tip of the pen 124. Generally, the text guide 3112 provides visual assistance to enable a user to orient the characters of the freehand text 3110. According to various implementations, proper orientation of freehand text increases the accuracy of ink-to-text recognition by creating a spacing construct that aids in recognition. -
FIG. 32 depicts an example implementation scenario 3200 for ink for text recognition in accordance with one or more implementations. The scenario 3200, for example, represents a continuation of the scenario 3100 described above. The upper portion of the scenario 3200 includes the GUI 3102 (introduced above) displayed on the display 110. - In the upper portion of the
scenario 3200, portions of the freehand text 3110 are recognized (e.g., via OCR) and converted to machine-encoded (“encoded”) text 3202. As depicted, the encoded text 3202 is placed starting at the original position of the text prompt 3108 in the email message 3106, as depicted in the scenario 3100. Further, the text prompt 3108 moves to indicate where newly recognized text will be presented. - Proceeding to the lower portion of the
scenario 3200, notice that portions of the freehand text 3110 that are recognized and converted to the encoded text 3202 are removed from display. Thus, the user may apply further freehand text 3204 where the removed portions of freehand text were displayed, and/or in other portions of the GUI 3102. Accordingly, visual clutter of the GUI 3102 is reduced and the user is afforded space to enter the additional freehand text 3204. -
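The incremental flow in the scenarios 3100-3200 can be sketched as follows: recognized freehand portions are removed from the ink layer, and their encoded text is appended at the text prompt. The trivial "recognizer" below stands in for real handwriting recognition; all names are illustrative.

```python
# Sketch of incremental ink-to-text conversion: pending strokes are
# recognized, their text is appended at the prompt position, and the ink
# is cleared so the user regains writing space.

class InkToTextSurface:
    def __init__(self):
        self.ink_layer = []  # freehand strokes awaiting recognition
        self.encoded = ""    # machine-encoded text placed at the prompt

    def add_stroke(self, stroke):
        self.ink_layer.append(stroke)

    def commit_recognized(self):
        for stroke in self.ink_layer:
            # Stand-in for OCR: each stroke carries its recognized text.
            self.encoded += stroke["text"]
        self.ink_layer = []  # removing recognized ink reduces visual clutter
```

Repeated `add_stroke`/`commit_recognized` cycles model the user writing, watching the ink convert, and reusing the freed space.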
FIG. 33 depicts an example implementation scenario 3300 for ink for text recognition in accordance with one or more implementations. The scenario 3300, for example, represents a continuation of the scenarios 3100, 3200 described above. The scenario 3300 includes the GUI 3102 (introduced above) displayed on the display 110. - In the
scenario 3300, the freehand text applied in the scenarios 3100, 3200 has been wholly recognized and converted to the encoded text 3202. Further, the user removes the pen 124 from proximity to the surface of the display 110. Accordingly, the text guide 3112 presented in the scenarios 3100, 3200 is removed from display. Should the user apply additional freehand text to the GUI 3102, the text guide 3112 would be redisplayed and the additional freehand text would be recognized and appended to the encoded text 3202. - Thus, the scenarios 3100-3300 illustrate that ink may be applied for text recognition in any portion of a display. For instance, a user is not constrained to entering freehand text in a predefined region, but may simply enter the freehand text in any portion of a display and the freehand text will be converted to encoded text that is populated to a different region of the display. Accordingly, entry of freehand ink text is not constrained to a region in which a recognized and encoded version of the text will be displayed.
-
FIG. 34 depicts an example implementation scenario 3400 for ink for text recognition in accordance with one or more implementations. The scenario 3400, for example, represents a continuation of the scenarios described above. The scenario 3400 includes the GUI 3102 (introduced above) displayed on the display 110. - In the upper portion of the
scenario 3400, a user taps the pen 124 within the inbox 3104, which causes a selection 3402 of the inbox 3104. Further, notice that a hover target 3404 is presented as a letter “T,” indicating that a text recognition mode is currently active. - Proceeding to the lower portion of the
scenario 3400, the user enters freehand text 3406 within the selected inbox 3104. In this particular example, the freehand text 3406 includes the letter “W.” Notice further that in response to detecting the pen 124 in proximity to the display 110 within the inbox 3104, a text guide 3408 is presented adjacent to the tip of the pen 124. The scenario 3400 then proceeds to a scenario 3500. -
FIG. 35 depicts an example implementation scenario 3500 for ink for text recognition in accordance with one or more implementations. The scenario 3500, for example, represents a continuation of the scenarios described above. The scenario 3500 includes the GUI 3102 (introduced above) displayed on the display 110. - In the upper portion of the
scenario 3500, after applying the freehand text 3406, the user removes the pen 124 from proximity to the display 110. Proceeding to the lower portion of the scenario 3500, the freehand text 3406 is recognized as an encoded letter “W.” Thus, email messages listed in the inbox 3104 are processed (e.g., sorted, filtered, and so forth) based on the letter “W.” In this particular example, the email messages are rearranged in descending alphabetical order of sender (“from”) starting with the letter “W.” - Thus, the
scenarios 3400, 3500 illustrate that a recognized ink character can be utilized to process content, such as to sort, filter, and/or search the content. - While the implementation scenarios presented above describe text and shape recognition modes separately, it is to be appreciated that implementations discussed herein support concurrent text and shape recognition. For instance, with reference to the
scenario 3000 described above, if a user were to apply freehand ink text characters to the GUI 3002, the freehand ink text would be recognized and converted into encoded text and displayed within the GUI 3002 along with the encoded shapes. -
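The rearrangement behavior of the scenarios 3400-3500, in which messages are listed in descending sender order starting with a recognized letter, can be sketched as a sort key. This is a hypothetical sketch only; the data layout and key function are assumptions.

```python
# Sketch of processing an inbox by a recognized ink letter: messages are
# rearranged in descending alphabetical order of sender, starting with
# senders at the recognized letter and wrapping the rest to the end.

def rearrange_by_letter(messages, letter):
    upper = letter.upper()

    def key(message):
        sender = message["from"].upper()
        # Senders at or below the letter come first; each group is then
        # sorted in descending alphabetical order via negated char codes.
        return (sender[0] > upper, tuple(-ord(ch) for ch in sender))

    return sorted(messages, key=key)
```

For the letter “W,” senders beginning with “W” lead the list, followed by “T,” “A,” and so on, with senders above “W” moved to the end.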
FIG. 36 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for ink for recognition in accordance with one or more implementations. For example, the method describes an example way of performing one or more of the implementation scenarios described above. In at least some implementations, the method represents an extension of the methods described above. -
Step 3600 detects a pen in proximity to an input surface while the system is in an ink recognition mode. For instance, the ink module 118 receives a notification that the pen 124 is in proximity to the surface of the display 110, such as from the display 110 itself. -
Step 3602 causes a visual affordance identifying the ink recognition mode to be displayed responsive to said detecting. For example, a hover target that identifies a shape recognition mode and/or a text recognition mode is displayed below and/or adjacent to the pen 124. Alternatively or additionally, an icon that represents a shape recognition mode and/or a text recognition mode is displayed as part of an ink flag and/or an ink menu. Examples of different hover targets and icons are discussed above. According to one or more implementations, the visual affordance is removed from display in response to detecting that the pen is removed from proximity to the input surface. -
Step 3604 processes ink content applied to the input surface according to the ink recognition mode. For instance, shapes applied using ink are converted to encoded shapes, and text applied using ink is converted to encoded text. Example ways of processing ink are discussed above and depicted in the accompanying figures. -
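Steps 3602-3604 amount to a small dispatch: the active mode selects both the displayed affordance and the recognizer applied to incoming ink. The mode names, affordance glyphs, and stand-in recognizers below are assumptions for illustration.

```python
# Minimal dispatch sketch: the active ink recognition mode determines the
# hover affordance shown and how applied ink content is processed.

RECOGNIZERS = {
    # Stand-ins for real shape/text recognition.
    "shape": lambda ink: {"kind": "shape", "value": ink["label"]},
    "text": lambda ink: {"kind": "text", "value": ink["label"].upper()},
}

def hover_affordance(mode):
    # Step 3602: e.g., a hover target displayed beneath the pen tip.
    return {"shape": "shape-icon", "text": "T"}[mode]

def process_ink(mode, ink):
    # Step 3604: route the ink to the recognizer for the active mode.
    return RECOGNIZERS[mode](ink)
```

Supporting concurrent text and shape recognition, as noted above, would amount to trying both recognizers on each ink segment rather than dispatching on a single mode.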
FIG. 37 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for ink for text recognition in accordance with one or more implementations. For example, the method describes an example way of performing one or more of the implementation scenarios described above. In at least some implementations, the method represents an extension of the methods described above. -
Step 3700 receives freehand pen input to a region of a display. The ink module 118, for example, ascertains that freehand pen input is applied within a region of a display. According to various implementations, the region of the display is not visually identified as a designated region for receiving pen input, e.g., is not visually distinguished from other regions of the display. For instance, a user may randomly select the region. -
Step 3702 converts the freehand pen input to encoded text. For example, the freehand pen input includes characters that are recognized as particular text characters, and that are converted into encoded text characters. -
Step 3704 populates the encoded text to a different region of the display. The encoded text, for instance, is displayed in a different region of the display than the region in which the freehand pen input was provided. -
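Steps 3700-3704 can be sketched as a single operation on a simple display model: freehand input received in one arbitrary region is converted and its encoded text placed in a different target region. The display model below is a made-up example for illustration.

```python
# Sketch of steps 3700-3704: clear recognized freehand strokes from the
# input region and populate their encoded text into a different region.

def populate_recognized_text(display, input_region, target_region, recognized_text):
    """Place recognized text in the target region, leaving the input
    region free of the freehand strokes that produced it."""
    display[input_region] = []  # freehand strokes cleared once recognized
    display.setdefault(target_region, "")
    display[target_region] += recognized_text
    return display
```

This mirrors the scenarios above, where ink written in a margin-like area lands as encoded text at the prompt in the message body.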
FIG. 38 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method, for instance, describes an example procedure for ink for character recognition in accordance with one or more implementations. For example, the method describes an example way of performing one or more of the implementation scenarios described above. In at least some implementations, the method represents an extension of the methods described above. -
Step 3800 detects a user selection of content. The ink module 118, for example, detects that a user selects a portion of content. With reference to the example scenarios above, the ink module 118 detects that a user selects the inbox 3104. -
Step 3802 processes the content based on a character recognized from freehand pen input of a character. For instance, freehand ink input is recognized as a portion of text, e.g., one or more text characters, and selected content is processed based on the portion of text. Alternatively or additionally, the freehand input may include non-textual characters (e.g., shapes) that are recognized as having particular meanings. For instance, a square may correspond to a particular type of processing, a circle to a different type of processing, a triangle to yet another different type of processing, and so forth. Thus, textual and non-textual characters may be recognized to perform associated processing. Examples of processing the content using the portion of text include searching, filtering, sorting, and so forth, using the portion of text. - In at least some implementations, different portions of content are associated with different types of processing based on text. For instance, consider the email scenarios described with reference to
FIGS. 31-35. In these scenarios, freehand pen input of a text character within the inbox 3104 is interpreted (e.g., by the ink module 118 and/or an application 106) as a command to utilize the text character to perform a specific type of processing on messages stored in the inbox 3104, such as searching, filtering, sorting, and so forth. Further, freehand pen input within the email message 3106 is interpreted as a command to convert the freehand input into encoded text characters, and to populate the encoded text characters into the email message 3106. - Although discussed separately, it is to be appreciated that the implementations, scenarios, and procedures described above can be combined and implemented together in various ways. For instance, the implementations, scenarios, and procedures describe different functionalities of a single integrated inking platform, such as implemented by the
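The region-dependent interpretation described above can be sketched as a simple dispatch. This is a hypothetical illustration; the region names, the shape-to-action table, and the returned action tuples are assumptions made for the example, not the patent's API.

```python
# Map non-textual characters (shapes) to illustrative processing types.
SHAPE_ACTIONS = {"square": "archive", "circle": "flag", "triangle": "defer"}

def process_ink(region, recognized):
    """Dispatch a recognized character or shape based on the region
    in which the freehand input was applied (cf. FIGS. 31-35)."""
    if region == "inbox":
        # A text character in the inbox processes stored messages,
        # e.g., searching, filtering, or sorting by that character.
        return ("filter_messages", recognized)
    if region == "email_message":
        # Ink within a message is converted to encoded text and
        # populated into the message body.
        return ("insert_text", recognized)
    if recognized in SHAPE_ACTIONS:
        # A recognized non-textual character triggers its processing.
        return (SHAPE_ACTIONS[recognized], None)
    return ("ignore", None)

# Usage: the same character "a" means different things per region.
process_ink("inbox", "a")          # filter messages by "a"
process_ink("email_message", "a")  # insert "a" as encoded text
```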
ink module 118. - Having described some example implementation scenarios and procedures for ink modes, consider now a discussion of an example system and device in accordance with one or more embodiments.
- Example System and Device
-
FIG. 39 illustrates an example system generally at 3900 that includes an example computing device 3902 that is representative of one or more computing systems and/or devices that may implement various techniques described herein. For example, the client device 102 discussed above with reference to FIG. 1 can be embodied as the computing device 3902. The computing device 3902 may be, for example, a server of a service provider, a device associated with the client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system. - The
example computing device 3902 as illustrated includes a processing system 3904, one or more computer-readable media 3906, and one or more Input/Output (I/O) Interfaces 3908 that are communicatively coupled, one to another. Although not shown, the computing device 3902 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines. - The
processing system 3904 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 3904 is illustrated as including hardware elements 3910 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 3910 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions. - The computer-readable media 3906 is illustrated as including memory/storage 3912. The memory/storage 3912 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage 3912 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage 3912 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 3906 may be configured in a variety of other ways as further described below.
- Input/output interface(s) 3908 are representative of functionality to allow a user to enter commands and information to
computing device 3902, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice recognition and/or spoken input), a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to detect movement that does not involve touch as gestures), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 3902 may be configured in a variety of ways as further described below to support user interaction. - Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” “entity,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the
computing device 3902. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.” - “Computer-readable storage media” may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Computer-readable storage media do not include signals per se. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- “Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the
computing device 3902, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media. - As previously described,
hardware elements 3910 and computer-readable media 3906 are representative of instructions, modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein. Hardware elements may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware devices. In this context, a hardware element may operate as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element as well as a hardware device utilized to store instructions for execution, e.g., the computer-readable storage media described previously. - Combinations of the foregoing may also be employed to implement various techniques and modules described herein. Accordingly, software, hardware, or program modules and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or
more hardware elements 3910. The computing device 3902 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of modules that are executable by the computing device 3902 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 3910 of the processing system. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 3902 and/or processing systems 3904) to implement techniques, modules, and examples described herein. - As further illustrated in
FIG. 39, the example system 3900 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similarly in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on. - In the
example system 3900, multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link. - In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a class of target devices is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.
- In various implementations, the
computing device 3902 may assume a variety of different configurations, such as for computer 3914, mobile 3916, and television 3918 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 3902 may be configured according to one or more of the different device classes. For instance, the computing device 3902 may be implemented as the computer 3914 class of a device that includes a personal computer, desktop computer, a multi-screen computer, laptop computer, netbook, and so on. - The
computing device 3902 may also be implemented as the mobile 3916 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a wearable device, a multi-screen computer, and so on. The computing device 3902 may also be implemented as the television 3918 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on. - The techniques described herein may be supported by these various configurations of the
computing device 3902 and are not limited to the specific examples of the techniques described herein. For example, functionalities discussed with reference to the client device 102 and/or ink module 118 may be implemented all or in part through use of a distributed system, such as over a “cloud” 3920 via a platform 3922 as described below. - The
cloud 3920 includes and/or is representative of a platform 3922 for resources 3924. The platform 3922 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 3920. The resources 3924 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 3902. Resources 3924 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network. - The
platform 3922 may abstract resources and functions to connect the computing device 3902 with other computing devices. The platform 3922 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 3924 that are implemented via the platform 3922. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 3900. For example, the functionality may be implemented in part on the computing device 3902 as well as via the platform 3922 that abstracts the functionality of the cloud 3920. - Discussed herein are a number of methods that may be implemented to perform techniques discussed herein. Aspects of the methods may be implemented in hardware, firmware, or software, or a combination thereof. The methods are shown as a set of steps that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. Further, an operation shown with respect to a particular method may be combined and/or interchanged with an operation of a different method in accordance with one or more implementations. Aspects of the methods can be implemented via interaction between various entities discussed above with reference to the
environment 100. - Implementations discussed herein include:
- A system for selecting content based on pen input and causing an action to be performed on the selected content, the system including: an input surface; one or more processors; and one or more computer-readable storage media storing computer-executable instructions that, responsive to execution by the one or more processors, cause the system to perform operations including: detecting pen input from a pen to the input surface drawing an open selection gesture; generating a closed selection shape based on a particular direction of the selection gesture relative to the input surface; ascertaining a release event for the selection gesture; causing an automatic selection of content within the selection shape responsive to the release event; and causing an action to be performed utilizing the selected content.
- A system as described in example 1, wherein the selection gesture is different than the selection shape.
- A system as described in one or more of examples 1 or 2, wherein the selection gesture includes a single line, and wherein the selection shape includes a rectangle.
- A system as described in one or more of examples 1-3, wherein the selection gesture includes an open curve, and wherein the selection shape includes a closed curve.
- A system as described in one or more of examples 1-4, wherein the selection shape is generated concurrently with said drawing of the open selection gesture.
- A system as described in one or more of examples 1-5, wherein the release event includes one or more of the pen being removed from proximity to the input surface, or a selection of a pen button on the pen.
- A system as described in one or more of examples 1-6, wherein the selection gesture includes an open curve, and wherein the operations further include automatically drawing an auto-complete line between respective ends of the open curve to generate the selection shape as a closed curve.
- A system as described in one or more of examples 1-7, wherein the operations further include presenting a visual affordance that the system is in an ink selection mode, the visual affordance including one or more of an ink flag or a hover target.
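Examples 3 and 7 above can be illustrated geometrically. The sketch below assumes a selection gesture is a list of (x, y) points; that representation, and the function names, are assumptions made for illustration, not the claimed generation method.

```python
def close_selection_shape(points):
    """Example 7 sketch: auto-complete a line between the respective
    ends of an open curve to generate a closed selection shape."""
    if len(points) < 2 or points[0] == points[-1]:
        return points  # degenerate or already closed
    return points + [points[0]]  # auto-complete line back to the start

def rectangle_from_line(p0, p1):
    """Example 3 sketch: a single drawn line yields a rectangular
    selection shape with the line's endpoints as opposite corners."""
    (x0, y0), (x1, y1) = p0, p1
    return [(x0, y0), (x1, y0), (x1, y1), (x0, y1), (x0, y0)]

# Usage: an open three-point curve is closed back to its start, and a
# diagonal line expands into a rectangle.
shape = close_selection_shape([(0, 0), (5, 0), (5, 5)])
rect = rectangle_from_line((0, 0), (4, 3))
```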
A computer-implemented method for causing a command to be performed based on freehand input of the command, the method including: detecting a selection of content; ascertaining, by logic executed via a computing system, input of a command via freehand input of characters from a pen to an input surface of the computing system; and causing the command to be executed by the computing system using the content. -
- A computer-implemented method as described in example 9, wherein the freehand input is applied at least partially within the selection.
- A computer-implemented method as described in one or more of examples 9 or 10, wherein the content is selected after the input of the command.
- A computer-implemented method as described in one or more of examples 9-11, wherein the freehand input includes one or more of a word or phrase that is recognized by the computing system as the command.
- A computer-implemented method as described in one or more of examples 9-12, wherein the command includes a command to generate a reminder, and wherein said causing the command to be executed includes causing a reminder to be generated using at least some of the content.
- A computer-implemented method as described in one or more of examples 9-13, wherein the command includes a command to share the content, and wherein said causing the command to be executed includes causing at least some of the content to be processed into a shareable form.
A computer-implemented method for processing ink input in an ink recognition mode, the method including: detecting a pen in proximity to an input surface of a computing system while the computing system is in an ink recognition mode; causing by the computing system a visual affordance identifying the ink recognition mode to be displayed responsive to said detecting; and processing by the computing system ink content applied to the input surface according to the ink recognition mode. -
- A computer-implemented method as described in example 15, wherein the visual affordance includes one or more of an ink flag or a hover target.
A computer-implemented method as described in one or more of examples 15 or 16, wherein the recognition mode includes a shape recognition mode, and wherein said processing includes processing the ink content to generate encoded non-text shapes. -
A computer-implemented method as described in one or more of examples 15-17, wherein the recognition mode includes a shape recognition mode, and wherein said processing includes processing the ink content to generate encoded non-text shapes. -
- A computer-implemented method as described in one or more of examples 15-18, wherein the input surface includes a display, the ink content includes freehand input to a particular region of the display that is not visually identified as a designated region for receiving input, and wherein the method further includes: converting the freehand input into encoded text; and populating the encoded text to a different region of the display.
- A computer-implemented method as described in one or more of examples 15-19, wherein the input surface includes a display, the ink content includes freehand input of a character to a particular region of the display, and wherein the method further includes processing selected content based on a character recognized from the freehand input.
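Examples 15-20 describe an ink recognition mode whose visual affordance appears on pen proximity. A minimal sketch follows, assuming a simple surface object; the class and method names are hypothetical, and a string transform stands in for real text recognition.

```python
class InkRecognitionSurface:
    """Hypothetical input surface operating in an ink recognition mode."""

    def __init__(self, mode="text"):
        self.mode = mode             # active recognition mode
        self.affordance_shown = False

    def on_pen_proximity(self):
        """Display a visual affordance (e.g., an ink flag or hover
        target) identifying the active recognition mode."""
        self.affordance_shown = True
        return f"ink-flag:{self.mode}"

    def process(self, ink):
        """Process applied ink content according to the active mode."""
        if self.mode == "text":
            return ("encoded_text", ink.upper())  # stand-in for recognition
        return ("encoded_shape", ink)             # shape recognition mode

# Usage: proximity shows the affordance; applied ink is then processed.
surface = InkRecognitionSurface(mode="text")
flag = surface.on_pen_proximity()
result = surface.process("abc")
```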
- Conclusion
- Techniques for ink modes are described. Although embodiments are described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.
Claims (20)
1. A system comprising:
an input surface;
one or more processors; and
one or more computer-readable storage media storing computer-executable instructions that, responsive to execution by the one or more processors, cause the system to perform operations including:
detecting pen input from a pen to the input surface drawing an open selection gesture;
generating a closed selection shape based on a particular direction of the selection gesture relative to the input surface;
ascertaining a release event for the selection gesture;
causing an automatic selection of content within the selection shape responsive to the release event; and
causing an action to be performed utilizing the selected content.
2. The system as described in claim 1, wherein the selection gesture is different than the selection shape.
3. The system as described in claim 1, wherein the selection gesture comprises a single line, and wherein the selection shape comprises a rectangle.
4. The system as described in claim 1, wherein the selection gesture comprises an open curve, and wherein the selection shape comprises a closed curve.
5. The system as described in claim 1, wherein the selection shape is generated concurrently with said drawing of the open selection gesture.
6. The system as described in claim 1, wherein the release event comprises one or more of the pen being removed from proximity to the input surface, or a selection of a pen button on the pen.
7. The system as described in claim 1, wherein the selection gesture comprises an open curve, and wherein the operations further include automatically drawing an auto-complete line between respective ends of the open curve to generate the selection shape as a closed curve.
8. The system as described in claim 1, wherein the operations further include presenting a visual affordance that the system is in an ink selection mode, the visual affordance comprising one or more of an ink flag or a hover target.
9. A computer-implemented method, comprising:
detecting a selection of content;
ascertaining, by logic executed via a computing system, input of a command via freehand input of characters from a pen to an input surface of the computing system; and
causing the command to be executed by the computing system using the content.
10. A computer-implemented method as recited in claim 9, wherein the freehand input is applied at least partially within the selection.
11. A computer-implemented method as recited in claim 9, wherein the content is selected after the input of the command.
12. A computer-implemented method as recited in claim 9, wherein the freehand input comprises one or more of a word or phrase that is recognized by the computing system as the command.
13. A computer-implemented method as recited in claim 9, wherein the command comprises a command to generate a reminder, and wherein said causing the command to be executed comprises causing a reminder to be generated using at least some of the content.
14. A computer-implemented method as recited in claim 9, wherein the command comprises a command to share the content, and wherein said causing the command to be executed comprises causing at least some of the content to be processed into a shareable form.
15. A computer-implemented method, comprising:
detecting a pen in proximity to an input surface of a computing system while the computing system is in an ink recognition mode;
causing by the computing system a visual affordance identifying the ink recognition mode to be displayed responsive to said detecting; and
processing by the computing system ink content applied to the input surface according to the ink recognition mode.
16. A computer-implemented method as recited in claim 15, wherein the visual affordance comprises one or more of an ink flag or a hover target.
17. A computer-implemented method as recited in claim 15, wherein the recognition mode comprises a shape recognition mode, and wherein said processing comprises processing the ink content to generate encoded non-text shapes.
18. A computer-implemented method as recited in claim 15, wherein the recognition mode comprises a shape recognition mode, and wherein said processing comprises processing the ink content to generate encoded non-text shapes.
19. A computer-implemented method as recited in claim 15, wherein the input surface comprises a display, the ink content comprises freehand input to a particular region of the display that is not visually identified as a designated region for receiving input, and wherein the method further comprises:
converting the freehand input into encoded text; and
populating the encoded text to a different region of the display.
20. A computer-implemented method as recited in claim 15, wherein the input surface comprises a display, the ink content comprises freehand input of a character to a particular region of the display, and wherein the method further comprises processing selected content based on a character recognized from the freehand input.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/665,330 US20150338939A1 (en) | 2014-05-23 | 2015-03-23 | Ink Modes |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462002648P | 2014-05-23 | 2014-05-23 | |
US14/665,330 US20150338939A1 (en) | 2014-05-23 | 2015-03-23 | Ink Modes |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150338939A1 true US20150338939A1 (en) | 2015-11-26 |
Family
ID=54556059
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/665,282 Active US9990059B2 (en) | 2014-05-23 | 2015-03-23 | Ink modes |
US14/665,369 Active 2035-06-12 US10275050B2 (en) | 2014-05-23 | 2015-03-23 | Ink for a shared interactive space |
US14/665,330 Abandoned US20150338939A1 (en) | 2014-05-23 | 2015-03-23 | Ink Modes |
US14/665,462 Abandoned US20150338940A1 (en) | 2014-05-23 | 2015-03-23 | Pen Input Modes for Digital Ink |
US14/665,413 Abandoned US20150339050A1 (en) | 2014-05-23 | 2015-03-23 | Ink for Interaction |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/665,282 Active US9990059B2 (en) | 2014-05-23 | 2015-03-23 | Ink modes |
US14/665,369 Active 2035-06-12 US10275050B2 (en) | 2014-05-23 | 2015-03-23 | Ink for a shared interactive space |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/665,462 Abandoned US20150338940A1 (en) | 2014-05-23 | 2015-03-23 | Pen Input Modes for Digital Ink |
US14/665,413 Abandoned US20150339050A1 (en) | 2014-05-23 | 2015-03-23 | Ink for Interaction |
Country Status (1)
Country | Link |
---|---|
US (5) | US9990059B2 (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160048318A1 (en) * | 2014-08-15 | 2016-02-18 | Microsoft Technology Licensing, Llc | Detecting selection of digital ink |
US9990059B2 (en) | 2014-05-23 | 2018-06-05 | Microsoft Technology Licensing, Llc | Ink modes |
US20180276858A1 (en) * | 2017-03-22 | 2018-09-27 | Microsoft Technology Licensing, Llc | Digital Ink Based Visual Components |
US20180329610A1 (en) * | 2017-05-15 | 2018-11-15 | Microsoft Technology Licensing, Llc | Object Selection Mode |
US10228775B2 (en) * | 2016-01-22 | 2019-03-12 | Microsoft Technology Licensing, Llc | Cross application digital ink repository |
US10366153B2 (en) * | 2003-03-12 | 2019-07-30 | Microsoft Technology Licensing, Llc | System and method for customizing note flags |
US10599320B2 (en) | 2017-05-15 | 2020-03-24 | Microsoft Technology Licensing, Llc | Ink Anchoring |
US11361153B1 (en) | 2021-03-16 | 2022-06-14 | Microsoft Technology Licensing, Llc | Linking digital ink instances using connecting lines |
US11372486B1 (en) | 2021-03-16 | 2022-06-28 | Microsoft Technology Licensing, Llc | Setting digital pen input mode using tilt angle |
US11435893B1 (en) | 2021-03-16 | 2022-09-06 | Microsoft Technology Licensing, Llc | Submitting questions using digital ink |
US20220391084A1 (en) * | 2019-09-25 | 2022-12-08 | Zhangyue Technology Co., Ltd | Information display method, reader, computer storage medium, ink screen reading device and screen projection display system |
US11526659B2 (en) | 2021-03-16 | 2022-12-13 | Microsoft Technology Licensing, Llc | Converting text to digital ink |
US11605187B1 (en) * | 2020-08-18 | 2023-03-14 | Corel Corporation | Drawing function identification in graphics applications |
US20230315271A1 (en) * | 2022-03-18 | 2023-10-05 | Sony Group Corporation | Collaborative whiteboard for meetings |
US11875543B2 (en) | 2021-03-16 | 2024-01-16 | Microsoft Technology Licensing, Llc | Duplicating and aggregating digital ink instances |
Families Citing this family (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9436659B2 (en) * | 2013-06-21 | 2016-09-06 | 3Rb Llc | Transferring annotations between documents displayed side by side |
CN104102349B (en) * | 2014-07-18 | 2018-04-27 | 北京智谷睿拓技术服务有限公司 | Content share method and device |
JP6074396B2 (en) * | 2014-09-26 | 2017-02-01 | 富士フイルム株式会社 | Layout creation system, server, client, layout creation method, program, and recording medium |
JP6488653B2 (en) * | 2014-11-07 | 2019-03-27 | セイコーエプソン株式会社 | Display device, display control method, and display system |
CN105988568B (en) * | 2015-02-12 | 2020-07-24 | 北京三星通信技术研究有限公司 | Method and device for acquiring note information |
CN106325471B (en) * | 2015-06-19 | 2020-03-24 | 联想(北京)有限公司 | Device and control method |
US9898865B2 (en) * | 2015-06-22 | 2018-02-20 | Microsoft Technology Licensing, Llc | System and method for spawning drawing surfaces |
KR20170001036A (en) * | 2015-06-25 | 2017-01-04 | 엘지전자 주식회사 | Electronic device and control method of the same |
US10380235B2 (en) * | 2015-09-01 | 2019-08-13 | Branchfire, Inc. | Method and system for annotation and connection of electronic documents |
TWM517860U (en) * | 2015-10-16 | 2016-02-21 | 翰碩電子股份有限公司 | Capacitive stylus with eraser |
US20170236318A1 (en) * | 2016-02-15 | 2017-08-17 | Microsoft Technology Licensing, Llc | Animated Digital Ink |
US10296574B2 (en) | 2016-03-28 | 2019-05-21 | Microsoft Technology Licensing, Llc | Contextual ink annotation in a mapping interface |
US20170277673A1 (en) * | 2016-03-28 | 2017-09-28 | Microsoft Technology Licensing, Llc | Inking inputs for digital maps |
US10838502B2 (en) * | 2016-03-29 | 2020-11-17 | Microsoft Technology Licensing, Llc | Sharing across environments |
WO2017172911A1 (en) * | 2016-03-29 | 2017-10-05 | Google Inc. | System and method for generating virtual marks based on gaze tracking |
US10481682B2 (en) * | 2016-03-29 | 2019-11-19 | Google Llc | System and method for generating virtual marks based on gaze tracking |
KR102520398B1 (en) | 2016-05-18 | 2023-04-12 | 삼성전자주식회사 | Electronic Device and Method for Saving User Data |
KR102536148B1 (en) * | 2016-07-20 | 2023-05-24 | 삼성전자주식회사 | Method and apparatus for operation of an electronic device |
US10871880B2 (en) * | 2016-11-04 | 2020-12-22 | Microsoft Technology Licensing, Llc | Action-enabled inking tools |
US10620725B2 (en) * | 2017-02-17 | 2020-04-14 | Dell Products L.P. | System and method for dynamic mode switching in an active stylus |
CN110546601B (en) * | 2017-04-03 | 2023-09-26 | 索尼公司 | Information processing device, information processing method, and program |
US10469274B2 (en) * | 2017-04-15 | 2019-11-05 | Microsoft Technology Licensing, Llc | Live ink presence for real-time collaboration |
US20180300302A1 (en) * | 2017-04-15 | 2018-10-18 | Microsoft Technology Licensing, Llc | Real-Time Collaboration Live Ink |
US10558853B2 (en) | 2017-05-07 | 2020-02-11 | Massachusetts Institute Of Technology | Methods and apparatus for sharing of music or other information |
US10417310B2 (en) * | 2017-06-09 | 2019-09-17 | Microsoft Technology Licensing, Llc | Content inker |
US10732826B2 (en) | 2017-11-22 | 2020-08-04 | Microsoft Technology Licensing, Llc | Dynamic device interaction adaptation based on user engagement |
KR101886010B1 (en) * | 2017-12-28 | 2018-09-10 | 주식회사 네오랩컨버전스 | Electronic device and Driving method thereof |
US20190325244A1 (en) * | 2018-04-20 | 2019-10-24 | Skipy Interactive Pvt Ltd | System and method to enable creative playing on a computing device |
US10872199B2 (en) | 2018-05-26 | 2020-12-22 | Microsoft Technology Licensing, Llc | Mapping a gesture and/or electronic pen attribute(s) to an advanced productivity action |
CN111385683B (en) * | 2020-03-25 | 2022-01-28 | 广东小天才科技有限公司 | Intelligent sound box application control method and intelligent sound box |
US11372518B2 (en) | 2020-06-03 | 2022-06-28 | Capital One Services, Llc | Systems and methods for augmented or mixed reality writing |
US11429203B2 (en) * | 2020-06-19 | 2022-08-30 | Microsoft Technology Licensing, Llc | Tilt-responsive techniques for digital drawing boards |
US11132104B1 (en) * | 2020-10-05 | 2021-09-28 | Huawei Technologies Co., Ltd. | Managing user interface items in a visual user interface (VUI) |
US11630946B2 (en) * | 2021-01-25 | 2023-04-18 | Microsoft Technology Licensing, Llc | Documentation augmentation using role-based user annotations |
US20220244898A1 (en) * | 2021-02-02 | 2022-08-04 | Honeywell International Inc. | Methods and systems for propagating user inputs to different displays |
US11797635B2 (en) * | 2021-04-02 | 2023-10-24 | Relativity Oda Llc | Systems and methods for pre-loading object models |
US20230353611A1 (en) * | 2022-04-29 | 2023-11-02 | Zoom Video Communications, Inc. | Outputs from persistent hybrid collaborative workspaces |
KR20240020926A (en) * | 2022-08-09 | 2024-02-16 | 삼성전자주식회사 | Electronic device for receiving touch input and method for controlling the same |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030214536A1 (en) * | 2002-05-14 | 2003-11-20 | Microsoft Corporation | Lasso select |
US20040017375A1 (en) * | 2002-07-29 | 2004-01-29 | Microsoft Corporation | In-situ digital inking for applications |
US20060018546A1 (en) * | 2004-07-21 | 2006-01-26 | Hewlett-Packard Development Company, L.P. | Gesture recognition |
US20060267967A1 (en) * | 2005-05-24 | 2006-11-30 | Microsoft Corporation | Phrasing extensions and multiple modes in one spring-loaded control |
US20070046649A1 (en) * | 2005-08-30 | 2007-03-01 | Bruce Reiner | Multi-functional navigational device and method |
US20090327501A1 (en) * | 2008-06-27 | 2009-12-31 | Athellina Athsani | Communication access control system and method |
US20110307783A1 (en) * | 2010-06-11 | 2011-12-15 | Disney Enterprises, Inc. | System and method enabling visual filtering of content |
US20140055399A1 (en) * | 2012-08-27 | 2014-02-27 | Samsung Electronics Co., Ltd. | Method and apparatus for providing user interface |
US20150009154A1 (en) * | 2013-07-08 | 2015-01-08 | Acer Incorporated | Electronic device and touch control method thereof |
US20150127681A1 (en) * | 2013-08-13 | 2015-05-07 | Samsung Electronics Co., Ltd. | Electronic device and search and display method of the same |
US20150261378A1 (en) * | 2014-03-14 | 2015-09-17 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
Family Cites Families (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5559942A (en) * | 1993-05-10 | 1996-09-24 | Apple Computer, Inc. | Method and apparatus for providing a note for an application program |
US5613019A (en) * | 1993-05-20 | 1997-03-18 | Microsoft Corporation | System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings |
US6362440B1 (en) | 1998-03-27 | 2002-03-26 | International Business Machines Corporation | Flexibly interfaceable portable computing device |
JP3855462B2 (en) | 1998-05-29 | 2006-12-13 | 株式会社日立製作所 | Method for editing command sequence with processing time and apparatus using the same |
US6498601B1 (en) | 1999-11-29 | 2002-12-24 | Xerox Corporation | Method and apparatus for selecting input modes on a palmtop computer |
US7609862B2 (en) * | 2000-01-24 | 2009-10-27 | Pen-One Inc. | Method for identity verification |
US6658147B2 (en) | 2001-04-16 | 2003-12-02 | Parascript Llc | Reshaping freehand drawn lines and shapes in an electronic document |
US7286141B2 (en) | 2001-08-31 | 2007-10-23 | Fuji Xerox Co., Ltd. | Systems and methods for generating and controlling temporary digital ink |
US7096432B2 (en) | 2002-05-14 | 2006-08-22 | Microsoft Corporation | Write anywhere tool |
US20040257346A1 (en) | 2003-06-20 | 2004-12-23 | Microsoft Corporation | Content selection and handling |
US8131647B2 (en) | 2005-01-19 | 2012-03-06 | Amazon Technologies, Inc. | Method and system for providing annotations of a digital work |
CN101576780A (en) | 2005-01-30 | 2009-11-11 | 史翠克有限公司 | Computer mouse peripheral |
US7752561B2 (en) | 2005-03-15 | 2010-07-06 | Microsoft Corporation | Method and system for creating temporary visual indicia |
US7935075B2 (en) | 2005-04-26 | 2011-05-03 | Cardiac Pacemakers, Inc. | Self-deploying vascular occlusion device |
US7526737B2 (en) | 2005-11-14 | 2009-04-28 | Microsoft Corporation | Free form wiper |
US8181103B2 (en) | 2005-12-29 | 2012-05-15 | Microsoft Corporation | Annotation detection and anchoring on ink notes |
US20070156335A1 (en) | 2006-01-03 | 2007-07-05 | Mcbride Sandra Lynn | Computer-Aided Route Selection |
US7839394B2 (en) * | 2007-01-08 | 2010-11-23 | Pegasus Technologies Ltd. | Electronic pen device |
US8194081B2 (en) | 2007-05-29 | 2012-06-05 | Livescribe, Inc. | Animation of audio ink |
US9019245B2 (en) | 2007-06-28 | 2015-04-28 | Intel Corporation | Multi-function tablet pen input device |
US8004498B1 (en) | 2007-10-22 | 2011-08-23 | Adobe Systems Incorporated | Systems and methods for multipoint temporary anchoring |
US8402391B1 (en) * | 2008-09-25 | 2013-03-19 | Apple, Inc. | Collaboration system |
US20130283169A1 (en) | 2012-04-24 | 2013-10-24 | Social Communications Company | Voice-based virtual area navigation |
US9269102B2 (en) | 2009-05-21 | 2016-02-23 | Nike, Inc. | Collaborative activities in on-line commerce |
US8179417B2 (en) * | 2009-07-22 | 2012-05-15 | Hewlett-Packard Development Company, L.P. | Video collaboration |
US20110143769A1 (en) | 2009-12-16 | 2011-06-16 | Microsoft Corporation | Dual display mobile communication device |
US20110166777A1 (en) | 2010-01-07 | 2011-07-07 | Anand Kumar Chavakula | Navigation Application |
US8261213B2 (en) | 2010-01-28 | 2012-09-04 | Microsoft Corporation | Brush, carbon-copy, and fill gestures |
US9323807B2 (en) | 2010-11-03 | 2016-04-26 | Sap Se | Graphical manipulation of data objects |
US20120159351A1 (en) * | 2010-12-21 | 2012-06-21 | International Business Machines Corporation | Multiple reviews of graphical user interfaces |
US9201520B2 (en) * | 2011-02-11 | 2015-12-01 | Microsoft Technology Licensing, Llc | Motion and context sharing for pen-based computing inputs |
WO2013016165A1 (en) * | 2011-07-22 | 2013-01-31 | Social Communications Company | Communicating between a virtual area and a physical space |
US9344684B2 (en) | 2011-08-05 | 2016-05-17 | Honeywell International Inc. | Systems and methods configured to enable content sharing between client terminals of a digital video management system |
US9285903B1 (en) * | 2011-09-28 | 2016-03-15 | Amazon Technologies, Inc. | Stylus and electronic display |
US9948988B2 (en) * | 2011-10-04 | 2018-04-17 | Ricoh Company, Ltd. | Meeting system that interconnects group and personal devices across a network |
US20130205189A1 (en) * | 2012-01-25 | 2013-08-08 | Advanced Digital Systems, Inc. | Apparatus And Method For Interacting With An Electronic Form |
US9557878B2 (en) * | 2012-04-25 | 2017-01-31 | International Business Machines Corporation | Permitting participant configurable view selection within a screen sharing session |
US9876988B2 (en) | 2012-07-13 | 2018-01-23 | Microsoft Technology Licensing, Llc | Video display modification for video conferencing environments |
US20140026076A1 (en) | 2012-07-17 | 2014-01-23 | Jacquilene Jacob | Real-time interactive collaboration system |
US20140047330A1 (en) * | 2012-08-09 | 2014-02-13 | Sap Ag | Collaborative decision making in contract documents |
US20140136985A1 (en) * | 2012-11-12 | 2014-05-15 | Moondrop Entertainment, Llc | Method and system for sharing content |
KR20140065764A (en) * | 2012-11-21 | 2014-05-30 | 한국전자통신연구원 | System and method for function expandable collaboration screen system |
US9389717B2 (en) | 2012-12-14 | 2016-07-12 | Microsoft Technology Licensing, Llc | Reducing latency in ink rendering |
KR101984592B1 (en) * | 2013-01-04 | 2019-05-31 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR20140111497A (en) * | 2013-03-11 | 2014-09-19 | 삼성전자주식회사 | Method for deleting item on touch screen, machine-readable storage medium and portable terminal |
US9304609B2 (en) | 2013-03-12 | 2016-04-05 | Lenovo (Singapore) Pte. Ltd. | Suspending tablet computer by stylus detection |
US9690403B2 (en) | 2013-03-15 | 2017-06-27 | Blackberry Limited | Shared document editing and voting using active stylus based touch-sensitive displays |
US20140282103A1 (en) * | 2013-03-16 | 2014-09-18 | Jerry Alan Crandall | Data sharing |
US20140328505A1 (en) * | 2013-05-02 | 2014-11-06 | Microsoft Corporation | Sound field adaptation based upon user tracking |
US20150033149A1 (en) | 2013-07-23 | 2015-01-29 | Saleforce.com, inc. | Recording and playback of screen sharing sessions in an information networking environment |
US20150052430A1 (en) * | 2013-08-13 | 2015-02-19 | Dropbox, Inc. | Gestures for selecting a subset of content items |
US10044979B2 (en) | 2013-08-19 | 2018-08-07 | Cisco Technology, Inc. | Acquiring regions of remote shared content with high resolution |
JP2015049604A (en) * | 2013-08-30 | 2015-03-16 | 株式会社東芝 | Electronic apparatus and method for displaying electronic document |
US9575948B2 (en) * | 2013-10-04 | 2017-02-21 | Nook Digital, Llc | Annotation of digital content via selective fixed formatting |
US10282056B2 (en) | 2013-12-24 | 2019-05-07 | Dropbox, Inc. | Sharing content items from a collection |
US9544257B2 (en) | 2014-04-04 | 2017-01-10 | Blackberry Limited | System and method for conducting private messaging |
US9268928B2 (en) | 2014-04-06 | 2016-02-23 | International Business Machines Corporation | Smart pen system to restrict access to security sensitive devices while continuously authenticating the user |
US20150304376A1 (en) | 2014-04-17 | 2015-10-22 | Shindig, Inc. | Systems and methods for providing a composite audience view |
US9906614B2 (en) | 2014-05-05 | 2018-02-27 | Adobe Systems Incorporated | Real-time content sharing between browsers |
US9990059B2 (en) | 2014-05-23 | 2018-06-05 | Microsoft Technology Licensing, Llc | Ink modes |
- 2015
- 2015-03-23 US US14/665,282 patent/US9990059B2/en active Active
- 2015-03-23 US US14/665,369 patent/US10275050B2/en active Active
- 2015-03-23 US US14/665,330 patent/US20150338939A1/en not_active Abandoned
- 2015-03-23 US US14/665,462 patent/US20150338940A1/en not_active Abandoned
- 2015-03-23 US US14/665,413 patent/US20150339050A1/en not_active Abandoned
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10366153B2 (en) * | 2003-03-12 | 2019-07-30 | Microsoft Technology Licensing, Llc | System and method for customizing note flags |
US10275050B2 (en) | 2014-05-23 | 2019-04-30 | Microsoft Technology Licensing, Llc | Ink for a shared interactive space |
US9990059B2 (en) | 2014-05-23 | 2018-06-05 | Microsoft Technology Licensing, Llc | Ink modes |
US20160048318A1 (en) * | 2014-08-15 | 2016-02-18 | Microsoft Technology Licensing, Llc | Detecting selection of digital ink |
US10228775B2 (en) * | 2016-01-22 | 2019-03-12 | Microsoft Technology Licensing, Llc | Cross application digital ink repository |
US10930045B2 (en) * | 2017-03-22 | 2021-02-23 | Microsoft Technology Licensing, Llc | Digital ink based visual components |
US20180276858A1 (en) * | 2017-03-22 | 2018-09-27 | Microsoft Technology Licensing, Llc | Digital Ink Based Visual Components |
US20180329610A1 (en) * | 2017-05-15 | 2018-11-15 | Microsoft Technology Licensing, Llc | Object Selection Mode |
US10599320B2 (en) | 2017-05-15 | 2020-03-24 | Microsoft Technology Licensing, Llc | Ink Anchoring |
US20220391084A1 (en) * | 2019-09-25 | 2022-12-08 | Zhangyue Technology Co., Ltd | Information display method, reader, computer storage medium, ink screen reading device and screen projection display system |
US11605187B1 (en) * | 2020-08-18 | 2023-03-14 | Corel Corporation | Drawing function identification in graphics applications |
US11361153B1 (en) | 2021-03-16 | 2022-06-14 | Microsoft Technology Licensing, Llc | Linking digital ink instances using connecting lines |
US11372486B1 (en) | 2021-03-16 | 2022-06-28 | Microsoft Technology Licensing, Llc | Setting digital pen input mode using tilt angle |
US11435893B1 (en) | 2021-03-16 | 2022-09-06 | Microsoft Technology Licensing, Llc | Submitting questions using digital ink |
US11526659B2 (en) | 2021-03-16 | 2022-12-13 | Microsoft Technology Licensing, Llc | Converting text to digital ink |
US11875543B2 (en) | 2021-03-16 | 2024-01-16 | Microsoft Technology Licensing, Llc | Duplicating and aggregating digital ink instances |
US20230315271A1 (en) * | 2022-03-18 | 2023-10-05 | Sony Group Corporation | Collaborative whiteboard for meetings |
Also Published As
Publication number | Publication date |
---|---|
US20150338938A1 (en) | 2015-11-26 |
US10275050B2 (en) | 2019-04-30 |
US20150341400A1 (en) | 2015-11-26 |
US20150338940A1 (en) | 2015-11-26 |
US20150339050A1 (en) | 2015-11-26 |
US9990059B2 (en) | 2018-06-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150338939A1 (en) | Ink Modes | |
US11916861B2 (en) | Displaying interactive notifications on touch sensitive devices | |
JP6435305B2 (en) | Device, method and graphical user interface for navigating a list of identifiers | |
US11423209B2 (en) | Device, method, and graphical user interface for classifying and populating fields of electronic forms | |
US11656758B2 (en) | Interacting with handwritten content on an electronic device | |
KR102367838B1 (en) | Device, method, and graphical user interface for managing concurrently open software applications | |
KR102056175B1 (en) | Method of making augmented reality contents and terminal implementing the same | |
US8826164B2 (en) | Device, method, and graphical user interface for creating a new folder | |
US20170003812A1 (en) | Method for providing a feedback in response to a user input and a terminal implementing the same | |
US9141256B2 (en) | Portable electronic device and method therefor | |
CN105144094B (en) | System and method for managing the navigation in application | |
KR101947458B1 (en) | Method and apparatus for managing message | |
US10331321B2 (en) | Multiple device configuration application | |
US20140160049A1 (en) | Clipboard function control method and apparatus of electronic device | |
CN112083815A (en) | Pre-formed answers in messages | |
US20130311922A1 (en) | Mobile device with memo function and method for controlling the device | |
JP6439266B2 (en) | Text input method and apparatus in electronic device with touch screen | |
US20230306192A1 (en) | Comment adding method, electronic device, and related apparatus | |
KR20240005099A (en) | Devices, methods, and graphical user interfaces for automatically providing shared content to applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VONG, WILLIAM H.;REEL/FRAME:035229/0922
Effective date: 20150316
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |