EP2380075B1 - Touch sensitive computing device and method - Google Patents
- Publication number
- EP2380075B1 (application EP09838574.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- information management
- personal information
- touch sensitive
- computing device
- journal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- User interfaces for computing devices are often rigid and complex. Users may be required to navigate multiple layers to perform a task, record thoughts, organize information, or access content. Current user interfaces typically lack the flexibility and ease of use available with a simple analogue journal. As a result, users often still rely on pen and paper for unstructured writing, sketching, note taking, and calendaring in everyday life. Users may also spend time copying and/or translating information from a paper-based format into a digital form, for example, by inputting the information into a computing device.
- US 2002/0114516 A1 discloses a handwriting recognition input system including a handwriting input area and a plurality of activatable controls. Each activatable control is associated with a different reference library, and each is adapted to recognize characters input on the handwriting input area as belonging to the associated reference library.
- A method for handwriting recognition is disclosed which includes the steps of selecting at least one character set from among a plurality of character sets and recognizing at least one character using the at least one selected character set.
- Fig. 1 is a system diagram illustrating a freeform interaction architecture 10 that may be used with a computing device 12.
- the architecture 10 includes modules.
- the modules may be software modules that may reside in a memory 11.
- a processor 15 may be configured to execute the software modules.
- the architecture 10 utilizes a touch sensitive display 14 configured to receive a touch input 16 on, for example, a surface 18 of the touch sensitive display 14.
- a graphical user interface 19 may be configured to provide various features to help a user provide input to the computing device 12, and/or to provide feedback and information to a user.
- the graphical user interface 19 may be coupled with the software modules via a GUI module 21.
- the touch input 16 is illustrated here with a generally oval shape in dashed line.
- a gesture recognition module 20 is coupled with the touch sensitive display 14, and configured to receive the touch input 16, and to determine, based on the touch input 16 and by selecting from a plurality of predefined freeform gestures, a recognized gesture 23, also shown with the generally oval shape in dashed line.
- the gesture recognition module 20 may be included as part of a touch input recognition module 22.
- a personal information management (PIM) module 24 is coupled with the gesture recognition module 20.
- the PIM module 24 includes a PIM database 26.
- the PIM module 24 is configured to apply a selected item 28 shown on the touch sensitive display 14 to a selected one of a plurality of different PIM schemas 30 of the PIM database 26.
- the selected one of the PIM schemas 30 is selected based on the recognized gesture 23. In some cases, when the selected item 28 is applied the selected item 28 may also be saved in a mass storage 36.
- the plurality of PIM schemas 30 include one or more of a calendar item 44, a task item 46, and a contact item 48. Other PIM schemas 50 may be included.
- the touch input 16 may generally be considered to include both the selected item 28 and the gesture made with an input device 17. In this illustrated example the touch input 16 may be considered as being made after the selected item 28 is present. It will also be understood that the selected item 28 may be from various forms of input.
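As an illustration of how a recognized gesture might select among the PIM schemas, the following sketch uses a simple dispatch table. The gesture names, schema labels, and function are hypothetical; the patent does not prescribe any particular encoding.

```python
# Hypothetical sketch: dispatch a recognized freeform gesture to a PIM schema.
# Gesture names and schema labels are illustrative, not taken from the patent.

PIM_SCHEMAS = {
    "circle": "calendar_item",    # e.g. circling a note saves it as a calendar item
    "check": "task_item",         # e.g. a check mark saves the selection as a task
    "underline": "contact_item",  # e.g. underlining a name saves it as a contact
}

def apply_selected_item(recognized_gesture: str, selected_item: str, pim_database: dict) -> str:
    """Apply the selected item to the PIM schema chosen by the recognized gesture,
    appending it under that schema so it can later be recalled."""
    schema = PIM_SCHEMAS.get(recognized_gesture)
    if schema is None:
        raise ValueError(f"unrecognized gesture: {recognized_gesture!r}")
    pim_database.setdefault(schema, []).append(selected_item)
    return schema

journal_db: dict = {}
apply_selected_item("circle", "Lunch with Sam, Tuesday 12:30", journal_db)
```

The same table-driven shape accommodates the "other PIM schemas 50" the text mentions: adding a schema is a new table entry rather than new control flow.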
- a character recognition module 52 is configured to recognize alphanumeric characters included with the selected item 28 upon recognition of the recognized gesture 23.
- the character recognition module 52 is also configured to cause the PIM module 24 to save the alphanumeric characters into the PIM database 26 as a searchable item.
- the character recognition module 52 may also be configured to cause the alphanumeric characters to be saved in the mass storage 36.
- the touch sensitive display 14 may be configured to display the selected item 28 on the touch sensitive display 14 within a journal page graphic 31.
- the selected item 28 may be the result of an earlier touch input 16 made with the input device 17.
- Various objects may serve as an input device, such as a pen, or a stylus, or a finger, or the like.
- the input may be, or include, anything that may be added to a piece of paper with, for example, a pen or a pencil. For example, without limitation, markings such as a drawing, a doodle, a note, an address, numbers, and the like.
- other items may also be included, such as photos, digital documents, and the like.
- the other items may be, for example, without limitation, inputted into the computing device 12, created by the computing device 12, or retrieved from the mass storage 36.
- the computing device 12 may include an ink rendering module 32 configured to display the selected item 28 as part of a selected journal page graphic 31 as a marked-up journal page graphic.
- the ink rendering module 32 may also provide the user with selectable choices, such as line type, line width and line color.
- a data-holding subsystem may be operatively coupled with the touch sensitive display 14 and may be configured to hold a plurality of journal page graphics.
- the plurality of journal page graphics may be a plurality of hierarchically flat sequentially arranged journal page graphics 70 discussed below.
- An analog input module 34 may be included in the data-holding subsystem, and may be configured to cause the marked-up journal page graphic to be saved in the mass storage 36.
- a page module 33 may provide additional journal page graphics 31, and/or may order the journal page graphics 31.
- the computing device 12 may also include an aging module 40 operatively coupled with the touch sensitive display 14 and the data-holding subsystem.
- the aging module may provide what may be referred to as paper physics.
- the aging module 40 may be configured to display each of the journal page graphics 31 on the touch sensitive display 14 as a paper page from a paper journal.
- the aging module 40 may be configured to selectively control the display of aging characteristics applied to the journal page graphic 31.
- the aging characteristics may be variably applied.
- the aging characteristics of each journal page may be based on at least one of an age of such journal page and an amount of usage associated with such journal page.
- the selected journal page graphic 31 may be configured to change appearance in accordance with one, or both, of an elapsed time from a creation of the selected journal page graphic 31, or a cumulative duration of a time since the journal page graphic received a first touch input 16.
- a clock/calendar module 42 may provide a measure of the duration of time.
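The aging behaviour described for the aging module 40 could, for example, be reduced to a single aging factor combining elapsed time since page creation with cumulative usage. The time scale, usage cap, and equal weighting below are assumptions for illustration only.

```python
import datetime

def aging_factor(created: datetime.datetime,
                 now: datetime.datetime,
                 touch_count: int,
                 max_age_days: float = 365.0,
                 max_touches: int = 500) -> float:
    """Return 0.0 (pristine) to 1.0 (fully aged) for a journal page graphic,
    combining elapsed time since creation with cumulative touch inputs.
    The weights and caps are illustrative, not from the patent."""
    age_days = max((now - created).total_seconds() / 86400.0, 0.0)
    age_term = min(age_days / max_age_days, 1.0)    # elapsed-time contribution
    use_term = min(touch_count / max_touches, 1.0)  # cumulative-usage contribution
    return 0.5 * age_term + 0.5 * use_term
```

The clock/calendar module 42 would supply `now`; the factor can then drive however many visual effects (yellowing, wrinkles, folds) the renderer supports.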
- the architecture 10 may also be configured to execute various other applications 50.
- the other application 50 may be executable upon recognition of other predefined gestures.
- various command buttons 52 may be used.
- the various other applications 50, and/or the applying of different PIM schemas 30, may also utilize a search module 54.
- the search module 54 may be coupled with, or able to be coupled with, a computer network 56, such as, for example, the Internet.
- the architecture 10 may include a location module 58 coupled with, or able to be coupled with, a GPS system 60.
- a particular location where a user was located when a particular touch input 16 was made may be saved in the PIM database 26, and/or the mass storage 36.
- the location of the computing device 12 when an input was made may then later be included as part of various search criteria that may be used to recall an item included in a PIM schema 30.
- the location of where any particular touch input 16 was made, whether part of a PIM schema 30 or not, may be used to locate a particular journal page graphic 31. This may be useful, for example, if a user is able to remember where they were when an input was made, such as on a trip, or at a meeting, but cannot find the particular journal page. This may be akin to helping a user find a particular page in a large paper journal based on the user's memory of where they were when they wrote down, or sketched, something, but who would otherwise have to thumb through most, or all, of the paper journal to find it.
- in addition, or as an alternative, to using geographic cues to find a particular journal page, the user of the computing device 12 described herein may use various other sensory cues such as the age, and/or general appearance, of a journal page graphic 31.
- the architecture 10, computing device 12, and methods, described herein may be configured to provide these sensory cues.
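The location-based recall described above might be sketched as follows, assuming each journal page stores the GPS coordinates recorded when its input was made. The page representation, helper names, and distance threshold are illustrative.

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pages_near(pages: list, lat: float, lon: float, radius_km: float = 1.0) -> list:
    """Return ids of journal pages whose stored input location falls within
    radius_km of the query point. Each page is a dict with hypothetical
    'page_id', 'lat', and 'lon' keys."""
    return [p["page_id"] for p in pages
            if haversine_km(p["lat"], p["lon"], lat, lon) <= radius_km]
```

A query like "pages I wrote near the meeting venue" then reduces to one call with the venue's coordinates.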
- Fig. 2 is a schematic illustration of a computing device 12 that may be used for receiving and saving analog input from an input device 17, and/or digital input.
- Fig. 3 is a schematic view illustrating a plurality of hierarchically flat sequentially arranged journal pages 70.
- the computing device 12 may include a user interface 19 configured to selectively display the plurality of hierarchically flat sequentially arranged journal pages 70 on a touch sensitive display 14.
- One journal page graphic 31 is shown in Fig. 2.
- the user interface 19 may be configured to apply a selected item 28 on a selected one of the journal page graphics 31 to a selected one of a plurality of PIM schemas 30 in response to a touch input 16 applied to the touch sensitive display 14. This may occur upon recognizing a predefined freeform gesture, as a recognized gesture 23, made with the input device 17.
- the computing device 12 may include an aging module 40 ( Fig. 1 ) configured to control aging appearance characteristics for each touch input 16 applied to the touch sensitive display 14.
- Each of the journal page graphics 31 may be configured to appear on the touch sensitive display 14 as a paper page from a paper journal.
- the aging module 40 may be configured to redisplay selected journal page graphics 31 at one or more later times with predetermined graphical modifications 72 according to predetermined temporal conditions and/or predetermined use conditions.
- the predetermined graphical modifications 72 may be included as sensory cues to enhance the journaling experience, and as aids to recall, as discussed.
- the selected journal page graphic 31 may be configured to change appearance, i.e. with the addition of the predetermined graphical modifications 72, in accordance with one, or both, of an elapsed time from a creation of the selected journal page graphic, or a cumulative duration of a time that the journal page graphic has been selected among the plurality of hierarchically flat sequentially arranged journal pages 70.
- the predetermined temporal conditions may include a duration from a first use of the selected one of the journal page graphics, and the predetermined use conditions may include an amount of use incurred by the selected one of the journal page graphics.
- the aging module 40 may be configured to add an appearance of one or more of discoloration such as yellowing, smudges, crinkles, wrinkles, rips, soil marks, and folds to the selected journal page graphic 31 in accordance with the cumulative duration for which it has been the selected journal page graphic 31, and/or a cumulative amount of analog input having been made to the computing device while the journal page graphic is the selected one of the journal page graphics 31.
- the user interface 19 may cause one of a plurality of overlay navigational tools to be displayed on the touch sensitive display 14 to help a user navigate the possible actions and reactions of the computing device 12 by verifying their occurrence.
- the gesture recognition module 20 may be configured so that, upon determination of the recognized gesture 23, the gesture recognition module 20 may cause the touch sensitive display 14 to display verification 80 of the application of the selected item to the selected one of the plurality of PIM schemas. After a preselected time, for example one second, of being visible on the touch sensitive display 14, the verification 80 may disappear.
- the verification 80 may take various forms, for example a box with a label corresponding to the PIM schema.
- a navigation arrow 82 may also be included with the verification 80.
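The transient verification 80 amounts to a timed overlay: shown when a gesture is recognized, dismissed after a preselected interval. A toolkit-agnostic sketch follows; the one-second default comes from the text above, while the class and field names are illustrative.

```python
import time
from typing import Optional

class Verification:
    """Transient confirmation overlay, e.g. a box labelled with the PIM schema
    to which the selected item was applied. Names are hypothetical."""

    def __init__(self, label: str, lifetime_s: float = 1.0):
        self.label = label
        self.shown_at = time.monotonic()  # monotonic clock avoids wall-time jumps
        self.lifetime_s = lifetime_s

    def visible(self, now: Optional[float] = None) -> bool:
        """True until the preselected lifetime has elapsed, after which the
        render loop would stop drawing the overlay."""
        if now is None:
            now = time.monotonic()
        return (now - self.shown_at) < self.lifetime_s
```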
- journal pages 70 may be advanced, or reversed, or otherwise scrolled backward and forward. In some cases, this may be accomplished by running a user's finger, or another type of input device, in an upward or downward direction as illustrated with arrow 90 at location 92.
- Other activation techniques may be used. The activation techniques may be referred to as a scroll page gesture.
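The scroll page gesture could be detected from the vertical displacement of a drag, as in the following sketch. The pixel threshold and the mapping of drag direction to page direction are assumptions.

```python
def classify_scroll(start_y: float, end_y: float, threshold_px: float = 40.0) -> str:
    """Classify a vertical drag on the page area as a scroll page gesture.
    Screen y grows downward, so an upward drag gives negative displacement.
    Returns 'next_page', 'previous_page', or 'none'; threshold is illustrative."""
    dy = end_y - start_y
    if dy <= -threshold_px:   # upward drag: advance to the next journal page
        return "next_page"
    if dy >= threshold_px:    # downward drag: go back a page
        return "previous_page"
    return "none"
```

Drags shorter than the threshold are ignored, which keeps ordinary pen strokes from being mistaken for page turns.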
- Fig. 2 illustrates an analog input area on a right side 94 of the computing device 12, and various other features on a left side 96.
- alternatively, the left side 96 may be configured for analog input and the right side 94 may be configured to display other features.
- left side 96 analog input may be the preferred configuration for a left-handed person, and vice versa.
- the computing device 12 may include a spine 98 between the left side 96 and the right side 94.
- the computing device 12 may be foldable at the spine similar to a paper journal.
- An example command button 52 is illustrated on the spine.
- Fig. 4 is a schematic illustration of an example architectural structure 100.
- the architectural structure 100 illustrates icons that may be selectable to, for example, enable various PIM schemas 30, various other applications 50, or other schemas.
- various notebooks, each containing varying numbers of displayable journal page graphics 31, may be selected by selecting corresponding icons, which may then make the selected notebook visible on the touch sensitive display 14.
- Fig. 5 is a flowchart illustrating an embodiment of a method 200 for receiving and processing touch inputs in a computing device.
- Method 200 may be implemented using systems and devices described above, or using other suitable hardware and software.
- the method 200 includes, at 202, receiving a touch input at a touch sensitive display.
- the touch input may be from an input device made on a surface of a touch sensitive display of a computing device.
- the method may also include recognizing a predetermined gesture made on the surface as a request to save the touch input into a searchable database.
- Step 204 includes determining, based on the touch input and by selecting from a plurality of different predefined gestures, a recognized gesture that corresponds to the touch input.
- the method includes applying a selected item shown on the touch sensitive display to a selected one of a plurality of PIM schemas, the selected one of the PIM schemas being selected based on the recognized gesture.
- the method also includes recognizing alphanumeric characters made with the touch input at least upon recognizing the predetermined gesture.
- the method also includes saving the alphanumeric characters into the searchable database as a searchable item, e.g. one or more of a task item, a calendar item, and a contact item.
- the method may also include displaying a journal page graphic on the interface surface, to appear as a paper page from a paper journal.
- the method may also include saving the pen stroke input as a simulated writing implement marking on the journal page graphic.
- the method may also include modifying the appearance of the journal page graphic in accordance with preselected aging, and/or use criteria.
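The steps of method 200 (receive a touch input, determine the recognized gesture, apply the selected item to a PIM schema, recognize characters, and save a searchable item) can be summarized as a single pipeline. This is a hypothetical sketch, not the claimed implementation; the schema names and the stand-in for handwriting recognition are illustrative.

```python
def process_touch_input(raw_input: str, gesture: str, database: dict) -> dict:
    """Sketch of method 200: map the recognized gesture to a PIM schema,
    recognize the characters in the selected item, and save the result as
    a searchable item. Schema labels are illustrative assumptions."""
    schemas = {"circle": "calendar_item",
               "check": "task_item",
               "underline": "contact_item"}
    schema = schemas.get(gesture)
    if schema is None:
        return {"saved": False}  # no predefined gesture recognized; nothing applied
    text = raw_input.strip()     # stands in for alphanumeric character recognition
    database.setdefault(schema, []).append(text)
    return {"saved": True, "schema": schema, "text": text}
```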
- Fig. 6 is a flowchart illustrating an example variation of the method 200 shown in Fig. 5 .
- the method 200 may further include at 216, selectively displaying additional journal page graphics upon receiving a scroll page gesture from the input device on the interface surface.
- the additional journal page graphics are from the plurality of hierarchically flat sequentially arranged journal pages 70 shown in Fig. 3.
- Fig. 7 is a flowchart illustrating another example variation of method 200 shown in Fig. 5 .
- the modifying the appearance 214 may include, at 218, adding a color to the journal page graphic to give the journal page graphic a yellowing appearance in accordance with an elapsed time from when the journal page graphic received a first touch input.
- Fig. 8 is a flowchart illustrating another example variation of the method 200 shown in Fig. 5 .
- the modifying the appearance 214 may include, at 220, adding graphical features to the journal page graphic that appear as one or more of crinkles, smudges, wrinkles, rips, and soiling in accordance with a cumulative duration of a time the journal page graphic has been displayed, and/or a cumulative number of touch inputs received on the surface of the touch sensitive display.
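The yellowing at 218 could be realized as a linear blend from white toward a paper-yellow tint as elapsed time grows. The target colour and the one-year time scale below are assumptions.

```python
def page_tint(elapsed_days: float, full_age_days: float = 365.0) -> tuple:
    """Blend the page background from white (255, 255, 255) toward an aged
    yellow (245, 235, 200) as elapsed time since the first touch input grows.
    Both the target colour and time scale are illustrative choices."""
    t = min(max(elapsed_days / full_age_days, 0.0), 1.0)  # clamp blend to [0, 1]
    white, yellow = (255, 255, 255), (245, 235, 200)
    return tuple(round(w + t * (y - w)) for w, y in zip(white, yellow))
```

A renderer would apply the returned RGB triple as the journal page graphic's background, with crinkle and soiling textures layered separately per the use criteria.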
- the computing devices described herein may be any suitable computing device configured to execute the programs described herein.
- the computing devices may be a mainframe computer, personal computer, laptop computer, portable data assistant (PDA), computer-enabled wireless telephone, networked computing device, enhanced mobile telephone device, or other suitable computing device, and may be connected to each other via computer networks, such as the Internet.
- These computing devices typically include a processor and associated volatile and non-volatile memory, and are configured to execute programs stored in non-volatile memory using portions of volatile memory and the processor.
- the term "program" refers to software or firmware components that may be executed by, or utilized by, one or more computing devices described herein, and is meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. It will be appreciated that computer-readable media may be provided having program instructions stored thereon, which upon execution by a computing device, cause the computing device to execute the methods described above and cause operation of the systems described above.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Description
- Touch sensitive computing devices and methods are provided, as defined by the appended claims.
- Fig. 1 is a system diagram illustrating a freeform interaction architecture in accordance with the present disclosure;
- Fig. 2 is a schematic illustration of a computing device that may include the freeform interaction architecture illustrated in Fig. 1;
- Fig. 3 is a schematic view illustrating a plurality of hierarchically flat sequentially arranged journal pages that may be selectively displayed on the computing device illustrated in Fig. 2;
- Fig. 4 is a schematic illustration of an example architectural structure in accordance with the present disclosure; and
- Figs. 5 through 8 are flowcharts illustrating various example methods for receiving and processing touch inputs in a computing device.
Fig. 1 is a system diagram illustrating afreeform interaction architecture 10 that may be used with acomputing device 12. Thearchitecture 10 includes modules. The modules may be software modules that may reside in amemory 11. Aprocessor 15 may be configured to execute the software modules. Thearchitecture 10 utilizes a touchsensitive display 14 configured to receive atouch input 16 on, for example, asurface 18 of the touchsensitive display 14. Agraphical user interface 19 may be configured to provide various features to help a user provide input to thecomputing device 12, and/or to provide feedback and information to a user. Thegraphical user interface 19 may be coupled with the software modules via aGUI module 21. - The
touch input 16 is illustrated here with a generally oval shape in dashed line. Agesture recognition module 20 is coupled with the touchsensitive display 14, and configured to receive thetouch input 16, and to determine, based on thetouch input 16 and by selecting from a plurality of predefined freeform gestures a recognizedgesture 23, also shown with the generally oval shape in dashed line. Thegesture recognition module 20 may be included as part of a touchinput recognition module 22. A personal information management (PIM)module 24 is coupled with thegesture recognition module 20. ThePIM module 24 includes aPIM database 26. ThePIM module 24 is configured to apply a selecteditem 28 shown on the touchsensitive display 14 to a selected one of a plurality ofdifferent PIM schemas 30 of thePIM database 26. The selected one of thePIM schemas 30 is selected based on the recognizedgesture 23. In some cases, when theselected item 28 is applied theselected item 28 may also be saved in a mass storage 36. The plurality ofPIM schemas 30 include one or more of acalendar item 44, atask item 46, and acontact item 48.Other PIM schemas 50 may be included. Note, in some cases, as when theselected item 28 includes markings made by a user, thetouch input 16 may generally be considered to include both theselected item 28 and the gesture made with aninput device 17. In this illustrated example thetouch input 16 may be considered as being made after theselected item 28 is present. It will also be understood that theselected item 28 may be from various forms of input. - A
character recognition module 52 is configured to recognize alphanumeric characters included with the selecteditem 28 upon recognition of the recognizedgesture 23. Thecharacter recognition module 52 is also configured to cause thePIM module 24 to save the alphanumeric characters into thePIM database 26 as a searchable item. Thecharacter recognition module 52 may also be configured to cause the alphanumeric characters to be saved in the mass storage 36. - In some examples, the touch
sensitive display 14 may be configured to display theselected item 28 on the touchsensitive display 14 within ajournal page graphic 31. As mentioned, theselected item 28 may be the result of anearlier touch input 16 made with theinput device 17. Various objects may serve as an input device, such as a pen, or a stylus, or a finger, or the like. The input may be, or include anything that may be possible to add to a piece of paper with, for example, a pen, or a pencil. For example, without limitation, markings such a drawing, a doodle, a note, an address, numbers and the like. In addition other items may be included such photos, or digital documents, and the like. The other items may be, for example, without limitation, inputted into thecomputing device 12, created by thecomputing device 12, or retrieved from the mass storage 36. - The
computing device 12 may include anink rendering module 32 configured to display theselected item 28 as part of a selectedjournal page graphic 31 as a marked-up journal page graphic. Theink rendering module 32 may also provide the user with selectable choices, such as line type, line width and line color. - A data-holding subsystem may be operatively coupled with the touch
sensitive display 14 and may be configured to hold a plurality of journal page graphics. The plurality of journal page graphics may be a plurality of hierarchically flat sequentially arrangedjournal page graphics 70 discussed below. Ananalog input module 34 may be included in the data-holding subsystem, and may be configured to cause the marked-up journal page graphic to be saved in the mass storage 36. Apage module 33 may provide additionaljournal page graphics 31, and/or may order thejournal page graphics 31. - The
computing device 12 may also include anaging module 40 operatively coupled with the touchsensitive display 14 and the data holding subsystem. The aging module may provide what may be referred to as paper physics. Theaging module 40 may be configured to display each of thejournal page graphics 30 on the touchsensitive display 14 as a paper page from a paper journal. Theaging module 40 may be configured to selectively control display of aging characteristics to thejournal page graphic 31. The aging characteristics may be variably applied. The aging characteristics of each journal page may be based on at least one of an age of such journal page and an amount of usage associated with such journal page. In some examples, the selectedjournal page graphic 31 may be configured to change appearance in accordance with one, or both, of an elapsed time from a creation of the selectedjournal page graphic 31, or a cumulative duration of a time since the journal page graphic received afirst touch input 16. A clock/calendar module 42 may provide a measure of the duration of time. - The
architecture 10 may also be configured to execute variousother applications 50. Theother application 50 may be executable upon recognition of other predefined gestures. As an alternative to, or in addition to, using gesture recognition to execute the variousother applications 50, or to apply one or more selecteditems 28 on the touchsensitive display 14 to a selected one of the plurality ofdifferent PIM schemas 30 as described,various command buttons 52 may be used. The variousother applications 50, and or the applying of different PIM schemas 30 may also utilize asearch module 54. Thesearch module 54 may be coupled with, or able to be coupled with, acomputer network 56, such as, for example, the Internet. - In addition, the
architecture 10 may include a location module 58 coupled with, or able to be coupled with, a GPS system 60. A particular location where a user was located when a particular touch input 16 was made may be saved in the PIM database 26 and/or the mass storage 36. The location of the computing device 12 when an input was made may then later be included as part of various search criteria that may be used to recall an item included in a PIM schema 30. In addition, or alternatively, the location where any particular touch input 16 was made, whether part of a PIM schema 30 or not, may be used to locate a particular journal page graphic 31. This may be useful, for example, if a user is able to remember where they were when an input was made, such as on a trip, or at a meeting, but cannot find the particular journal page. This may be akin to helping a user find a particular page in a large paper journal based on the user's memory of where they were when they wrote down, or sketched, something, but would otherwise have to thumb through most, or all, of the paper journal to find it. In addition to, or as an alternative to, using geographic cues to find a particular journal page, the user of the computing device 12, described herein, may use various other sensory cues such as the age and/or general appearance of a journal page graphic 31. The architecture 10, computing device 12, and methods described herein may be configured to provide these sensory cues. -
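The patent does not specify how a recorded input location would later be matched against a geographic search cue. A minimal sketch, assuming locations are stored as latitude/longitude pairs keyed by page id (names and the 1 km default radius are illustrative, not from the patent):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pages_near(page_locations, lat, lon, radius_km=1.0):
    """Return ids of journal pages whose recorded input location lies within radius_km."""
    return [pid for pid, (plat, plon) in page_locations.items()
            if haversine_km(lat, lon, plat, plon) <= radius_km]
```

A user who remembers making a note "on a trip" could then query `pages_near` with the trip's coordinates instead of thumbing through every page.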
Fig. 2 is a schematic illustration of a computing device 12 that may be used for receiving and saving analog input from an input device 17, and/or digital input. Fig. 3 is a schematic view illustrating a plurality of hierarchically flat, sequentially arranged journal pages 70. The computing device 12 may include a user interface 19 configured to selectively display the plurality of hierarchically flat, sequentially arranged journal pages 70 on a touch sensitive display 14. One journal page graphic 31 is shown in Fig. 2. The user interface 19 may be configured to apply a selected item 28 on a selected one of the journal page graphics 31 to a selected one of a plurality of PIM schemas 30 in response to a touch input 16 applied to the touch sensitive display 14. This may occur upon recognizing a predefined freeform gesture, as a recognized gesture 23, made with the input device 17. The computing device 12 may include an aging module 40 (Fig. 1) configured to control aging appearance characteristics for each touch input 16 applied to the touch sensitive display 14. Each of the journal page graphics 31 may be configured to appear on the touch sensitive display 14 as a paper page from a paper journal. The aging module 40 may be configured to redisplay selected journal page graphics 31 at one or more later times with predetermined graphical modifications 72 according to predetermined temporal conditions and/or predetermined use conditions. The predetermined graphical modifications 72 may be included as sensory cues to enhance the journaling experience, and as aids to recall, as discussed. - The selected journal page graphic 31 may be configured to change appearance, i.e. with the addition of the predetermined
graphical modifications 72, in accordance with one, or both, of an elapsed time from a creation of the selected journal page graphic, or a cumulative duration of a time that the journal page graphic has been selected among the plurality of hierarchically flat, sequentially arranged journal pages 70. The predetermined temporal conditions may include a duration from a first use of the selected one of the journal page graphics, and the predetermined use conditions may include an amount of use incurred by the selected one of the journal page graphics. - The
aging module 40 may be configured to add an appearance of one or more of discoloration such as yellowing, smudges, crinkles, wrinkles, rips, soil marks, and folds to the selected journal page graphic 31 in accordance with the cumulative duration that it has been the selected journal page graphic 31, and/or a cumulative amount of analog input having been made to the computing device while the journal page graphic is the selected one of the journal page graphics 31. - In some cases, upon recognizing a predefined
freeform gesture 23, the user interface 19 may cause one of a plurality of overlay navigational tools to be displayed on the touch sensitive display 14 to help a user navigate the possible actions and reactions of the computing device 12 by verifying its occurrence. The gesture recognition module 20 may be configured so that, upon determination of the recognized gesture 23, the gesture recognition module 20 causes the touch sensitive display 14 to display verification 80 of the application of the selected item to the selected one of the plurality of PIM schemas. After a preselected time of being visible on the touch sensitive display 14, for example one second, the verification 80 may disappear. The verification 80 may take various forms, for example a box with a label corresponding to the PIM schema. A navigation arrow 82 may also be included with the verification 80. - In some cases the plurality of hierarchically flat, sequentially arranged journal pages 70 may be advanced, or reversed, or otherwise scrolled backward and forward. In some cases this may be accomplished by running a user's finger, or another type of input device, in an upward or downward direction as illustrated with
arrow 90 at location 92. Other activation techniques may be used. The activation techniques may be referred to as a scroll page gesture. -
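Because the journal pages are hierarchically flat and sequentially arranged, the scroll page gesture reduces to moving an index through an ordered list. A sketch, assuming pages are addressed by a zero-based index and the journal clamps at its first and last pages (behavior at the ends is an assumption):

```python
def scroll_page(current_index: int, direction: int, page_count: int) -> int:
    """Advance (+1) or reverse (-1) through the sequentially arranged journal
    pages in response to a scroll page gesture, clamping at the first and
    last page like a physical journal."""
    return max(0, min(page_count - 1, current_index + direction))
```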
Fig. 2 illustrates an analog input area on a right side 94 of the computing device 12, and various other features on a left side 96. This is an example configuration. In some cases both sides may be configured for analog input. In some cases the left side 96 may be configured for analog input and the right side 94 may be configured to display other features. Left side 96 analog input may be the preferred configuration for a left-handed person, and vice versa. The computing device 12 may include a spine 98 between the left side 96 and the right side 94. The computing device 12 may be foldable at the spine, similar to a paper journal. An example command button 52 is illustrated on the spine. - The various other features on the
left side 96 of the journal structure illustrated in Fig. 2 may enable various other functions besides those discussed in detail herein. These features may include, for example, selectable icons that may be made selectively visible, and that may be scrollable. Fig. 4 is a schematic illustration of an example architectural structure 100. The architectural structure 100 illustrates icons that may be selectable to, for example, enable various PIM schemas 30, various other applications 50, or other schemas. In addition, various notebooks, each containing a varying number of displayable journal page graphics 31, may be selected by selecting corresponding icons, which may then make the selected notebook visible on the touch sensitive display 14. -
Fig. 5 is a flowchart illustrating an embodiment of a method 200 for receiving and processing touch inputs in a computing device. Method 200 may be implemented using the systems and devices described above, or using other suitable hardware and software. The method 200 includes, at 202, receiving a touch input at a touch sensitive display. The touch input may be from an input device made on a surface of a touch sensitive display of a computing device. At 204, the method may also include recognizing a predetermined gesture made on the surface as a request to save the touch input into a searchable database. Step 204 includes determining, based on the touch input and by selecting from a plurality of different predefined gestures, a recognized gesture that corresponds to the touch input. The method includes applying a selected item shown on the touch sensitive display to a selected one of a plurality of PIM schemas, the selected one of the PIM schemas being selected based on the recognized gesture. At 206, the method also includes recognizing alphanumeric characters made with the touch input at least upon recognizing the predetermined gesture. At 208, the method also includes saving the alphanumeric characters into the searchable database as a searchable item, e.g. one or more of a task item, a calendar item, and a contact item. At 210, the method may also include displaying a journal page graphic on the interface surface, to appear as a paper journal page from a paper journal. At 212, the method may also include saving the pen stroke input as a simulated writing implement marking on the journal page graphic. In addition, at 214, the method may also include modifying the appearance of the journal page graphic in accordance with preselected aging and/or use criteria. -
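Steps 202 through 208 can be sketched in miniature. The gesture names below are hypothetical (the patent does not define specific gesture shapes), and character recognition is assumed to have already produced `recognized_text` upstream:

```python
# Hypothetical mapping of recognized freeform gestures to PIM schemas.
GESTURE_TO_SCHEMA = {
    "loop": "task",
    "caret": "calendar",
    "box": "contact",
}

def process_touch_input(recognized_text, gesture, database):
    """Select a PIM schema from the recognized gesture (204) and save the
    recognized alphanumeric characters (206) as a searchable item (208).
    Returns the schema used, or None for an unrecognized gesture."""
    schema = GESTURE_TO_SCHEMA.get(gesture)
    if schema is None:
        return None  # no predefined gesture matched; nothing is filed
    database.setdefault(schema, []).append(recognized_text)
    return schema
```

In this sketch the searchable database is a plain dictionary keyed by schema; a real implementation would persist to the PIM database 26.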
Fig. 6 is a flowchart illustrating an example variation of the method 200 shown in Fig. 5. The method 200 may further include, at 216, selectively displaying additional journal page graphics upon receiving a scroll page gesture from the input device on the interface surface. In some cases the additional journal page graphics are from the plurality of hierarchically flat, sequentially arranged journal pages 70 shown in Fig. 3. -
Fig. 7 is a flowchart illustrating another example variation of method 200 shown in Fig. 5. The modifying of the appearance at 214 (Fig. 5) may include, at 218, adding a color to the journal page graphic to give the journal page graphic a yellowing appearance in accordance with an elapsed time from when the journal page graphic received a first touch input. -
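The yellowing at 218 amounts to blending the page's base color toward an aged-paper tone as time elapses. A sketch; the one-year horizon and the target RGB value are illustrative assumptions, not from the patent:

```python
def yellowed_rgb(base_rgb, days_since_first_input, full_yellow_days=365.0):
    """Blend a page's base color toward an aged-paper yellow in proportion to
    the elapsed time since the page received its first touch input, saturating
    after full_yellow_days."""
    target = (222, 205, 158)  # assumed aged-paper tone
    t = min(days_since_first_input / full_yellow_days, 1.0)
    return tuple(round(b + (y - b) * t) for b, y in zip(base_rgb, target))
```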
Fig. 8 is a flowchart illustrating another example variation of the method 200 shown in Fig. 5. The modifying of the appearance at 214 (Fig. 5) may include, at 220, adding graphical features to the journal page graphic that appear as one or more of crinkles, smudges, wrinkles, rips, and soiling in accordance with a cumulative duration of a time the journal page graphic has been displayed, and/or a cumulative number of touch inputs received on the surface of the touch sensitive display. - It will be appreciated that the computing devices described herein may be any suitable computing device configured to execute the programs described herein. For example, the computing devices may be a mainframe computer, personal computer, laptop computer, portable data assistant (PDA), computer-enabled wireless telephone, networked computing device, enhanced mobile telephone device, or other suitable computing device, and may be connected to each other via computer networks, such as the Internet. These computing devices typically include a processor and associated volatile and non-volatile memory, and are configured to execute programs stored in non-volatile memory using portions of volatile memory and the processor. As used herein, the term "program" refers to software or firmware components that may be executed by, or utilized by, one or more computing devices described herein, and is meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc. It will be appreciated that computer-readable media may be provided having program instructions stored thereon, which upon execution by a computing device, cause the computing device to execute the methods described above and cause operation of the systems described above.
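The wear features enumerated at step 220 (Fig. 8) could be chosen by simple thresholds on the page's cumulative display time and input count. A sketch; all threshold values are illustrative assumptions, not from the patent:

```python
def wear_effects(days_displayed, touch_input_count):
    """Select which wear features (step 220) to render on a journal page
    graphic, based on cumulative display duration and cumulative touch
    inputs. Thresholds are illustrative."""
    effects = []
    if days_displayed > 30:
        effects.append("yellowing")
    if days_displayed > 90:
        effects.append("folds")
    if touch_input_count > 50:
        effects.append("smudges")
    if touch_input_count > 200:
        effects.extend(["crinkles", "wrinkles"])
    return effects
```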
- It should be understood that the embodiments herein are illustrative and not restrictive, since the scope of the invention is defined by the appended claims rather than by the description preceding them.
Claims (11)
- A touch sensitive computing device (12), comprising: a touch sensitive display (14) configured to receive a touch input (16); a personal information management module (24) including a personal information management database (26); and a character recognition module (52), the device being characterized in that: the device further comprises a gesture recognition module (20) coupled with the touch sensitive display (14) and configured to receive the touch input (16) and determine, based on the touch input (16) and by selecting from a plurality of different predefined gestures, a recognized gesture (23) that corresponds to the touch input (16); the personal information management module (24) is coupled to the gesture recognition module (20) and is configured to apply a selected item (28) shown on the touch sensitive display (14), the selected item (28) resulting from the touch input (16), to a selected one of a plurality of different personal information management schemas (30) of the personal information management database (26), the plurality of different personal information management schemas (30) including one or more of a calendar item, a task item, and a contact item, the selected one of the personal information management schemas (30) being selected based on the recognized gesture (23); and in that the character recognition module (52) is configured to recognize alphanumeric characters included with the selected item (28) upon recognition of the recognized gesture, and to cause the personal information management module (24) to save the alphanumeric characters into the personal information management database (26) as a searchable item.
- The touch sensitive computing device (12) of claim 1, further comprising a data-holding subsystem (35) operatively coupled with the touch sensitive display (14) and configured to hold a plurality of hierarchically flat sequentially arranged journal pages, where the touch sensitive display (14) is configured to selectively display the journal pages, and where the selected item (28) is graphically disposed on one of the plurality of journal pages.
- The touch sensitive computing device (12) of claim 1 or 2, further comprising an ink rendering module (32) configured to display the selected item (28) as part of a selected journal page graphic as a marked-up journal page graphic, and an analog input module (34) configured to cause the marked-up journal page graphic to be saved in a mass storage (36).
- The touch sensitive computing device (12) of claim 2 or 3, further comprising an aging module (40) operatively coupled with the touch sensitive display (14) and the data-holding subsystem (35) and configured to selectively control display of aging characteristics for the plurality of journal pages.
- The touch sensitive computing device (12) of claim 4, where the aging characteristics are variably applied to the plurality of journal pages and are selected from the group consisting of yellowing, smudges, crinkles, wrinkles, rips, folds, and soil marks.
- The touch sensitive computing device (12) of claim 4 or 5, where for a given one of the plurality of journal pages, the aging module (40) is configured to control aging characteristics of such journal page based on at least one of an age of such journal page and an amount of usage associated with such journal page.
- The touch sensitive computing device (12) of claim 1, where the gesture recognition module (20) is configured so that upon determination of the recognized gesture, the gesture recognition module (20) causes the touch sensitive display (14) to display verification of the application of the selected item (28) to the selected one of the plurality of personal information management schemas (30).
- A method of receiving and processing touch inputs in a computing device (12) having a touch sensitive display (14), a personal information management module (24) including a personal information management database (26), and a character recognition module (52), the method comprising:
receiving (202) a touch input at the touch sensitive display (14), the method being characterized by: determining (204), by a gesture recognition module (20) of the computing device (12), based on the touch input (16) and by selecting from a plurality of different predefined gestures, a recognized gesture (23) that corresponds to the touch input (16) and, upon recognition of the recognized gesture, recognizing (206), by the character recognition module (52), alphanumeric characters included with a selected item (28) resulting from the touch input (16); applying, by the personal information management module (24), the selected item (28) shown on the touch sensitive display (14) to a selected one of a plurality of different personal information management schemas (30) of the personal information management database (26), the plurality of different personal information management schemas (30) including one or more of a calendar item, a task item, and a contact item, the selected one of the personal information management schemas (30) being selected based on the recognized gesture (23); and saving (208), by the personal information management module (24), the alphanumeric characters into the personal information management database (26) as a searchable item. - The method of claim 8, further comprising displaying (210) a journal page graphic on an interface surface, to appear as a paper journal page from a paper journal.
- The method of claim 9, further comprising saving (212) a pen stroke input as a simulated writing implement marking on the journal page graphic.
- A computer readable medium comprising instructions thereon that are executable by a computing device to perform any one of the methods of claims 8-10.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/356,045 US8319736B2 (en) | 2009-01-19 | 2009-01-19 | Touch sensitive computing device and method |
PCT/US2009/069321 WO2010083012A2 (en) | 2009-01-19 | 2009-12-22 | Touch sensitive computing device and method |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2380075A2 EP2380075A2 (en) | 2011-10-26 |
EP2380075A4 EP2380075A4 (en) | 2012-08-22 |
EP2380075B1 true EP2380075B1 (en) | 2020-06-10 |
Family
ID=42336548
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP09838574.3A Active EP2380075B1 (en) | 2009-01-19 | 2009-12-22 | Touch sensitive computing device and method |
Country Status (7)
Country | Link |
---|---|
US (1) | US8319736B2 (en) |
EP (1) | EP2380075B1 (en) |
JP (1) | JP5377665B2 (en) |
KR (1) | KR101663849B1 (en) |
CN (1) | CN102282533B (en) |
RU (1) | RU2011129799A (en) |
WO (1) | WO2010083012A2 (en) |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080168402A1 (en) | 2007-01-07 | 2008-07-10 | Christopher Blumenberg | Application Programming Interfaces for Gesture Operations |
US20080168478A1 (en) | 2007-01-07 | 2008-07-10 | Andrew Platzer | Application Programming Interfaces for Scrolling |
US8677285B2 (en) | 2008-02-01 | 2014-03-18 | Wimm Labs, Inc. | User interface of a small touch sensitive display for an electronic data and communication device |
US8645827B2 (en) | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
US20140160030A1 (en) * | 2009-02-09 | 2014-06-12 | Cypress Semiconductor Corporation | Sensor system and method for mapping and creating gestures |
US9424578B2 (en) | 2009-02-24 | 2016-08-23 | Ebay Inc. | System and method to provide gesture functions at a device |
US8566045B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
KR20120053430A (en) * | 2010-11-17 | 2012-05-25 | 삼성전자주식회사 | Device and method for providing image effect in wireless terminal |
US20120159373A1 (en) * | 2010-12-15 | 2012-06-21 | Verizon Patent And Licensing, Inc. | System for and method of generating dog ear bookmarks on a touch screen device |
JP2012216148A (en) * | 2011-04-01 | 2012-11-08 | Sharp Corp | Display device, display method, computer program, and recording medium |
KR101802759B1 (en) * | 2011-05-30 | 2017-11-29 | 엘지전자 주식회사 | Mobile terminal and Method for controlling display thereof |
US9720574B2 (en) | 2012-03-19 | 2017-08-01 | Microsoft Technology Licensing, Llc | Personal notes on a calendar item |
US10032135B2 (en) | 2012-03-19 | 2018-07-24 | Microsoft Technology Licensing, Llc | Modern calendar system including free form input electronic calendar surface |
US9508056B2 (en) | 2012-03-19 | 2016-11-29 | Microsoft Technology Licensing, Llc | Electronic note taking features including blank note triggers |
CN103218075B (en) * | 2013-03-26 | 2016-12-28 | 深圳市金立通信设备有限公司 | The touch-control monitoring method of a kind of Touch Screen and terminal |
JP6098295B2 (en) * | 2013-03-28 | 2017-03-22 | 富士通株式会社 | Display program, display device, and display method |
US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
KR102186819B1 (en) * | 2013-08-27 | 2020-12-07 | 삼성전자주식회사 | A mobile terminal supportting a note function and a method controlling the same |
US10942206B2 (en) * | 2014-08-04 | 2021-03-09 | Nokia Shanghai Bell Co., Ltd. | Variable passive intermodulation load |
US10372398B2 (en) | 2017-04-04 | 2019-08-06 | Microsoft Technology Licensing, Llc | Foldable display device with interactable user interface surface on the external shell |
US11182542B2 (en) * | 2018-10-29 | 2021-11-23 | Microsoft Technology Licensing, Llc | Exposing annotations in a document |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6335725B1 (en) * | 1999-07-14 | 2002-01-01 | Hewlett-Packard Company | Method of partitioning a touch screen for data input |
US20040186729A1 (en) * | 2003-03-11 | 2004-09-23 | Samsung Electronics Co., Ltd. | Apparatus for and method of inputting Korean vowels |
US20050289452A1 (en) * | 2004-06-24 | 2005-12-29 | Avaya Technology Corp. | Architecture for ink annotations on web documents |
US20060026074A1 (en) * | 2004-07-30 | 2006-02-02 | Katsumi Fujimoto | Article sales data processing apparatus |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69428675T2 (en) | 1993-12-30 | 2002-05-08 | Xerox Corp | Apparatus and method for supporting an implicit structuring of free-form lists, overviews, texts, tables and diagrams in an input system and editing system based on hand signals |
US6064384A (en) * | 1996-08-26 | 2000-05-16 | E-Brook Systems Pte Ltd | Computer user interface system and method having book image features |
JPH10247988A (en) * | 1997-03-06 | 1998-09-14 | Sharp Corp | Information processing unit with communication function |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US6952803B1 (en) | 1998-12-29 | 2005-10-04 | Xerox Corporation | Method and system for transcribing and editing using a structured freeform editor |
JP2000222522A (en) * | 1999-02-04 | 2000-08-11 | Matsushita Electric Ind Co Ltd | Recognition and processing device |
US6459442B1 (en) | 1999-09-10 | 2002-10-01 | Xerox Corporation | System for applying application behaviors to freeform data |
US6859909B1 (en) | 2000-03-07 | 2005-02-22 | Microsoft Corporation | System and method for annotating web-based documents |
US7177473B2 (en) | 2000-12-12 | 2007-02-13 | Nuance Communications, Inc. | Handwriting data input device with multiple character sets |
GB2380583A (en) * | 2001-10-04 | 2003-04-09 | Ilam Samson | Touch pad/screen for electronic equipment |
US20030131059A1 (en) | 2002-01-08 | 2003-07-10 | International Business Machines Corporation | Method, system, and program for providing information on scheduled events to wireless devices |
US7137077B2 (en) | 2002-07-30 | 2006-11-14 | Microsoft Corporation | Freeform encounter selection tool |
US20040119762A1 (en) | 2002-12-24 | 2004-06-24 | Fuji Xerox Co., Ltd. | Systems and methods for freeform pasting |
US7262785B2 (en) | 2003-08-21 | 2007-08-28 | Microsoft Corporation | Ink editing architecture |
JP4137043B2 (en) * | 2004-10-29 | 2008-08-20 | 株式会社コナミデジタルエンタテインメント | GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD |
JP2006260065A (en) * | 2005-03-16 | 2006-09-28 | Fuji Xerox Co Ltd | Document browsing device and document browsing program |
US8018431B1 (en) * | 2006-03-29 | 2011-09-13 | Amazon Technologies, Inc. | Page turner for handheld electronic book reader device |
US7864163B2 (en) * | 2006-09-06 | 2011-01-04 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying structured electronic documents |
KR100861353B1 (en) | 2006-09-11 | 2008-10-02 | 엘지전자 주식회사 | Navigation system with multifunction and opertation method thereof |
KR101335771B1 (en) | 2007-01-19 | 2013-12-02 | 엘지전자 주식회사 | Electronic Device With Touch Screen And Method Of Inputting Information Using Same |
US7475487B1 (en) * | 2008-02-14 | 2009-01-13 | Parchmint, Inc. | Height recording album |
KR101509245B1 (en) * | 2008-07-31 | 2015-04-08 | 삼성전자주식회사 | User interface apparatus and method for using pattern recognition in handy terminal |
US20100162165A1 (en) * | 2008-12-22 | 2010-06-24 | Apple Inc. | User Interface Tools |
-
2009
- 2009-01-19 US US12/356,045 patent/US8319736B2/en active Active
- 2009-12-22 JP JP2011546252A patent/JP5377665B2/en active Active
- 2009-12-22 WO PCT/US2009/069321 patent/WO2010083012A2/en active Application Filing
- 2009-12-22 KR KR1020117016518A patent/KR101663849B1/en active IP Right Grant
- 2009-12-22 RU RU2011129799/08A patent/RU2011129799A/en not_active Application Discontinuation
- 2009-12-22 EP EP09838574.3A patent/EP2380075B1/en active Active
- 2009-12-22 CN CN200980155304.2A patent/CN102282533B/en active Active
Also Published As
Publication number | Publication date |
---|---|
EP2380075A4 (en) | 2012-08-22 |
JP2012515396A (en) | 2012-07-05 |
KR20110113616A (en) | 2011-10-17 |
WO2010083012A3 (en) | 2010-10-14 |
JP5377665B2 (en) | 2013-12-25 |
US8319736B2 (en) | 2012-11-27 |
EP2380075A2 (en) | 2011-10-26 |
WO2010083012A2 (en) | 2010-07-22 |
US20100182246A1 (en) | 2010-07-22 |
RU2011129799A (en) | 2013-01-27 |
KR101663849B1 (en) | 2016-10-07 |
CN102282533B (en) | 2015-06-17 |
CN102282533A (en) | 2011-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2380075B1 (en) | Touch sensitive computing device and method | |
CN101334706B (en) | Text input window with auto-growth | |
US7002560B2 (en) | Method of combining data entry of handwritten symbols with displayed character data | |
CN106415446B (en) | Accessibility detection of content attributes through haptic interaction | |
US9274704B2 (en) | Electronic apparatus, method and storage medium | |
US20040021647A1 (en) | Enhanced on-object context menus | |
US20150123988A1 (en) | Electronic device, method and storage medium | |
US20130300675A1 (en) | Electronic device and handwritten document processing method | |
Kristensson et al. | Command strokes with and without preview: using pen gestures on keyboard for command selection | |
US20160140387A1 (en) | Electronic apparatus and method | |
CN102722476A (en) | A method and device for marking electronic documents | |
CN102385475B (en) | Electronic device and interactive method thereof | |
JP6426417B2 (en) | Electronic device, method and program | |
US8938123B2 (en) | Electronic device and handwritten document search method | |
US7562314B2 (en) | Data processing apparatus and method | |
WO2015136645A1 (en) | Electronic device, method, and program | |
WO2014147719A1 (en) | Electronic device, and method for processing handwritten document | |
US20150098653A1 (en) | Method, electronic device and storage medium | |
US20150026179A1 (en) | Electronic device and method for processing clips of documents | |
US20140222825A1 (en) | Electronic device and method for searching handwritten document | |
US8624837B1 (en) | Methods and apparatus related to a scratch pad region of a computing device | |
US20110055692A1 (en) | Digital media device and method for managing data for thereon | |
JP5596068B2 (en) | Electronic terminal and book browsing program | |
JP2010165120A (en) | Device and method for displaying electronic information | |
US20130339346A1 (en) | Mobile terminal and memo search method for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20110707 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20120724 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/048 20060101AFI20120718BHEP |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20170803 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20200110 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1279714 Country of ref document: AT Kind code of ref document: T Effective date: 20200615 Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602009062235 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200911 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200910 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20200610 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200910
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1279714 Country of ref document: AT Kind code of ref document: T Effective date: 20200610 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201012
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201010
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602009062235 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
26N | No opposition filed |
Effective date: 20210311 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20201231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20201222
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20201222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20201231
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20201231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200610 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20201231 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230501 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20231121 Year of fee payment: 15 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20231122 Year of fee payment: 15
Ref country code: DE Payment date: 20231121 Year of fee payment: 15 |