WO2005122130B1 - User created interactive interface - Google Patents

User created interactive interface

Info

Publication number
WO2005122130B1
Authority
WO
WIPO (PCT)
Prior art keywords
recognizing
stylus
options
computer code
audio output
Prior art date
Application number
PCT/US2005/017883
Other languages
French (fr)
Other versions
WO2005122130A2 (en)
WO2005122130A3 (en)
Inventor
James Marggraff
Alex Chisholm
Tc Edgecomb
Nathaniel A Fast
Original Assignee
Leapfrog Entpr Inc
James Marggraff
Alex Chisholm
Tc Edgecomb
Nathaniel A Fast
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/861,243 external-priority patent/US20060033725A1/en
Application filed by Leapfrog Entpr Inc, James Marggraff, Alex Chisholm, Tc Edgecomb, Nathaniel A Fast filed Critical Leapfrog Entpr Inc
Priority to CA002527240A priority Critical patent/CA2527240A1/en
Priority to JP2006525552A priority patent/JP2007504565A/en
Priority to EP05753583A priority patent/EP1665222A4/en
Publication of WO2005122130A2 publication Critical patent/WO2005122130A2/en
Publication of WO2005122130A3 publication Critical patent/WO2005122130A3/en
Publication of WO2005122130B1 publication Critical patent/WO2005122130B1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus

Abstract

An interactive apparatus (100) includes a stylus housing (62), a processor (32) coupled to the stylus housing (62), and a memory unit (48) comprising (i) computer code for recognizing a plurality of graphic elements created using a stylus, (ii) computer code for recognizing the selection of at least two of the graphic elements in a user defined sequence using the stylus, and (iii) computer code for playing at least one audio output that relates to the formed graphic elements, and an audio output device (36).
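As a non-normative illustration of the behavior the abstract describes (and not the patented implementation itself), the apparatus recognizes user-drawn graphic elements, tracks the user-defined order in which they are selected, and plays related audio. All class and method names below are hypothetical:

```python
# Hypothetical sketch of the abstract's behavior: a stylus device that
# recognizes drawn graphic elements, records the order in which the user
# taps them, and produces related audio output. Names are illustrative.

class InteractiveStylus:
    def __init__(self):
        self.elements = {}         # element id -> label (e.g. "calculator")
        self.selection_order = []  # user-defined selection sequence

    def recognize_element(self, element_id, label):
        """Register a graphic element drawn with the stylus."""
        self.elements[element_id] = label

    def select(self, element_id):
        """Record a stylus tap on a previously drawn element."""
        if element_id in self.elements:
            self.selection_order.append(element_id)

    def audio_for_selection(self):
        """Return the audio text for the selected elements, in order."""
        return " ".join(self.elements[e] for e in self.selection_order)

device = InteractiveStylus()
device.recognize_element(1, "calculator")
device.recognize_element(2, "games")
device.select(2)
device.select(1)
print(device.audio_for_selection())  # "games calculator"
```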

Claims

AMENDED CLAIMS received by the International Bureau on 15 August 2006 (15.08.06)
1. A method comprising: creating a graphic element using a stylus; listening to an audio recitation of at least one menu item in a plurality of menu items after creating the graphic element; and selecting a menu item from the plurality of menu items.
2. A method comprising: recognizing a graphic element, the graphic element created by a handheld device; generating an audio recitation of at least one menu item out of a plurality of menu items after recognition of the graphic element; and recognizing a selection of a menu item from the plurality of menu items upon a subsequent actuation of the handheld device, the actuation being related to the graphic element.
3. The method of claim 1 or 2 wherein the stylus or handheld device is in the form of an interactive apparatus comprising a processor, an emitter, a detector, and a speaker, wherein the emitter, detector, and the speaker are operatively coupled to the processor.
4. The method of claim 1 or 2 wherein the graphic element is on a printable surface.
5. The method of claim 1 or 2 wherein the graphic element is a print element and wherein the stylus or handheld device comprises an antenna.
6. The method of claim 4 wherein the printable surface is a sheet of paper.
7. The method of claim 1 or 2 wherein the graphic element includes a symbol.
8. The method of claim 1 or 2 wherein the graphic element includes a symbol and a line circumscribing the symbol.
9. The method of claim 1 or 2 wherein after the user selects the menu item or the selection of the menu item is recognized, a speech synthesizer operatively associated with the stylus or handheld device audibly recites instructions for creating additional graphic elements.
10. The method of claim 1 or 2 wherein the plurality of menu items include at least one of a calculator menu item, a reference menu item, and a games menu item.
11. An interactive apparatus comprising: a stylus housing; a processor coupled to the stylus housing; a memory unit comprising (i) computer code for recognizing a graphic element created by the user using the stylus, (ii) computer code for playing an audio recitation of at least one menu item in a plurality of menu items after the graphic element is created by the user, and (iii) computer code for recognizing a user selection of a menu item from the plurality of menu items; and an audio output device, wherein the audio output device and the memory unit are operatively coupled to the processor.
12. An interactive apparatus comprising: a handheld device housing; a processor coupled to the handheld device housing; a memory unit comprising:
(i) computer code for recognizing a graphic element created by the handheld device,
(ii) computer code for playing an audio recitation of at least one menu item in a plurality of menu items after the graphic element is created by the user; and
(iii) computer code for recognizing a user selection of a menu item from the plurality of menu items; and an audio output device, wherein the audio output device and the memory unit are operatively coupled to the processor.
13. The interactive apparatus of claim 11 or 12 wherein the processor, the audio output device, and the memory unit are in the stylus or handheld device housing or wherein the processor, the audio output device and the memory are in a platform that is coupled to the stylus or handheld housing.
14. The interactive apparatus of claim 11 or 12 wherein the processor, the memory unit, and the audio output device are all in the stylus or handheld device housing, and wherein the stylus or handheld device housing further comprises an optical emitter and an optical detector coupled to the processor.
15. The interactive apparatus of claim 11 or 12 wherein the graphic elements include letters or numbers.
16. The interactive apparatus of claim 11 or 12 wherein the memory unit comprises computer code for recognizing substantially invisible codes on an article.
17. The interactive apparatus of claim 11 or 12 wherein the memory unit comprises computer code for recognizing substantially invisible codes on an article, and wherein the substantially invisible codes are in the form of dot codes.
18. The interactive apparatus of claim 11 or 12 wherein the memory unit comprises computer code for recognizing substantially invisible codes on an article, and wherein the substantially invisible codes are in the form of dot codes that encode a relative or absolute position.
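As a toy illustration of dot codes that encode an absolute position (claim 18), a code value can be mapped one-to-one to a grid coordinate. Real dot-pattern products use far denser, error-tolerant encodings; this grid scheme and every name in it are hypothetical:

```python
# Toy illustration of a dot code encoding an absolute paper position.
# A real pattern is optically read as dots; here the "code" is just an
# integer that invertibly packs an (x, y) grid coordinate.

GRID_WIDTH = 1000  # hypothetical number of code cells per row

def encode_position(x, y):
    """Map an (x, y) paper coordinate to a single code value."""
    return y * GRID_WIDTH + x

def decode_position(code):
    """Recover the (x, y) coordinate from a code value."""
    return code % GRID_WIDTH, code // GRID_WIDTH

code = encode_position(417, 23)
assert decode_position(code) == (417, 23)
```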
19. A system comprising:
(a) the interactive apparatus of claim 18; and
(b) an article including the substantially invisible codes.
20. A system comprising:
(a) the interactive apparatus of claim 18; and
(b) an article including the substantially invisible codes, wherein the article includes a sheet of paper.
21. A system comprising:
(a) the interactive apparatus of claim 18; and
(b) an article including the substantially invisible codes, wherein the article includes a sheet of paper that is free of any pre-printing.
22. A system comprising:
(a) the interactive apparatus of claim 18; and
(b) an article including the substantially invisible codes, wherein the article includes a sheet of paper that includes pre-printing.
23. A system comprising:
(a) an interactive apparatus comprising a stylus housing, a processor coupled to the stylus housing, a speech synthesizer, a memory unit comprising (i) computer code for allowing a user to create a graphic element using the stylus, (ii) computer code for playing an audio recitation of at least one menu item in a plurality of menu items after creating the graphic element, (iii) computer code for allowing a user to select a menu item from the plurality of menu items, and (iv) computer code for recognizing substantially invisible codes on an article, and wherein the substantially invisible codes are in the form of dot codes that encode a relative or absolute position, and an audio output device, wherein the speech synthesizer, the audio output device and the memory unit are operatively coupled to the processor, and wherein the speech synthesizer, the audio output device, the processor and the memory unit are in the stylus housing; and
(b) the article.
24. A system comprising:
(a) an interactive apparatus comprising a stylus housing, a processor coupled to the stylus housing, a speech synthesizer, a memory unit comprising (i) computer code for allowing a user to create a graphic element using the stylus, (ii) computer code for playing an audio recitation of at least one menu item in a plurality of menu items after creating the graphic element, (iii) computer code for allowing a user to select a menu item from the plurality of menu items, and (iv) computer code for recognizing substantially invisible codes on an article, and wherein the substantially invisible codes are in the form of dot codes that encode a relative or absolute position, and an audio output device, wherein the speech synthesizer, the audio output device and the memory unit are operatively coupled to the processor, and wherein the speech synthesizer, the audio output device, the processor and the memory unit are in the stylus housing.
25. The system of claim 23 or 24 wherein the substantially invisible codes are dot codes.
26. The system of claim 23 or 24 wherein the memory unit comprises computer code for causing the interactive apparatus to recite the menu items after each sequential selection of the graphic element.
27. The system of claim 23 or 24 wherein the substantially invisible codes relate to the absolute positions on the article.
28. The system of claim 23 or 24 wherein there is an article and/or the article includes a sheet of paper that is free of pre-printed print elements.
29. The system of claim 23 or 24 wherein there is an article and/or the article includes a sheet of paper that includes pre-printed print elements.
30. The system of claim 23 or 24 wherein the graphic element includes one selected from the group consisting of at least one indicium, and a combination of at least one indicium and a line circumscribing the at least one indicium.
31. A method comprising: forming a plurality of graphic elements using a stylus; selecting at least two of the graphic elements in a user defined sequence using the stylus; and listening to at least one audio output that relates to the formed graphic elements.
32. A method comprising: recognizing a plurality of created graphic elements on a surface; recognizing a selection of at least one of the graphic elements, the selection implemented by a stylus upon an actuation of the stylus related to the at least one graphic element; accessing a function related to the at least one graphic element; and providing at least one audio output in accordance with the function.
33. The method of claim 31 or 32 wherein the plurality of graphic elements comprise a plurality of numbers and mathematical operators, and wherein selecting or recognizing the selection comprises selecting or recognizing the selection of a first number, a first mathematical operator, a second number, and a second mathematical operator, wherein the first number, the first mathematical operator, and the second mathematical operator together form a math problem, and wherein the at least one audio output that relates to the selected first number, first mathematical operator, and the second mathematical operator comprises the answer to the math problem.
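The calculator interaction of claim 33 (tap a first number, an operator, a second number, and a second operator such as "=", then hear the answer) can be sketched as follows. This is an illustrative model only; the operator symbols and function names are assumptions:

```python
# Hypothetical sketch of the claimed calculator interaction: the four
# taps form a math problem, and the device "speaks" the answer.

import operator

OPS = {"+": operator.add, "-": operator.sub,
       "x": operator.mul, "/": operator.truediv}

def answer(selections):
    """selections: e.g. ["4", "+", "7", "="], tapped in that order."""
    first, op, second, terminator = selections
    if terminator != "=":
        raise ValueError("expected '=' as the second operator")
    return OPS[op](int(first), int(second))

print(answer(["4", "+", "7", "="]))  # 11
```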
34. The method of claim 31 or 32 wherein the stylus comprises an emitter, a detector, a processor, and a speaker, wherein the emitter, detector, and the speaker are coupled to the processor, and wherein the stylus is coupled to a platform, which supports a sheet upon which the graphic elements are formed.
35. The method of claim 31 or 32 wherein the graphic elements comprise letters.
36. The method of claim 31 or 32 wherein the graphic elements comprise a first graphic element comprising a name of a language and a second graphic element comprising a word that is in a language that is different than the language, and wherein selecting or recognizing the selection includes selecting or recognizing the selection of the word and then selecting or recognizing the selection of the name of the language, and wherein at least one audio output includes a synthesized voice audibly rendering the word in the language.
37. The method of claim 36 wherein the language is a non-English language and wherein the word is in English.
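The translation interaction of claims 36 and 37 (select a written word, then select the written name of a language, and hear the word rendered in that language) can be modeled with a lookup table. The tiny dictionary and function name here are purely illustrative:

```python
# Hypothetical sketch of the claimed translation interaction: the user
# taps a word, then taps the name of a target language, and the device
# audibly renders the translation.

DICTIONARY = {
    ("hello", "Spanish"): "hola",
    ("hello", "French"): "bonjour",
}

def translate(word, language):
    """Return the audio text for `word` rendered in `language`.

    Falls back to the original word if no translation is known.
    """
    return DICTIONARY.get((word.lower(), language), word)

print(translate("hello", "Spanish"))  # hola
```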
38. The method of claim 31 or 32 wherein the stylus comprises a writing element, and wherein the graphic elements are user created graphic elements on a sheet and are generated in conjunction with the stylus.
39. The method of claim 38 wherein the sheet includes a plurality of substantially invisible codes.
40. The method of claim 39 wherein the sheet is free of pre-printed print elements.
41. An interactive apparatus comprising: a stylus housing; a processor coupled to the stylus housing; a memory unit comprising (i) computer code for recognizing a plurality of graphic elements created using a stylus, (ii) computer code for recognizing the selection of at least two of the graphic elements in a user defined sequence using the stylus, and (iii) computer code for playing at least one audio output that relates to the formed graphic elements; and an audio output device, wherein the audio output device and the memory unit are operatively coupled to the processor.
42. An interactive apparatus comprising: a device housing; a processor coupled to the device housing; a memory unit comprising
(i) computer code for recognizing a plurality of graphic elements created using a device,
(ii) computer code for recognizing the selection of at least two of the graphic elements in a user defined sequence using the device, and
(iii) computer code for playing at least one audio output that relates to the formed graphic elements; and an audio output device, wherein the audio output device and the memory unit are operatively coupled to the processor.
43. The interactive apparatus of claim 41 or 42 wherein the stylus or device comprises a writing element.
44. The interactive apparatus of claim 41 or 42 wherein the processor, the memory unit and the audio output device are in the stylus or device housing.
45. The interactive apparatus of claim 41 or 42 wherein the memory unit further comprises computer code for recognizing substantially invisible codes printed on an article.
46. The interactive apparatus of claim 41 or 42 wherein the memory unit further comprises computer code for recognizing substantially invisible codes printed on an article, wherein the substantially invisible codes comprise dot codes.
47. The interactive apparatus of claim 41 or 42 wherein the apparatus further comprises a platform and wherein the memory, the processor, and the audio output device are in the platform wherein the graphic elements comprise numbers and wherein the memory unit further comprises code for calculating numbers.
48. The interactive apparatus of claim 41 or 42 wherein the interactive apparatus comprises a writing element that is retractable.
49. The interactive apparatus of claim 41 or 42 wherein the memory unit further comprises computer code for teaching about at least one of letters, numbers, and phonics.
50. The interactive apparatus of claim 41 or 42 wherein the memory unit comprises computer code for causing a synthesized voice to recite a plurality of menu items.
51. A system comprising: an interactive apparatus comprising a stylus housing, a processor coupled to the stylus housing, a memory unit comprising (i) computer code for recognizing a plurality of graphic elements created using a stylus, (ii) computer code for recognizing the selection of at least two of the graphic elements in a user defined sequence using the stylus, and (iii) computer code for playing at least one audio output that relates to the formed graphic elements, and an audio output device, wherein the audio output device and the memory unit are operatively coupled to the processor; and an article upon which the graphic elements are created.
52. A system comprising: an interactive device comprising a device housing, a processor coupled to the housing, a memory unit comprising (i) computer code for recognizing a plurality of graphic elements created using a device, (ii) computer code for recognizing the selection of at least two of the graphic elements in a user defined sequence using the device, and (iii) computer code for playing at least one audio output that relates to the formed graphic elements, and an audio output device, wherein the audio output device and the memory unit are operatively coupled to the processor.
53. The system of claim 52 further comprising an article upon which the graphic elements are created.
54. The system of claim 51 or 53 wherein the article comprises a sheet of paper and wherein the sheet of paper includes a plurality of substantially invisible codes.
55. The system of claim 54 wherein the substantially invisible codes are dot codes.
56. The system of claim 51 or 53 wherein the article comprises a sheet of paper and wherein the sheet of paper includes a plurality of substantially invisible codes wherein the substantially invisible codes include relative or absolute position information.
57. The system of claim 51 or 53 wherein the article comprises a sheet of paper and wherein the sheet of paper includes a plurality of substantially invisible codes, wherein the codes are dot codes, and wherein the sheet of paper is substantially free of pre-printed print elements.
58. The system of claim 51 or 52 wherein the processor, the audio output device, and the memory unit are in the stylus or device housing and wherein the interactive apparatus or device is in the form of a self-contained device.
59. The system of claim 51 wherein the memory unit comprises computer code for a plurality of menu items.
60. The system of claim 51 wherein the memory unit includes computer code for an English-foreign language dictionary.
61. A method for interpreting user commands, comprising: recognizing a created graphical element on a surface; accessing a function related to the graphical element; providing an output in accordance with the function; and associating the function with the graphical element.
62. The method of Claim 61, wherein the output comprises an audio output related to the function.
63. The method of Claim 61, further comprising: enabling a subsequent access of the function in response to a subsequent selection of the graphical element by storing the association of the function with the graphical element.
64. The method of Claim 63, wherein the storing of the association of the function with the graphical element implements a persistent availability of the function, for a predetermined amount of time, via interaction with the graphical element.
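Claims 61 through 64 describe associating a function with a drawn graphical element and storing that association so later selections of the same element re-invoke the function, with the stored binding persisting for a predetermined amount of time. A minimal sketch, with all names and the time-to-live policy assumed for illustration:

```python
# Hypothetical sketch of claims 61-64: a registry that binds a function
# to a graphical element and keeps the binding available for a
# predetermined amount of time, so later taps re-invoke the function.

import time

class FunctionRegistry:
    def __init__(self, ttl_seconds=3600.0):
        self.ttl = ttl_seconds  # the "predetermined amount of time"
        self.bindings = {}      # element id -> (function, bound_at)

    def associate(self, element_id, fn):
        """Store the association of a function with a graphical element."""
        self.bindings[element_id] = (fn, time.monotonic())

    def invoke(self, element_id):
        """Re-invoke the stored function if the binding has not expired."""
        fn, bound_at = self.bindings[element_id]
        if time.monotonic() - bound_at > self.ttl:
            raise KeyError("binding expired")
        return fn()

registry = FunctionRegistry()
registry.associate("menu-1", lambda: "time is 3:30 pm")
print(registry.invoke("menu-1"))  # time is 3:30 pm
```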
65. The method of Claim 61, wherein the graphical element is created by a pen device on the surface.
66. The method of Claim 65, wherein the surface comprises a sheet of paper.
67. The method of Claim 61, further comprising: accessing one of a plurality of functions related to the graphical element by interpreting at least one actuation of the graphical element, wherein the at least one actuation selects the one of the plurality of functions.
68. The method of Claim 67, wherein the at least one actuation comprises recognizing at least one tap of the graphical element.
69. The method of Claim 67, further comprising: providing one of a plurality of audio outputs when the one of the plurality of functions is selected.
70. The method of Claim 67, wherein the plurality of functions comprises a predetermined menu of options.
71. The method of Claim 67, wherein the plurality of functions comprises a plurality of configuration options of an application related to the graphical element.
72. The method of Claim 71, wherein at least one of the plurality of configuration options comprises a default configuration of the application.
73. The method of Claim 71, further comprising: implementing a hierarchy of functions; and providing access to the hierarchy of functions via a corresponding hierarchy of graphical elements.
74. The method of Claim 73, further comprising: recognizing at least one actuation of the graphical element to select a first hierarchical level function; prompting the creation of a second graphical element; recognizing at least one actuation of the second graphical element to select a second hierarchical level function; providing an audio output related to the second hierarchical level function; and associating the second hierarchical level function with the second graphical element.
75. A method of interacting with a handheld device, said method comprising: recognizing selection of a first graphical icon on a writable surface, said selection performed using a writing instrument of said handheld device; in response to said selection, audibly rendering a listing of first options associated with said graphical icon wherein said first options are operable to be invoked by said handheld device; and in response to a selection of one of said first options, invoking said one of said first options.
76. A method as described in Claim 75 wherein said first options comprise at least one application to be invoked.
77. A method as described in Claim 75 wherein said one of said first options is an application program resident on said handheld device.
78. A method as described in Claim 75 wherein said audibly rendering said listing of said first options comprises audibly rendering, one at a time, each of said first options in a round-robin fashion, in response to selections of said first graphical icon by said writing instrument.
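The round-robin recitation of claim 78 (each tap on the icon advances to, and speaks, the next option, wrapping around after the last) can be sketched as a cycling cursor over the option list. The class and method names are illustrative:

```python
# Hypothetical sketch of round-robin menu recitation: each tap on the
# menu icon returns the next option's spoken text, wrapping around.

class RoundRobinMenu:
    def __init__(self, options):
        self.options = options
        self.index = -1  # no option recited yet

    def tap(self):
        """Advance to the next option and return its spoken text."""
        self.index = (self.index + 1) % len(self.options)
        return self.options[self.index]

menu = RoundRobinMenu(["calculator", "dictionary", "games"])
print(menu.tap())  # calculator
print(menu.tap())  # dictionary
print(menu.tap())  # games
print(menu.tap())  # calculator (wraps around)
```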
79. A method as described in Claim 78 further comprising identifying a selection of said one of said first options in response to said writing instrument selecting a portion of said first graphical icon after said one of said first options is audibly rendered.
80. A method as described in Claim 79 wherein said portion of said first graphical icon is a symbol of a checkmark.
81. A method as described in Claim 79 wherein said selecting said portion comprises recognizing a gesture made by a user with said handheld device.
82. A method as described in Claim 75 wherein said first graphical icon is user written on said surface and further comprising automatically identifying said first graphical icon and wherein said automatically identifying said first graphical icon is performed using a processor of said handheld device.
83. A method as described in Claim 75 wherein said first graphical icon is pre-printed on said surface.
84. A method as described in Claim 75 wherein said first graphical icon is a menu item and wherein said first options are submenu items within a hierarchy of options operable to be invoked by said handheld device.
85. A method as described in Claim 75 wherein said first options comprise an option having an associated second graphical icon and further comprising: recognizing selection of said second graphical icon on said writable surface, said selection performed using said writing instrument of said handheld device; in response to said selection, audibly rendering a listing of second options associated with said second graphical icon wherein said second options are operable to be invoked by said handheld device; and in response to a selection of one of said second options, invoking said one of said second options.
86. A method as described in Claim 85 wherein said second options comprise at least one application to be invoked.
87. A method as described in Claim 85 wherein said one of said second options is an application program resident on said handheld device.
88. A method as described in Claim 85 wherein said audibly rendering said listing of said second options comprises audibly rendering, one at a time, each of said second options in a round-robin fashion, in response to selections of said second graphical icon by said writing instrument.
89. A method as described in Claim 88 further comprising identifying selection of said one of said second options by responding to said writing instrument selecting a portion of said second graphical icon after said one of said second options is audibly rendered.
90. A method as described in Claim 85 wherein said second graphical icon is user written on said surface and further comprising automatically identifying said second graphical icon and wherein said automatically identifying said second graphical icon is performed using a processor of said handheld device.
91. A method as described in Claim 75 wherein said one of said first options comprises a text recognition function wherein said handheld device is configured to recognize the end of a written word by recognizing the user tapping the last character of the word.
92. A method as described in Claim 75 wherein said one of said first options comprises a text recognition function wherein said handheld device is configured to recognize the end of a written word by recognizing the user drawing a box or circle around the word.
93. A method as described in Claim 75 wherein said one of said first options comprises a dictionary function wherein said handheld device is configured to recognize a user written word and audibly render a definition related to said user written word.
94. A method as described in Claim 75 wherein said one of said first options comprises a calculator function wherein said handheld device is configured to recognize a plurality of user written graphic elements, and wherein the plurality of graphic elements comprise a plurality of numbers and mathematical operators, and wherein said handheld device is configured to recognize the selection of a first number, a first mathematical operator, a second number, and a second mathematical operator, wherein the first number, the first mathematical operator, and the second mathematical operator together form a math problem, and audibly render at least one audio output that comprises the answer to the math problem.
95. A method as described in Claim 75 wherein said one of said first options comprises a translator function wherein said handheld device is configured to recognize a plurality of user written graphic elements, and wherein a first graphic element comprises a name of a language and a second graphic element comprises a word that is in a language that is different than the language, and wherein said handheld device is configured to recognize the selection of the word and to recognize the selection of the name of the language and audibly render the word in the language.
96. A method as described in Claim 75 wherein said one of said first options comprises a word scramble function wherein said handheld device is configured to recognize a plurality of user written graphic elements comprising words of a sentence, and wherein said handheld device is configured to recognize the sequential selection of the words and to audibly render the sentence upon a successful sequential selection of the words of the sentence.
97. A method as described in Claim 75 wherein said one of said first options comprises an alarm clock function wherein said handheld device is configured to recognize a user written alarm time and audibly render an alarm related to said user written alarm time.
98. A method as described in Claim 85 wherein said one of said first options comprises a phone list function, and wherein said audibly rendered listing of said second options comprises accessing a phone number, adding a phone number, or deleting a phone number, and in response to a selection of one of said second options, invoking said one of said second options of said phone list function.
99. A method as described in Claim 75 wherein said handheld device comprises a processor in communication with a remote computer system external to the handheld device.
100. A method as described in Claim 99 wherein said remote computer system is a server and said processor uses wireless communication to interact with said server.
PCT/US2005/017883 2004-06-03 2005-05-20 User created interactive interface WO2005122130A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CA002527240A CA2527240A1 (en) 2004-06-03 2005-05-20 User created interactive interface
JP2006525552A JP2007504565A (en) 2004-06-03 2005-05-20 Interactive interface created by the user
EP05753583A EP1665222A4 (en) 2004-06-03 2005-05-20 User created interactive interface

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US57737804P 2004-06-03 2004-06-03
US10/861,243 2004-06-03
US10/861,243 US20060033725A1 (en) 2004-06-03 2004-06-03 User created interactive interface
US60/577,378 2004-06-03

Publications (3)

Publication Number Publication Date
WO2005122130A2 WO2005122130A2 (en) 2005-12-22
WO2005122130A3 WO2005122130A3 (en) 2006-08-10
WO2005122130B1 true WO2005122130B1 (en) 2006-09-28

Family

ID=35503812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/017883 WO2005122130A2 (en) 2004-06-03 2005-05-20 User created interactive interface

Country Status (5)

Country Link
EP (1) EP1665222A4 (en)
JP (1) JP2007504565A (en)
KR (1) KR100805259B1 (en)
CA (1) CA2527240A1 (en)
WO (1) WO2005122130A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100807307B1 (en) * 2006-07-10 2008-02-28 한국전자통신연구원 Spoken dialog system for human computer interface and response method therein
KR101116689B1 (en) * 2010-02-18 2012-06-12 주식회사 네오랩컨버전스 Apparatus and method for outputting an information based on dot-code using gesture recognition

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4337375A (en) * 1980-06-12 1982-06-29 Texas Instruments Incorporated Manually controllable data reading apparatus for speech synthesizers
SE9202086D0 (en) * 1992-07-03 1992-07-03 Ericsson Telefon Ab L M DEVICE TO SIMPLIFY ORDERING TELEPHONE SERVICES
US5546565A (en) * 1993-06-21 1996-08-13 Casio Computer Co., Ltd. Input/output apparatus having a pen, and method of associating and processing handwritten image data and voice data
US6218964B1 (en) 1996-09-25 2001-04-17 Christ G. Ellis Mechanical and digital reading pen
US6456749B1 (en) * 1998-02-27 2002-09-24 Carnegie Mellon University Handheld apparatus for recognition of writing, for remote communication, and for user defined input templates
JP2002513969A (en) * 1998-05-07 2002-05-14 エイアールティー−アドヴァンスト・レコグニション・テクノロジーズ・リミテッド Handwriting and voice control of vehicle components
JP2001005599A (en) * 1999-06-22 2001-01-12 Sharp Corp Information processor, information processing method, and recording medium recording information processing program
WO2001016691A1 (en) * 1999-08-30 2001-03-08 Anoto Ab Notepad
BR0016683A (en) * 1999-12-23 2002-09-03 Anoto Ab Expense Card Purchase
US7295193B2 (en) * 1999-12-23 2007-11-13 Anoto Ab Written command
US6968311B2 (en) * 2000-07-28 2005-11-22 Siemens Vdo Automotive Corporation User interface for telematics systems
CN1518730A (en) * 2001-06-20 2004-08-04 跳蛙企业股份有限公司 Interactive apparatus using print media
US7202861B2 (en) * 2001-06-25 2007-04-10 Anoto Ab Control of a unit provided with a processor
JP2003131785A (en) * 2001-10-22 2003-05-09 Toshiba Corp Interface device, operation control method and program product
JP2003216630A (en) * 2002-01-23 2003-07-31 Canon Inc Menu control method, device and media
WO2003094489A1 (en) * 2002-04-29 2003-11-13 Nokia Corporation Method and system for rapid navigation in aural user interface
US20040229195A1 (en) * 2003-03-18 2004-11-18 Leapfrog Enterprises, Inc. Scanning apparatus

Also Published As

Publication number Publication date
WO2005122130A2 (en) 2005-12-22
EP1665222A4 (en) 2008-01-16
CA2527240A1 (en) 2005-12-22
EP1665222A2 (en) 2006-06-07
KR20060040612A (en) 2006-05-10
JP2007504565A (en) 2007-03-01
KR100805259B1 (en) 2008-02-20
WO2005122130A3 (en) 2006-08-10

Similar Documents

Publication Publication Date Title
CN100543835C (en) Ink correction pad
US20060033725A1 (en) User created interactive interface
US8633907B2 (en) Touch screen overlay for visually impaired persons
KR100815534B1 (en) Providing a user interface having interactive elements on a writable surface
CN105573503B (en) For receiving the method and system of the text input on touch-sensitive display device
US7719521B2 (en) Navigational interface providing auxiliary character support for mobile and wearable computers
KR100847851B1 (en) Device user interface through recognized text and bounded areas
KR100814052B1 (en) A method and device for associating a user writing with a user-writable element
CN101164054A (en) Auto-suggest lists and handwritten input
JP2013515295A (en) Data input system and method
US20020180797A1 (en) Method for a high-speed writing system and high-speed writing device
US20100306718A1 (en) Apparatus and method for unlocking a locking mode of portable terminal
JP2005536807A (en) Universal display keyboard, system, and method
CN102141889A (en) Typing assistance for editing
EP2183740A2 (en) Improved data entry system
JP2016061855A (en) Audio learning device and control program
AU2019236601A1 (en) Character Input Device
KR20060109289A (en) Data input panel character conversion
US9250710B2 (en) User interface for a hand held device
WO2005122130B1 (en) User created interactive interface
KR20030022784A (en) Method and apparatus for editing images representing ideas
KR101845780B1 (en) Method for providing sign image search service and sign image search server used for same
CN100511413C (en) User created interactive interface
KR101782109B1 (en) Apparatus and method for keeping of goods
JP2002318655A (en) Telephone set with character input function and character input program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2527240

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2006525552

Country of ref document: JP

AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

WWE Wipo information: entry into national phase

Ref document number: 2005753583

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 20058004425

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 1020057025340

Country of ref document: KR

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWP Wipo information: published in national office

Ref document number: 1020057025340

Country of ref document: KR

WWP Wipo information: published in national office

Ref document number: 2005753583

Country of ref document: EP

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE