WO2006076118B1 - Interactive device and method - Google Patents

Interactive device and method

Info

Publication number
WO2006076118B1
Authority
WO
WIPO (PCT)
Prior art keywords
user
recited
function
selectable item
written selectable
Prior art date
Application number
PCT/US2005/045342
Other languages
French (fr)
Other versions
WO2006076118A2 (en)
WO2006076118A3 (en)
Inventor
James Marggraff
Original Assignee
Leapfrog Entpr Inc
James Marggraff
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/034,491 (US7831933B2)
Priority claimed from US11/264,955 (US7853193B2)
Application filed by Leapfrog Entpr Inc, James Marggraff
Priority to CA002535505A (CA2535505A1)
Publication of WO2006076118A2
Publication of WO2006076118A3
Publication of WO2006076118B1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

A method and device (FIG. 1) for audibly instructing a user to interact with a function. Interactive device (100) includes processor (112), memory unit (114), audio output device (116), writing element (118) and optical detector (120) within housing (130). In one embodiment, in response to recognizing a particular user-written selectable item as being associated with a particular function, the interactive device audibly renders an instructional message related to the operation of the function. In another embodiment, in response to recognizing a particular user-written selectable item as being associated with a particular function, the interactive device (FIG. 1) immediately executes the function.
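The abstract describes a pen-form device whose processor dispatches a recognized user-written selectable item either to an audible instructional message or to immediate execution of the associated function. The following is a minimal, hypothetical Python sketch of that dispatch; every class, field, and method name here is an assumption for illustration and does not come from the patent.

```python
# Hypothetical sketch only: none of these identifiers appear in the patent.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Function:
    """A function that can be associated with a user-written selectable item."""
    instruction: str            # first instructional message
    hint: str                   # hint for an improperly drawn user interface element
    confirmation: str           # message played once the element is drawn properly
    action: Callable[[], None]  # the behavior to execute


class InteractiveDevice:
    """Very high-level model of the components named in the abstract."""

    def __init__(self, audio_out, functions: Dict[str, Function],
                 instruct_first: bool = True):
        self.audio_out = audio_out            # audio output device (116)
        self.functions = functions            # recognized label -> associated function
        self.instruct_first = instruct_first  # selects between the two embodiments

    def on_item_recognized(self, label: str) -> None:
        """Dispatch once a user-written selectable item has been recognized."""
        func = self.functions.get(label)
        if func is None:
            return
        if self.instruct_first:
            # First embodiment: audibly render an instructional message,
            # with no further interaction required from the user.
            self.audio_out.say(func.instruction)
        else:
            # Second embodiment: immediately execute the associated function.
            func.action()
```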

Claims

AMENDED CLAIMS Received by the International Bureau on 11 December 2006 (11.12.06)
1. A method for audibly instructing a user to interact with a function, said method comprising: associating a function with a user-written selectable item; recognizing said user-written selectable item on a surface; and in response to said recognizing said user-written selectable item, audibly rendering a first instructional message related to the operation of said function without requiring further interaction from said user, wherein said audibly rendering said first instructional message is performed by an audio output device of a pen computer.
2. The method as recited in claim 1 wherein said first instructional message directs said user to draw at least one user interface element that enables said user to interface with said function.
3. The method as recited in claim 2 further comprising: determining whether a writing has been drawn on said surface within a predetermined time period since said audibly rendering said first instructional message; and provided no writing has been drawn on said surface within said predetermined time period, repeating said first instructional message.
4. The method as recited in claim 2 further comprising: determining whether said user interface element has been properly drawn on said surface; and provided said user interface element has not been properly drawn on said surface, audibly rendering a second instructional message comprising a hint about a manner in which said user should properly draw said user interface element.
5. The method as recited in claim 4 further comprising, provided said user interface element has been properly drawn, audibly rendering a third instructional message.
6. The method as recited in claim 4 further comprising, provided said user interface element has been properly drawn, executing said function.
7. The method as recited in claim 1 further comprising, in response to a user interaction with said user-written selectable item, repeating said first instructional message.
8. The method as recited in claim 1 further comprising executing said function prior to audibly rendering said first instructional message.
9. The method as recited in claim 1 wherein said recognizing said user-written selectable item is performed in response to detecting a termination event indicating that said user-written selectable item is complete.
10. The method as recited in claim 1 wherein said recognizing said user-written selectable item comprises: optically recording positions of the tip of a pen computer; and performing image recognition of said positions to recognize said user-written selectable item.
11. A method for automatically executing a function, said method comprising: associating a function with a user-written selectable item; recognizing said user-written selectable item on a surface; and in response to said recognizing said user-written selectable item, immediately executing said function without first requiring further interaction between a user and said surface.
12. The method as recited in claim 11 further comprising audibly rendering an instructional message related to the operation of said function without first requiring further interaction between said user and said surface.
13. The method as recited in claim 12 wherein said audibly rendering said instructional message is performed in response to the lapsing of a predetermined time period without user interaction with said surface.
14. The method as recited in claim 11 further comprising, in response to a user interaction with said user-written selectable item, re-executing said function.
15. The method as recited in claim 11 wherein said recognizing said user-written selectable item is performed in response to detecting a termination event indicating that said user-written selectable item is complete.
16. The method as recited in claim 11 wherein said immediately executing said function without first requiring further interaction between a user and said surface comprises immediately executing said function without first requiring auxiliary writing by said user on said surface.
17. A computer-usable medium having computer-readable program code embodied therein for causing a computer system to perform a method for audibly instructing a user to interact with a function, said method comprising: associating a function with a user-written selectable item; recognizing said user-written selectable item on a surface; and in response to said recognizing said user-written selectable item, audibly rendering a first instructional message related to the operation of said function without requiring further interaction from said user, wherein said audibly rendering said first instructional message is performed by an audio output device of a pen computer.
18. The computer-usable medium as recited in claim 17 wherein said first instructional message directs said user to draw at least one user interface element that enables said user to interface with said function.
19. The computer-usable medium as recited in claim 18, wherein said method further comprises: determining whether a writing has been drawn on said surface within a predetermined time period since said audibly rendering said first instructional message; and provided no writing has been drawn on said surface within said predetermined time period, repeating said first instructional message.
20. The computer-usable medium as recited in claim 18, wherein said method further comprises: determining whether said user interface element has been properly drawn on said surface; and provided said user interface element has not been properly drawn on said surface, audibly rendering a second instructional message comprising a hint about a manner in which said user should properly draw said user interface element.
21. The computer-usable medium as recited in claim 20, wherein said method further comprises, provided said user interface element has been properly drawn, audibly rendering a third instructional message.
22. The computer-usable medium as recited in claim 20, wherein said method further comprises, provided said user interface element has been properly drawn, executing said function.
23. The computer-usable medium as recited in claim 17, wherein said method further comprises, in response to a user interaction with said user-written selectable item, repeating said first instructional message.
24. The computer-usable medium as recited in claim 17, wherein said method further comprises executing said function prior to audibly rendering said first instructional message.
25. The computer-usable medium as recited in claim 17 wherein said recognizing said user-written selectable item is performed in response to detecting a termination event indicating that said user-written selectable item is complete.
26. The computer-usable medium as recited in claim 17 wherein said recognizing said user-written selectable item comprises: optically recording positions of the tip of a pen computer; and performing image recognition of said positions to recognize said user-written selectable item.
27. A computer-usable medium having computer-readable program code embodied therein for causing a computer system to perform a method for automatically executing a function, said method comprising: associating a function with a user-written selectable item; recognizing said user-written selectable item on a surface; and in response to said recognizing said user-written selectable item, immediately executing said function without first requiring further interaction between a user and said surface; and in response to a user interaction with said user-written selectable item, executing said function.
28. The computer-usable medium as recited in claim 27, wherein said method further comprises audibly rendering an instructional message related to the operation of said function without first requiring further interaction between a user and said surface.
29. The computer-usable medium as recited in claim 28 wherein said audibly rendering said instructional message is performed in response to the lapsing of a predetermined time period without user interaction with said surface.
30. The computer-usable medium as recited in claim 27 wherein said recognizing said user-written selectable item is performed in response to detecting a termination event indicating that said user-written selectable item is complete.
31. The computer-usable medium as recited in claim 27 wherein said immediately executing said function without first requiring further interaction between a user and said surface comprises immediately executing said function without first requiring auxiliary writing by said user on said surface.
32. An interactive device comprising: a bus; an audio output device coupled to said bus; a writing element for allowing a user to write on a writable surface; an optical detector coupled to said bus for detecting positions of said writing element with respect to said writable surface; a processor coupled to said bus; and a memory unit coupled to said bus, said memory storing instructions that when executed cause said processor to implement a method for audibly instructing a user to interact with a function, said method comprising: associating a function with a user-written selectable item; recognizing said user-written selectable item on said writable surface; and in response to said recognizing said user-written selectable item, audibly rendering a first instructional message related to the operation of said function without requiring further interaction from said user, wherein said first instructional message directs said user to draw at least one user interface element that enables said user to interface with said function; determining whether said user interface element has been properly drawn on said surface; provided said user interface element has not been properly drawn on said surface, audibly rendering a second instructional message comprising a hint about a manner in which said user should properly draw said user interface element; and provided said user interface element has been properly drawn, executing said function.
33. The interactive device as recited in claim 32 wherein said method further comprises: determining whether a writing has been drawn on said surface within a predetermined time period since said audibly rendering said first instructional message; and provided no writing has been drawn on said surface within said predetermined time period, repeating said first instructional message.
34. The interactive device as recited in claim 32, wherein said method further comprises, provided said user interface element has been properly drawn, audibly rendering a third instructional message.
35. The interactive device as recited in claim 32, wherein said method further comprises, in response to a user interaction with said user-written selectable item, repeating said first instructional message.
36. The interactive device as recited in claim 32, wherein said method further comprises executing said function prior to audibly rendering said first instructional message.
37. The interactive device as recited in claim 32 wherein said recognizing said user-written selectable item is performed in response to detecting a termination event indicating that said user-written selectable item is complete.
38. The interactive device as recited in claim 32 wherein said recognizing said user-written selectable item comprises: optically recording positions of a tip of said interactive device on said writable surface using said optical detector; and performing image recognition of said positions to recognize said user-written selectable item.
39. The interactive device as recited in claim 32 wherein said audibly rendering said first instructional message is performed by said audio output device.
40. An interactive device comprising: a bus; an audio output device coupled to said bus; a writing element for allowing a user to create a user-written selectable item on a writable surface; an optical detector coupled to said bus for detecting information on said writable surface; a processor coupled to said bus; a memory unit coupled to said bus, said memory storing instructions that when executed cause said processor to implement a method for automatically executing a function, said method comprising: associating a function with a user-written selectable item; recognizing said user-written selectable item on a surface; and in response to said recognizing said user-written selectable item, immediately executing said function without first requiring further interaction between a user and said surface.
41. The interactive device as recited in claim 40, wherein said method further comprises audibly rendering an instructional message related to the operation of said function without first requiring further interaction between a user and said surface.
42. The interactive device as recited in claim 41 wherein said audibly rendering said instructional message is performed in response to the lapsing of a predetermined time period without user interaction with said surface.
43. The interactive device as recited in claim 40, wherein said method further comprises, in response to a user interaction with said user-written selectable item, executing said function.
44. The interactive device as recited in claim 40 wherein said recognizing said user-written selectable item is performed in response to detecting a termination event indicating that said user-written selectable item is complete.
45. The interactive device as recited in claim 40 wherein said immediately executing said function without first requiring further interaction between a user and said surface comprises immediately executing said function without first requiring auxiliary writing by said user on said surface.
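Claims 1-6 (and device claim 32) describe a verification loop: render a first instructional message, repeat it if nothing is drawn within a predetermined period, play a hint if the user interface element is drawn improperly, and execute the function once it is drawn properly. Below is a minimal sketch of that loop, reusing the hypothetical names from the sketch after the abstract and assuming polling helpers (poll_for_writing, is_properly_drawn) that are not part of the patent.

```python
# Hypothetical sketch of the instruction/verification loop in claims 1-6.
# Timing values and helper names are illustrative assumptions only.
import time


def run_instruction_loop(device, func, timeout_s: float = 10.0) -> None:
    """Render the first instructional message, then verify the drawn UI element."""
    device.audio_out.say(func.instruction)       # first instructional message (claim 1)
    deadline = time.monotonic() + timeout_s

    while True:
        writing = device.poll_for_writing()      # assumed helper: None until the user draws
        if writing is None:
            if time.monotonic() > deadline:
                device.audio_out.say(func.instruction)  # repeat if nothing drawn (claim 3)
                deadline = time.monotonic() + timeout_s
            time.sleep(0.05)
            continue

        if device.is_properly_drawn(writing, func):      # assumed helper
            device.audio_out.say(func.confirmation)      # third instructional message (claim 5)
            func.action()                                # execute the function (claim 6)
            return
        device.audio_out.say(func.hint)                  # second message: a hint (claim 4)
```

In practice, the recognition step that precedes this loop would itself be triggered by a termination event indicating that the user-written selectable item is complete, as recited in claims 9, 15, 25, 30, 37 and 44.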
PCT/US2005/045342 2005-01-12 2005-12-14 Interactive device and method WO2006076118A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA002535505A CA2535505A1 (en) 2005-11-01 2006-02-23 Computer system and method for audibly instructing a user

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US11/034,491 2005-01-12
US11/034,491 US7831933B2 (en) 2004-03-17 2005-01-12 Method and system for implementing a user interface for a device employing written graphical elements
US11/264,955 2005-11-01
US11/264,955 US7853193B2 (en) 2004-03-17 2005-11-01 Method and device for audibly instructing a user to interact with a function

Publications (3)

Publication Number Publication Date
WO2006076118A2 WO2006076118A2 (en) 2006-07-20
WO2006076118A3 WO2006076118A3 (en) 2007-01-04
WO2006076118B1 WO2006076118B1 (en) 2007-02-22

Family

ID=36678070

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/045342 WO2006076118A2 (en) 2005-01-12 2005-12-14 Interactive device and method

Country Status (1)

Country Link
WO (1) WO2006076118A2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4464118A (en) * 1980-06-19 1984-08-07 Texas Instruments Incorporated Didactic device to improve penmanship and drawing skills
US5730602A (en) * 1995-04-28 1998-03-24 Penmanship, Inc. Computerized method and apparatus for teaching handwriting
EP1479058A1 (en) * 2002-02-06 2004-11-24 Leapfrog Enterprises, Inc. Write on interactive apparatus and method

Also Published As

Publication number Publication date
WO2006076118A2 (en) 2006-07-20
WO2006076118A3 (en) 2007-01-04

Similar Documents

Publication Publication Date Title
KR100847851B1 (en) Device user interface through recognized text and bounded areas
KR100806241B1 (en) User interface for written graphical device
KR100806240B1 (en) System and method for identifying termination of data entry
US7853193B2 (en) Method and device for audibly instructing a user to interact with a function
JP2007128481A5 (en)
KR100814052B1 (en) A mehod and device for associating a user writing with a user-writable element
US8791900B2 (en) Computing device notes
KR20060082405A (en) Providing a user interface having interactive elements on a writable surface
KR20080104099A (en) Input apparatus and input method thereof
WO2008013761A2 (en) Associating a region on a surface with a sound or with another region
CN104657054A (en) Learning method and device based on point reading machine
JP3075882B2 (en) Document creation and editing device
CN202093691U (en) Chinese character learning device
US20140180698A1 (en) Information processing apparatus, information processing method and storage medium
CN109254717B (en) Handwriting guide method and device for tablet, tablet and storage medium
WO2006076118B1 (en) Interactive device and method
US9965966B2 (en) Instructions on a wearable device
WO2019062205A1 (en) Electronic tablet and control method therefor, and storage medium
JP2014127040A (en) Information processing device, information processing method, and program
JP2003216324A (en) Input system
EP1111341A3 (en) Command control device and navigation device
EP1681623A1 (en) Device user interface through recognized text and bounded areas
JP2022189711A5 (en)
JPH1011538A (en) Handwritten character recognizing device and its method
CA2535505A1 (en) Computer system and method for audibly instructing a user

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05854125

Country of ref document: EP

Kind code of ref document: A2