EP1449158A1 - Virtual input using a pen - Google Patents

Virtual input using a pen

Info

Publication number
EP1449158A1
Authority
EP
European Patent Office
Prior art keywords
pen
md
microdisplay
st
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP20020790331
Other languages
German (de)
French (fr)
Inventor
Fritz Seytter
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to EP20010128037 priority Critical patent/EP1315120A1/en
Priority to EP01128037 priority
Application filed by Siemens AG filed Critical Siemens AG
Priority to PCT/EP2002/012294 priority patent/WO2003046821A1/en
Priority to EP20020790331 priority patent/EP1449158A1/en
Publication of EP1449158A1 publication Critical patent/EP1449158A1/en
Application status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text

Abstract

The invention relates to a method for inputting information using a pen on a surface (SO). According to said method, the movement of a pen (ST) is represented in graphic form on a transparent display or transparent microdisplay (MD), and means are provided for spatially coupling the pen (ST) and the graphics on the display (MD).

Description

Virtual pen input

The invention relates to a method for pen input on a surface, in which the movement of the pen is displayed as a graphic on a display.

In miniaturized information-technology devices there is a need, despite the small, unobtrusive and light form factor, to have a large graphical screen available. In addition, the user wants to enter data in as natural a way as possible. Future devices could therefore be equipped with a see-through microdisplay on which graphical information is projected over the natural environment. Such microdisplays are available, for example, from "The MicroDisplay Corporation". A graphical pen input is difficult to implement on such a projected screen.

Technologies for pen input on arbitrary surfaces are known in which, on the one hand, active pens with built-in motion detection are used (see, for example, www.anoto.com), or in which, on the other hand, the position of a passive pen relative to a detection device is recognized by tracking techniques (see, for example, www.virtual-ink.com). With both active and passive pens, a data transmission takes place, for example to a notebook, on which the detected movement is represented graphically, for example as a line.

The invention has the object of specifying a method for pen input, in particular for a see-through microdisplay, which allows the user natural handling. This object is achieved by the features indicated in the patent claim.

An advantage of the method is that the input cannot be seen or intercepted by other persons.

In the following, the invention is described with reference to an embodiment illustrated in the drawings, in which:

Figure 1 shows the use of a microdisplay in connection with a headset, and

Figure 2 shows the pen-input method of the present invention.

The inventive method for pen input makes it possible to represent the evaluated movement of the pen on a see-through microdisplay in such a way that the visual impression is created as if the pen were actually writing. In fact, the pen moves on the respective pad or surface without leaving a trace there. The motion sequence is represented, as it were, by a so-called "virtual ink" flowing from the tip of the pen as from a real pen.
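The "virtual ink" behaviour described above can be sketched as a simple data structure: sampled pen-tip positions from the tracker are accumulated into strokes, which only the display renders, while the real surface stays unmarked. The names (`InkTrace`, `add_sample`) are illustrative assumptions, not taken from the patent.

```python
class InkTrace:
    """Accumulates sampled pen-tip positions into virtual-ink strokes.

    The real writing surface is never marked; only the display renders
    the stored points as connected lines.
    """

    def __init__(self):
        self.strokes = []        # list of finished strokes (polylines)
        self.current = None      # stroke currently being drawn

    def pen_down(self, x, y):
        # A new stroke starts where the pen touches the surface.
        self.current = [(x, y)]

    def add_sample(self, x, y):
        # Each tracked position extends the current stroke.
        if self.current is not None:
            self.current.append((x, y))

    def pen_up(self):
        # Lifting the pen finishes the stroke.
        if self.current is not None:
            self.strokes.append(self.current)
            self.current = None


# One stroke of three samples, as a tracker might deliver them.
trace = InkTrace()
trace.pen_down(0.0, 0.0)
trace.add_sample(1.0, 0.5)
trace.add_sample(2.0, 1.0)
trace.pen_up()
```

A renderer for the microdisplay would simply draw each stored polyline each frame, so the ink appears to flow from the pen tip.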

In Figure 1, a microdisplay MD is shown which is secured, for example, to a headset HS. If the microdisplay MD is configured as a transparent microdisplay, the user of the headset HS sees the image projected or displayed on the microdisplay MD overlaid on the real background in front of him. The user sees, for example, text appearing on the display MD projected above the natural background.

In Figure 2, a writing surface SO is shown on which a pen tracker STV detects the movement of a pen ST. The writing surface SO constitutes, so to speak, the detection region within which the motion sequence of the pen ST can be evaluated.

In the method of this invention, means are provided by which a spatially fixed coupling between the pen ST and the image it generates is achieved on the see-through microdisplay MD.

On the see-through display or see-through microdisplay MD, the graphic input made via the pen ST is shown. It should be noted that the image appears as a projection on the microdisplay MD, while this graphic is not visible on the real writing surface SO. Only the user, or a viewer who sees the whole scenario (that is, the projected image on the microdisplay MD together with the real pen ST), perceives, as it were, a writing pen that draws a virtual trail of ink.

The invention thus makes the detected movement of the pen visible on the microdisplay MD in such a way that the visual impression arises that the pen is writing, even though it moves over the surface without leaving a trace behind.

To carry out the invention, the microdisplay MD, which is coupled in its movement to the eye, is calibrated by the user, for example by tapping one or more points made visible on the display MD. This tapping naturally happens with the pen ST on the writing surface SO, that is, on the projections of those points onto the writing surface SO. A compensation of head movement then ensures that, for the user, who sees the writing surface SO through the microdisplay MD, the virtual writing or line drawing remains in the same place on the writing surface SO, so that he can continue to write or draw while the virtual ink always flows from the tip of the pen ST. On the display MD, the user sees a combination of the real pen and the virtual ink, which are spatially coupled. This creates the impression of a writing pen ST.
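The calibration step described above can be illustrated with a minimal sketch: the user taps a few reference points whose display coordinates are known, and from the surface/display point pairs an affine transform is solved that maps subsequent pen positions into display coordinates. Re-solving or updating the transform as the head moves would provide the compensation described. The three-point affine model and the function names are illustrative assumptions, not taken from the patent.

```python
def solve_affine(surface_pts, display_pts):
    """Solve the affine map x' = a*x + b*y + c, y' = d*x + e*y + f
    from three tapped calibration point pairs (surface -> display)."""
    (x1, y1), (x2, y2), (x3, y3) = surface_pts

    def det3(m):
        # Determinant of a 3x3 matrix by cofactor expansion.
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    A = [[x1, y1, 1], [x2, y2, 1], [x3, y3, 1]]
    dA = det3(A)
    params = []
    for k in range(2):               # k=0 solves (a, b, c); k=1 solves (d, e, f)
        rhs = [p[k] for p in display_pts]
        for col in range(3):         # Cramer's rule: swap in the rhs column
            M = [row[:] for row in A]
            for r in range(3):
                M[r][col] = rhs[r]
            params.append(det3(M) / dA)
    return params                    # [a, b, c, d, e, f]


def to_display(params, x, y):
    """Map a tracked pen position on the surface into display coordinates."""
    a, b, c, d, e, f = params
    return (a * x + b * y + c, d * x + e * y + f)


# Calibration: three taps whose display positions are known in advance.
params = solve_affine([(0, 0), (1, 0), (0, 1)],
                      [(10, 20), (12, 20), (10, 22)])
```

After calibration, every tracked pen sample is passed through `to_display` before being drawn, so the virtual ink stays registered to the writing surface.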

An outside observer can only track the movement of the pen; he can neither read the resulting graphics or writing nor see a drawing as it emerges.

Claims

claim
A method for pen input on a surface (SO),
- in which the movement of a pen (ST) is represented as a graphic on a see-through display or see-through microdisplay (MD), and
- in which means are provided whereby the pen (ST) and the graphics on the display (MD) are spatially coupled.
EP20020790331 2001-11-26 2002-11-04 Virtual input using a pen Withdrawn EP1449158A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP20010128037 EP1315120A1 (en) 2001-11-26 2001-11-26 Pen input system
EP01128037 2001-11-26
PCT/EP2002/012294 WO2003046821A1 (en) 2001-11-26 2002-11-04 Virtual input using a pen
EP20020790331 EP1449158A1 (en) 2001-11-26 2002-11-04 Virtual input using a pen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP20020790331 EP1449158A1 (en) 2001-11-26 2002-11-04 Virtual input using a pen

Publications (1)

Publication Number Publication Date
EP1449158A1 true EP1449158A1 (en) 2004-08-25

Family

ID=8179350

Family Applications (2)

Application Number Title Priority Date Filing Date
EP20010128037 Withdrawn EP1315120A1 (en) 2001-11-26 2001-11-26 Pen input system
EP20020790331 Withdrawn EP1449158A1 (en) 2001-11-26 2002-11-04 Virtual input using a pen

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP20010128037 Withdrawn EP1315120A1 (en) 2001-11-26 2001-11-26 Pen input system

Country Status (3)

Country Link
US (1) US20050103536A1 (en)
EP (2) EP1315120A1 (en)
WO (1) WO2003046821A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090098248A1 (en) * 2005-05-31 2009-04-16 Lex De Boer Novel Process for Enzymatic Acrylamide Reduction in Food Products
US8386963B2 (en) * 2009-05-28 2013-02-26 Microsoft Corporation Virtual inking using gesture recognition
KR20140055173A (en) 2012-10-30 2014-05-09 삼성전자주식회사 Input apparatus and input controlling method thereof
JP6233314B2 (en) * 2012-11-09 2017-11-22 ソニー株式会社 Information processing apparatus, information processing method, and computer-readable recording medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4344050C1 (en) * 1993-12-23 1995-03-09 Henry Rudolph Entry method and entry apparatus for computer terminals for the purpose of protecting against covert observation and for ensuring the privacy of a user
JPH086708A (en) * 1994-04-22 1996-01-12 Canon Inc Display device
US5696521A (en) * 1994-06-22 1997-12-09 Astounding Technologies (M) Sdn. Bhd. Video headset
TW394879B (en) * 1996-02-09 2000-06-21 Sega Enterprises Kk Graphics processing system and its data input device
US6128007A (en) * 1996-07-29 2000-10-03 Motorola, Inc. Method and apparatus for multi-mode handwritten input and hand directed control of a computing device
US6710770B2 (en) * 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US6771294B1 (en) * 1999-12-29 2004-08-03 Petri Pulli User interface
WO2001056007A1 (en) * 2000-01-28 2001-08-02 Intersense, Inc. Self-referenced tracking
EP1128318A3 (en) * 2000-02-21 2002-01-23 Cyberboard A/S Position detection device
WO2002048642A2 (en) * 2000-11-19 2002-06-20 Canesta, Inc. Method for enhancing performance in a system utilizing an array of sensors that sense at least two-dimensions
US20020118181A1 (en) * 2000-11-29 2002-08-29 Oral Sekendur Absolute optical position determination

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO03046821A1 *

Also Published As

Publication number Publication date
WO2003046821A1 (en) 2003-06-05
US20050103536A1 (en) 2005-05-19
EP1315120A1 (en) 2003-05-28

Similar Documents

Publication Publication Date Title
Mistry et al. SixthSense: a wearable gestural interface
Mackay Augmented reality: linking real and virtual worlds: a new paradigm for interacting with computers
Bowman et al. 3d user interfaces: New directions and perspectives
JP6116064B2 (en) Gesture reference control system for vehicle interface
US9927881B2 (en) Hand tracker for device with display
Burdea et al. Virtual reality technology
Henderson et al. Evaluating the benefits of augmented reality for task localization in maintenance of an armored personnel carrier turret
US6771294B1 (en) User interface
Kray et al. User-defined gestures for connecting mobile phones, public displays, and tabletops
US9658765B2 (en) Image magnification system for computer interface
EP1821182B1 (en) 3d pointing method, 3d display control method, 3d pointing device, 3d display control device, 3d pointing program, and 3d display control program
US20180292648A1 (en) External user interface for head worn computing
EP2946264B1 (en) Virtual interaction with image projection
Poupyrev et al. Developing a generic augmented-reality interface
ES2535364T3 (en) Eye control of computer equipment
Billinghurst et al. Collaborative mixed reality
US20100201623A1 (en) Method and system for displaying information
US20060181519A1 (en) Method and system for manipulating graphical objects displayed on a touch-sensitive display surface using displaced pop-ups
Mistry et al. WUW-wear Ur world: a wearable gestural interface
Anthes et al. State of the art of virtual reality technology
JP4294668B2 (en) Point diagram display device
US20160025977A1 (en) External user interface for head worn computing
JP2011527478A (en) Multi-touch touch screen with pen tracking
US9165381B2 (en) Augmented books in a mixed reality environment
US6898307B1 (en) Object identification method and system for an augmented-reality display

Legal Events

Date Code Title Description
AK Designated contracting states:

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR

17P Request for examination filed

Effective date: 20040301

17Q First examination report

Effective date: 20050510

18D Deemed to be withdrawn

Effective date: 20061003