GB2315859A - Input device simulating touch screen

Info

Publication number
GB2315859A
Authority
GB
United Kingdom
Prior art keywords
image
input device
real
user
dimensional surface
Prior art date
Legal status
Granted
Application number
GB9715615A
Other versions
GB2315859B (en)
GB9715615D0 (en)
Inventor
Donald Smith
Pat Orchard
Gordon Freedman
Current Assignee
Microsemi Semiconductor ULC
Original Assignee
Mitel Corp
Priority date
Filing date
Publication date
Application filed by Mitel Corp filed Critical Mitel Corp
Publication of GB9715615D0
Publication of GB2315859A
Application granted
Publication of GB2315859B
Anticipated expiration
Expired - Fee Related (current status)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

An input device comprises a two-dimensional surface 12 capable of presenting predefined regions 13 and 14 to a user, a video camera 6 directed at said surface for creating real-time images thereof, a device using a Rybak method to analyse the real-time images to identify a human hand and thereby determine which predefined region is being pointed to by the user, and a device for outputting a signal dependent on the identified region pointed to by the user. The input device can act as an alternative pointing device for a computer, for example.

Description

INPUT DEVICE SIMULATING TOUCH SCREEN

This invention relates to an input device, and more particularly to an input device capable of simulating a touch screen.
Some means must be found for users to provide input to computers. The keyboard is the most obvious means, but is not always practical because it requires typing ability and is not well suited to situations where rapid input is required. The touch screen has been developed as one solution to this problem. This allows a user to touch a part of a screen to indicate a certain input. For example, when moving through menus, the user can touch certain portions of the screen to select desired menu choices.
The problem with touch screens is that they are expensive to produce and work only on cathode ray tubes.
There are situations where "touch screen" capability is desirable on static devices, for example posters and the like.
An object of the invention is to alleviate this problem.
According to the present invention there is provided an input device comprising a two-dimensional surface capable of presenting predefined regions to a user, a video camera directed at said two-dimensional surface for creating real-time images thereof, means for analyzing said real-time images to identify which pre-defined region is being pointed to by a user, and means for outputting a signal dependent on the identified region pointed to by the user.
The device preferably includes a pattern recognition device capable of recognising the form of a human hand, for example using a modified Rybak method.
The two-dimensional surface can be a computer screen or alternatively a static display, such as a poster or the like.
The invention will now be described in more detail, by way of example only, with reference to the accompanying drawings, in which:

Figure 1 is a block diagram of an input device in accordance with the invention;
Figure 2 is a diagrammatic view of one embodiment of the invention;
Figure 3 illustrates the process for determining the region pointed to;
Figure 4 is a block diagram of a system in accordance with the invention;
Figure 5 is a diagrammatic view of an alternative arrangement; and
Figure 6 is a block diagram of the arrangement shown in Figure 5.
Referring to Figure 1, a central processing unit 1, for example an Intel Pentium microprocessor, is connected to a bus 2, to which is attached a random access memory 3, an output 4 for connection to a device to be controlled, for example a computer (not shown), and a camera interface 5.
A still-frame video camera 6 is connected to the camera interface 5. The camera 6 is fixedly mounted so as to face the computer screen 7 and generate still-frame video images thereof in real time. The camera interface 5 digitizes the output of the camera 6 for connection to the bus 2. The images are stored in the RAM 3 as data.
In use, the video camera 6 continually generates a series of still-frame images of the computer screen 7.
The CPU 1, under instructions from a program also stored in RAM 3, continually analyzes the stored digital images to look for the shape of a human hand using known pattern recognition techniques, such as a modified Rybak method.
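The Rybak method itself is not reproduced here. As a loose stand-in only, simple template matching conveys the idea of scanning each stored frame for a hand-like shape; the template file name and threshold below are hypothetical, not details from the patent:

```python
import cv2

# Stand-in sketch only: the patent uses a modified Rybak method, which
# is not reproduced here. Template matching merely illustrates scanning
# a frame for a hand-like shape. "hand_template.png" is hypothetical.
hand_template = cv2.imread("hand_template.png", cv2.IMREAD_GRAYSCALE)

def find_hand(frame_gray, threshold=0.7):
    """Return the best-match location of the hand template in a
    grayscale frame, or None if the match score is below threshold."""
    scores = cv2.matchTemplate(frame_gray, hand_template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    return max_loc if max_val >= threshold else None
```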
When a human hand is detected, the CPU 1 then determines its terminal location relative to the borders of the image and matches this location with predefined areas on the image. The CPU 1 then instructs the output device 4 to generate a signal identifying which predefined area is being pointed to by the identified hand. This output signal can then be passed directly to a computer as X-Y co-ordinates, in the same way as a signal from a mouse or other conventional pointing device.
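The matching of a detected location against predefined areas might be sketched as follows; this is a minimal illustration, and the region table, names, and coordinates are assumptions rather than values from the patent:

```python
# Hypothetical sketch: map a detected fingertip location to one of the
# predefined regions of the image (all names and coordinates illustrative).

# Each region is (name, x0, y0, x1, y1) in image pixel coordinates.
REGIONS = [
    ("menu_item_1", 0, 0, 320, 240),
    ("menu_item_2", 320, 0, 640, 240),
]

def region_for_point(x, y):
    """Return the predefined region containing (x, y), or None."""
    for name, x0, y0, x1, y1 in REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

# The output device would report the region (or the raw X-Y
# co-ordinates) to the host, much like a mouse event.
print(region_for_point(100, 75))  # -> "menu_item_1"
```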
Figure 2 shows an alternative embodiment wherein the camera 6 is directed at a billboard 12 having two well-defined regions, one representing a graphic 13 and the other text 14. As shown in Figure 3, the camera 6 forms a clean image 15 and a current image 16, which includes the hand 17 of a user. The system subtracts the current image from the clean image to leave only the hand 17. The terminal point 18 of the difference image, here point (100, 75), is taken to be the point indicated, and the system is calibrated 19 to know that this relates to a particular predefined region, in this case the graphic portion 13 of the image.
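A minimal sketch of this subtraction-and-terminal-point step, assuming 8-bit grayscale frames of equal size and a hand entering from the bottom edge of the frame (both the threshold and the orientation are assumptions, not details given in the patent):

```python
import numpy as np

def difference_mask(clean, current, threshold=30):
    """Absolute difference between the clean reference frame and the
    current frame, thresholded so that only changed pixels (ideally
    the user's hand) survive."""
    diff = np.abs(current.astype(np.int16) - clean.astype(np.int16))
    return diff > threshold

def terminal_point(mask):
    """Take the extreme changed pixel as the pointing fingertip.
    Here the topmost changed pixel is used, assuming the hand enters
    the image from below; returns None if nothing changed."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    i = np.argmin(ys)  # smallest row index = topmost changed pixel
    return int(xs[i]), int(ys[i])
```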
Figure 4 illustrates a device for implementing this process. The output of the camera 6 is sent to a clean image buffer 21 and a current image buffer 22. The outputs of the buffers 21, 22 are subtracted in subtractor circuit 20 and passed to a device 12, which implements a modified Rybak method to recognize the shape of the human hand. The Rybak method, described in a paper entitled "BMV: Behavioural Model of Active Visual Perception and Invariant Image Recognition" by I.A. Rybak, A.V. Golovan, V.I. Gusakova, N.A. Shevtsova and L.N. Podladchikova, located on the World Wide Web at http://www.voicenet.com/~rybak/vnc.html, permits the recognition of certain shapes, such as the form of a pointing hand.
Figure 5 shows a system for monitoring a computer screen, which can be used as an input device for the computer.
The system could, for example, be used in public places such as airports where users are required to make menu choices. The camera 6, which is fixed in position, is directed onto a computer monitor 31 by means of a mirror 30.
First, a "snapshot" of the computer screen is taken through the operating system of the computer to create a computer image 32, referred to as the OS image. This is done automatically and can easily be achieved using the built-in capability of the operating system. For example, in the case of the Windows operating system, pressing the Print Scrn key causes a bitmap image of the screen to be passed to the Windows clipboard, and it is a simple matter to achieve this internally under program control.
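For illustration only, the same snapshot can be taken under program control with Pillow's ImageGrab module; this is a stand-in, not the Print Scrn/clipboard mechanism the patent describes:

```python
# Illustrative stand-in for the patent's Print Scrn technique:
# grab the current screen contents as a bitmap (the "OS image").
from PIL import ImageGrab

os_image = ImageGrab.grab()    # full-screen capture, like Print Scrn
os_image.save("os_image.png")  # kept as the clean calibration image
```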
Next, a reference image, which will be referred to as the camera image 33, is taken of the clean screen with the camera 6. A translation function 34 is then derived to translate the camera image to the OS image. This is referred to as the delta function Δ.
The same procedure is repeated in the presence of an object such as a hand 34: the delta function 35 is applied to the current image, forming what is called the TRANSLATED image 36. The TRANSLATED image is then subtracted from the current OS image 37 to form a STRIPPED image 38, which contains only the difference between the calibration image and the present image.
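One plausible reading of the delta function is a perspective mapping from camera coordinates to screen coordinates. The sketch below uses OpenCV with four assumed corner correspondences; the patent does not specify how the mapping is derived, so the calibration points, frame sizes, and threshold here are hypothetical:

```python
import cv2
import numpy as np

# Hypothetical calibration: the screen's four corners as seen by the
# camera, and the same corners in the OS image. The patent derives its
# delta function from the clean camera image and the OS image.
camera_corners = np.float32([[42, 31], [598, 25], [612, 455], [35, 462]])
os_corners = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])
delta, _ = cv2.findHomography(camera_corners, os_corners)

def stripped_image(camera_frame, os_image, threshold=30):
    """Warp the current BGR camera frame into OS-image coordinates
    (the TRANSLATED image), then subtract the OS image so that only
    the intruding object, e.g. a hand, remains (the STRIPPED image)."""
    h, w = os_image.shape[:2]
    translated = cv2.warpPerspective(camera_frame, delta, (w, h))
    diff = cv2.absdiff(translated, os_image)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, stripped = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return stripped
```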
This procedure makes feature recognition easier using the modified Rybak method referred to above. It can also be used in conjunction with the other examples referred to herein.
The modified Rybak method is used to determine the point 39 to which the hand is pointing, and this point is referred to as the HOT SPOT. The X, Y co-ordinates of the HOT SPOT are then determined and sent to the operating system as a mouse position 40. The current image 34 can be taken as often as necessary.
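As an illustration of this final step, forwarding the HOT SPOT to the operating system as a pointer position might be sketched with the pyautogui library, used here as an assumed stand-in for the patent's operating-system mouse interface:

```python
# Illustrative only: report the HOT SPOT co-ordinates as if they came
# from a mouse; pyautogui stands in for the OS mouse interface.
import pyautogui

def report_hot_spot(x, y):
    """Move the system pointer to the detected pointing location."""
    pyautogui.moveTo(x, y)
```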
Figure 6 shows an arrangement for implementing the above method. The snapshot of the screen created by the operating system 50 is stored in buffer 41 and that created by the camera 6 is stored in buffer 42. The delta function is created in block 43 and passed through block 44 to subtractor 45, where it is subtracted from the OS image to create the STRIPPED image, which is passed to the modified Rybak processing unit 46. Unit 47 derives the X, Y co-ordinates of the hot spot and passes these to the operating system of the computer as if they were a mouse input 48.
The invention thus permits a computer screen or other display device, including passive display devices, to be given a "touch screen" capability in a particularly economical manner.

Claims (8)

1. An input device characterized in that it comprises a two-dimensional surface capable of presenting predefined regions to a user, a video camera directed at said two-dimensional surface for creating real-time images thereof, means for analyzing said real-time images to identify which pre-defined region is being pointed to by a user, and means for outputting a signal dependent on the identified region pointed to by the user.
2. An input device as claimed in claim 1, characterized in that it further comprises pattern recognition means for recognizing the form of a human hand in said real-time image, said analyzing means generating said output signal according to the location of said hand form on said two-dimensional surface.
3. An input device as claimed in claim 2, characterized in that said recognizing means uses a modified version of the Rybak method of pattern recognition.
4. An input device as claimed in any one of claims 1 to 3, characterized in that it further comprises means for storing a clean image of said two-dimensional surface, and wherein said analyzing means subtracts a real-time image from said stored clean image to create a difference image which is analyzed to find the location of said hand form relative to said clean image.
5. An input device as claimed in any one of claims 1 to 4, characterized in that said two-dimensional surface is a computer screen and said analyzing means subtracts said real-time image from a snapshot image produced by an operating system thereof to produce a delta function which translates said current image to the operating system image, and said analyzing means compares a real-time image translated with said delta function with the operating system image to obtain a stripped image that said recognition means uses to identify the shape of a human hand.
6. An input device as claimed in any one of claims 1 to 5, characterized in that said output signal is operative to control the operation of a computer.
7. An input device as claimed in any one of claims 1 to 6, characterized in that said video camera is a still-frame video camera.
8. An input device substantially as herein described, with reference to the accompanying drawings.
GB9715615A 1996-07-29 1997-07-25 Input device simulating touch screen Expired - Fee Related GB2315859B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CA 2182238 CA2182238A1 (en) 1996-07-29 1996-07-29 Input device simulating touch screen

Publications (3)

Publication Number Publication Date
GB9715615D0 (en) 1997-10-01
GB2315859A (en) 1998-02-11
GB2315859B (en) 2000-10-18

Family

ID=4158676

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9715615A Expired - Fee Related GB2315859B (en) 1996-07-29 1997-07-25 Input device simulating touch screen

Country Status (2)

Country Link
CA (1) CA2182238A1 (en)
GB (1) GB2315859B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0477910A2 (en) * 1990-09-28 1992-04-01 Ezel Inc. Image processing based computer input apparatus
US5168531A (en) * 1991-06-27 1992-12-01 Digital Equipment Corporation Real-time recognition of pointing information from video
EP0571702A2 (en) * 1992-05-26 1993-12-01 Takenaka Corporation Hand pointing type input unit and wall computer module
WO1995034881A1 (en) * 1994-06-15 1995-12-21 Daniel Marcel Platzker Interactive projected video image display system
WO1996034332A1 (en) * 1995-04-28 1996-10-31 Matsushita Electric Industrial Co., Ltd. Interface device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0974948A1 (en) * 1998-07-20 2000-01-26 Nec Corporation Apparatus and method of assisting visually impaired persons to generate graphical data in a computer
US6140913A (en) * 1998-07-20 2000-10-31 Nec Corporation Apparatus and method of assisting visually impaired persons to generate graphical data in a computer
WO2002061583A2 (en) * 2001-01-31 2002-08-08 Hewlett-Packard Company A system and method for robust foreground and background image data separation for location of objects in front of a controllable display within a camera view
WO2002061583A3 (en) * 2001-01-31 2003-11-13 Hewlett Packard Co A system and method for robust foreground and background image data separation for location of objects in front of a controllable display within a camera view

Also Published As

Publication number Publication date
GB2315859B (en) 2000-10-18
CA2182238A1 (en) 1998-01-30
GB9715615D0 (en) 1997-10-01

Similar Documents

Publication Publication Date Title
US6154558A (en) Intention identification method
US9195345B2 (en) Position aware gestures with visual feedback as input method
US6898307B1 (en) Object identification method and system for an augmented-reality display
US7737956B2 (en) Electronic device and method providing a cursor control
EP1087327B1 (en) Interactive display presentation system
EP0554492B1 (en) Method and device for optical input of commands or data
US6292171B1 (en) Method and apparatus for calibrating a computer-generated projected image
US7830362B2 (en) Laser and digital camera computer pointer device system
KR960043807A (en) Coordinate input device and method, and information processing device
US20030226968A1 (en) Apparatus and method for inputting data
JP2004185007A (en) Method of controlling display device
WO1992009944A3 (en) User interface having simulated devices
KR19990011180A (en) How to select menu using image recognition
JP2001092995A (en) Extended reality display system and selective display method for processor generation image
KR20070036075A (en) Touch-down feed-forward in 3-d touch interaction
JPH08171647A (en) Acoustic output method for image information
US5583538A (en) Image display apparatus
Sandnes Lost in OCR-Translation: pixel-based text reflow to the rescue: magnification of archival raster image documents in the browser without horizontal scrolling
KR19990045918A (en) Method and apparatus for providing pointer implemented with image function
GB2315859A (en) Input device simulating touch screen
Kim et al. Eye mouse: mouse implementation using eye tracking
KR0171847B1 (en) Radio telemetry coordinate input method and device thereof
JP4728540B2 (en) Image projection device for meeting support
Sanghi et al. A fingertip detection and tracking system as a virtual mouse, a signature input device and an application selector
JPH07121152A (en) Pointing method by using mouse

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20120725