WO2000058933A1 - Method and apparatus for visual pointing and computer control - Google Patents
- Publication number
- WO2000058933A1 (PCT/US2000/007118)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- pointing
- computer
- light
- information
- Prior art date
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
Definitions
- the present invention relates generally to the fields of processing images, pointing devices, computer-control devices, and remote presentation applications.
- Computer Display Image: The display presented by a computer for human viewing on a device such as a monitor or Surface, employing technologies such as VGA.
- Image Sensor An optical device such as a video camera, digital camera or other imaging technology capable of viewing a Surface.
- Warping A transformation performed on an image based on a mapping between two geometrical coordinate spaces; in the present invention Viewed Images (or portions thereof) are transformed in this manner to a predefined display coordinate space and projected/displayed images (or portions thereof) are transformed to the coordinate space of the Viewed Images (using both Warping and optional "scaling" to overcome differences in pixel-resolution); the geometric mapping is obtained through a process of calibration.
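The geometric mapping described in the Warping definition above can be sketched as a planar homography between the two coordinate spaces. The matrix values below are hypothetical stand-ins for what the calibration process would actually produce:

```python
# Sketch of the Warping step: mapping a point from the camera's Viewed
# Image coordinates into display coordinates using a 3x3 planar
# homography. The example matrices are assumptions for illustration;
# in practice they would be obtained through calibration.

def warp_point(h, x, y):
    """Apply a 3x3 homography h (row-major nested lists) to (x, y)."""
    xw = h[0][0] * x + h[0][1] * y + h[0][2]
    yw = h[1][0] * x + h[1][1] * y + h[1][2]
    w  = h[2][0] * x + h[2][1] * y + h[2][2]
    return xw / w, yw / w

# Identity homography leaves points unchanged.
IDENTITY = [[1.0, 0.0, 0.0],
            [0.0, 1.0, 0.0],
            [0.0, 0.0, 1.0]]

# A homography that scales by 2 and translates by (10, 20) -- the kind
# of mapping that "scaling" between differing pixel resolutions plus a
# shift of origin might produce.
SCALE_SHIFT = [[2.0, 0.0, 10.0],
               [0.0, 2.0, 20.0],
               [0.0, 0.0,  1.0]]
```

A full calibration would also account for perspective distortion (non-zero entries in the bottom row), which this affine example omits.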
- the present invention is an apparatus and method for controlling the display, transfer and processing of information in computer systems by means of pointing.
- a user of the invention points at a surface such as a wall or whiteboard, which contains projected or written information in order to activate predetermined operations, related to that information. These operations may include, for example, those available with a commonplace computer-mouse device as well as many others.
- the typical configuration of components for the invention is depicted in Figure 1.
- the invention employs an image-sensing device [12] such as a video camera to view a surface [11], which may be used simultaneously for projecting the computer display.
- the user introduces visual input into the system by pointing, typically with a pointing device [14] such as a laser-pointer at the desired spot within the viewed area of the surface.
- a stream of digital images from the image-sensor [12] is analyzed continuously; the appearance of the visual input is detected and its precise location on the viewed surface is determined.
- the appearance (or motion or disappearance) of the visual input along with its location are interpreted in order to perform the operation that was intended by the user.
- the system may also determine the direction from which the user pointed. Directional information may be of use in some applications as explained below.
- FIG. 1 A remote pointing device
- FIG. 2 An example of this is shown in Figure 2.
- Site A is configured as shown in Figure 1, including means of network communication [16] (to a local network and/or Internet, for example).
- An instructor at site A uses a surface, which contains written and/or projected information.
- Sites B and C show typical "student" configurations containing an ordinary computer system equipped with communication (e.g. modem) and display (e.g. VGA) capabilities.
- the written and/or projected information at site A may be transferred to the remote participants and displayed at sites B and C using other technologies such as the Tegrity WebLearner.
- the invention causes the display of a "cursor" image, such as an arrow, at the appropriate spot in the display at sites B and C.
- a cursor such as an arrow
- the displayed cursor moves accordingly and when the instructor stops pointing the cursor disappears.
- the direction of pointing is determined. This enables displaying different cursors.
- the arrow shown at sites B and C may be displayed at an angle that approximately corresponds to the angle at which the instructor is holding the pointing device.
- Multi-modal interaction with information In conjunction with other technologies such as speech-recognition, the invention may be used in order to manipulate information objects. For example, the user may point her finger at a displayed icon that represents a data-file while saying "open this," and the system would react by opening and displaying the file.
- the current invention provides the ability to identify the object to be acted upon and the impetus to do so, while another technology determines which specific action should be performed.
- This apparatus emulates computer-mouse operation using a hand-held light-generating device.
- the invention was designed for use with liquid crystal display (LCD) panels and some specific models of LCD projectors manufactured by Proxima Corp. It is a hardware solution that integrates various interconnected components, including a CCD image sensor, timing generator, analog comparator, digital-to-analog converter, a preprogrammed microprocessor, cabling and other components into a single unit that outputs mouse control commands to an attached computer.
- LCD liquid crystal display
- the best mode for carrying out the Proxima invention uses a signal-processing unit that comprises an analog to digital (A/D) comparator network, a synchronizing network and a position detector.
- the A/D comparator network comprises a conventional video amplifier (emitter), analog comparator LM319A and digital-to-analog converter AD557.
- the synchronizing network is built from a Sanyo LC9904A timing generator, a Sanyo LB8904 high voltage drive unit and a conventional crystal resonator.
- the position detector comprises an Intel 8051 microprocessor, several logic devices including counters, inverters, and latches (74HC393, 74HC374, 74HC14) and connectors and cabling interfaces to a computer.
- the signal processing unit produces signals that control the image sensing operation of a Sanyo LC9943 charge coupled device (CCD) (see column 4, line 65 through column 5, line 9).
- CCD charge coupled device
- as the CCD scans the scene, it passes each pixel's intensity value to the signal-processing unit for analysis.
- the latter determines on a pixel-by-pixel basis whether the pixel can be associated with a viewed light spot produced by the light-generating device (preferably a "Pocket Laser" sold by Navitar, see column 4, line 31). This determination is made solely on the basis of the pixel's intensity relative to a predetermined threshold value.
- the current invention uses off-the-shelf video camera and frame-grabber technology (required for some cameras) to reproduce full-frame digital video images inside the computer's RAM (random access memory).
- the innovation of the invention is in the software that performs the image-processing analysis. Rather than operating on a sequential scan of pixels one at a time, the invention filters the entire digital image, or selected regions of the image, at once by directly accessing any desired subset of pixel values from the memory buffer that stores all pertinent image information (including pixel intensities and color) of a "snapshot," or frame, of the visual scene. See Figure 4 for an example of such an image region (color information is not shown). Some information is also retained over time.
- the software may also utilize additional, related information available from within the computer system. For example, the pixel values of the currently projected Computer Display Image are easily obtained for analysis and comparison to the sensed values of the Viewed Image.
- the current invention uses much more information (spatial, temporal and color) that are filtered by a large number of processing methods in order to determine what event, if any, is taking place.
- although the prior art provides a subset of the functionality described herein (specifically mouse emulation), it does so with significant restrictions. Using less image information (i.e., intensity values only and/or one pixel at a time) implies, in general, reduced reliability. It is easy to produce scenarios in which the inventions described in the prior art will either not respond to the light stimulus as expected or will induce false triggering by generating events that were not intended by the user. In particular, the prior art can be expected to be reliable (with a small rate of errors) only when used in certain environments and configurations. For example, the Proxima inventions work with LCD panels, preferably model A480SC from Computer Associates Corp.
- the Cyclops product works with several LCD projector models manufactured by Proxima (with built-in CCD and signal processing hardware) such as the 2710 model. These projection devices produce a beam of light that has relatively weak intensity. LCD panels, a now outdated technology, work in conjunction with overhead projectors, which are also typically very weak. Because they are so weak, when employing such devices users typically dim the room lighting or turn it off completely so that details may be discerned in the projected images. Current projector technology produces beams of light that are many times brighter, which allows projectors to be used comfortably in well-lit rooms.
- patent 5,504,501 discloses use of a "band pass filter disposed over the lens of the device" (column 4, line 30). This filter "limits the range of wavelengths of light permitted to be sensed by the device.” In effect, the filter reduces the chances that other light sources will confuse the device since it is tuned to the (typically red) light of the light-generating device in use.
- the disclosed inventions and the workaround solutions they employ reduce the possibility of using the components of the invention for other purposes.
- the image sensor of these inventions cannot be utilized for other applications.
- the image sensor of the current invention may be any conventional video conferencing camera and may be used as such while simultaneously serving this invention.
- a case in point is the way Tegrity's product uses the same image sensor to drive various functions of the product, including image capture, recognition of human touching of "virtual buttons," and other functions.
- Modern projector technology creates challenges that cannot be met by the prior art, but rather require more extensive use of the available information to produce consistent results that can be reliably repeated in a wide variety of useful configurations.
- a goal of the invention is to be flexible and inexpensive by utilizing commonly available, "off-the-shelf" hardware components as much as possible. These include:
- An Image Sensor such as a video-conferencing type video camera aimed at the Surface.
- the Tegrity system is sold with an NCK41CV manufactured by Howard Enterprises Inc. of Camarillo, CA (a 450 TV-line, NTSC video camera).
- a frame-grabber card may be required to convert the video information to a digital image inside the computer.
- the Tegrity system uses video capture card C210A from Tekram Technology, Taipei, Taiwan. Conventional cabling connects the image sensor to the computer.
- a computer [13]. This is a conventional personal computer, such as an Intel Pentium series computer.
- a pointing device [14].
- the preferred embodiment of the invention employs a standard laser-pointer modified for the purposes of the invention as shown in Figure 3.
- the "Ultra Infiniter" by Quarton Inc. is one of many available models that may be used.
- the modification entails adding a small, black foam ball (30 millimeters in diameter) near the tip of the laser-pointer.
- the ball is useful in that it facilitates more accurate recognition of the pointer and provides a basis for determining the direction of pointing.
- Alternative embodiments of the invention may employ other types of pointing devices with or without light-emitting capabilities.
- a projector [15] to project the Computer Display Image onto the Surface is not essential to the operation of the invention; however, the invention will typically be more useful when one is employed. Any conventional LCD, DLP or other type of projector may be used.
- Communications interface [16] connecting the computer to some communications infrastructure. This component is required only when the invention is employed for remote presentations or similar uses. This may comprise, for example, a local area network controller or a modem interface. Tegrity Inc. will market the laser-pointer device and computer-software that implement the preferred embodiment of the invention under the name InterPointer (see the product brochure).
- the spot of light generated by the laser-pen is detected as a "pointing event."
- movement of the light to another location and the turning off of the light are each detected as a unique event.
- rapid sequencing of the light on and off multiple times can be detected as yet another set of events.
- the light spot simulates the left button of a standard computer mouse. Appearance of the spot simulates pressing the left mouse button, moving it simulates moving the mouse, and disappearance of the spot simulates releasing the button. This enables emulating the common mouse commands "click" and "drag-and-drop" that act upon the location at which the spot appeared. When the light is turned on and off twice in succession, the "double-click" command can be emulated.
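The mapping from light events to emulated mouse actions described above can be sketched as follows. The event names and the 0.5-second double-click window are illustrative assumptions, not values taken from the patent:

```python
# Hedged sketch of the event mapping: appearance of the spot acts as a
# left-button press, movement as mouse motion, disappearance as a
# release, and two on/off cycles in quick succession as a double-click.

DOUBLE_CLICK_WINDOW = 0.5  # seconds; assumed value

def map_light_events(events):
    """events: list of (time, kind) tuples with kind in
    {'on', 'move', 'off'}. Returns the emulated mouse actions."""
    actions = []
    last_click_time = None
    for t, kind in events:
        if kind == 'on':
            actions.append('press')
        elif kind == 'move':
            actions.append('move')
        elif kind == 'off':
            actions.append('release')
            # Two complete on/off cycles inside the window fold into a
            # double-click, mirroring the text above.
            if last_click_time is not None and t - last_click_time <= DOUBLE_CLICK_WINDOW:
                actions.append('double_click')
                last_click_time = None
            else:
                last_click_time = t
    return actions
```

In a real system the emitted actions would be forwarded to the operating system's mouse-event interface rather than collected in a list.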
- the system obtains images from the Image Sensor [12] at a rate of approximately 30 images per second.
- the large amount of information supplied by the Image Sensor (typically over 18 million bytes per second) and the need to consume a minimal amount of processing resources dictate a strategy of analysis that is not monolithic. Rather, the analysis is broken into several phases such that each phase performs more intensive processing on less information than the prior phase. In general, these phases include:
- Each processing phase comprises one or more functional modules as described below.
- An additional module, the "Hot-Spot Tracker," is executed at a reduced frequency (twice per second).
- when the system determines that a pointing event has occurred at a specific position in the Viewed Image, it sends a "system event" or activates a predetermined system function using techniques that are specific to the computer operating system in use (for example, the "mouse_event()" function in Microsoft Windows).
- the display coordinates of the position at which the event should be activated are also provided. These coordinates are obtained by Warping the detected position from the coordinate space of the Viewed Image to the display coordinate space, as described above.
- the system operates as a "finite state machine.” This means that at any given time the system will be in one of several “states.” Specifically, the states currently used are “off,” “suspected,” and “on.” Each state defines what analysis will be performed on the next image as well as the possible states that the system may enter as a result of this analysis. Specifically, the Preliminary Screening phase of processing is considered necessary only when the system is "off,” i.e. when no pointing activity has been taking place.
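The three-state machine described above can be sketched as a transition table. The observation names and the exact transition rules are assumptions consistent with the prose (a candidate detection moves "off" to "suspected," verification moves "suspected" to "on," and loss of the spot returns to "off"):

```python
# Minimal sketch of the pointing-detection finite state machine with
# the states named in the text: "off", "suspected", and "on".
# Observation labels are illustrative assumptions.

TRANSITIONS = {
    ('off', 'candidate_found'): 'suspected',   # preliminary screening hit
    ('off', 'nothing'): 'off',
    ('suspected', 'verified'): 'on',           # verification passed
    ('suspected', 'rejected'): 'off',          # verification failed
    ('on', 'spot_present'): 'on',              # tracking continues
    ('on', 'spot_lost'): 'off',                # release event
}

def step(state, observation):
    """Advance the state machine by one image frame; unknown
    (state, observation) pairs leave the state unchanged."""
    return TRANSITIONS.get((state, observation), state)
```

As the text notes, the current state also determines which analysis runs on the next frame; for instance, Preliminary Screening is only needed while the machine is in the "off" state.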
- the system must detect the activation of the pointing device in "real-time," i.e. fast enough to follow typical human activity. In the preferred embodiment of the invention this involves detecting the spot of light produced by a laser-pointer device and determining whether it has been turned on and off once or even twice. The latter case would typically be useful for supporting the simulation of mouse "double-click" events. This requires the system to process a large amount of information very quickly.
- Figure 4(a) shows a pixel image (without color) of a region in which the laser device [41] is activated.
- the foam ball used by this embodiment of the invention can be seen [42] as can the hand of the user [43].
- the spot of light that the laser device produces is readily visible [44].
- Preliminary screening works in the following manner.
- the Viewed Image is scanned for pixels that have an intensity value above a near-white threshold. Scanning all pixels is time-consuming; therefore the process skips over most pixels, sampling only 1 out of every 16. This may be done without missing light spots because the minimal size of the spot is 4x4 pixels for all practical applications (using a 640x480-pixel Viewed Image).
- Figure 4(b) and 4(c) demonstrate how this process operates on the image region of Figure 4(a).
- each black dot represents a pixel coordinate position at which the screening process will sample the value of the corresponding pixel from the image of Figure 4(a).
- This selective sampling produces the image of Figure 4(c).
- This image contains 1/16th of the information contained in the image of Figure 4(a); however, the spot of laser light is still readily apparent at [46].
- the process of preliminary screening ignores certain image-pixels and does not add them to suspected regions even if they exceed the intensity threshold.
- the set of ignored pixels is determined by the Hot-Spot Tracking module as described below. If no suspected regions are found, processing is complete for this cycle and the system may change its state accordingly.
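The sparse scan described above, including the hot-spot ignore mask, can be sketched as follows. The near-white threshold value is an assumption for an 8-bit intensity image:

```python
# Sketch of the preliminary screening scan: sample one pixel out of
# every 16 (every 4th row and every 4th column) and flag samples above
# a near-white intensity threshold. With a minimum spot size of 4x4
# pixels, this grid cannot step over a spot entirely.

NEAR_WHITE = 240  # assumed 8-bit threshold

def screen(image, ignored=frozenset()):
    """image: 2D list of intensities. Returns (row, col) positions of
    suspect samples, skipping coordinates in the hot-spot 'ignored'
    mask supplied by the Hot-Spot Tracking module."""
    suspects = []
    for r in range(0, len(image), 4):
        for c in range(0, len(image[r]), 4):
            if (r, c) in ignored:
                continue
            if image[r][c] >= NEAR_WHITE:
                suspects.append((r, c))
    return suspects
```

If `screen` returns an empty list, processing for the cycle is complete and the state machine can remain in (or return to) the "off" state.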
- This module recognizes the pattern of a spot of laser-light hitting the Surface. It can detect light from any common laser-pointer (635 nm to 650 nm wavelengths). Surprisingly, a laser dot as captured by a typical video camera is not red; due to the high intensity of the laser it appears as a white spot.
- This module operates on predefined regions of interest as follows: it thresholds the image with a high threshold, keeping only pixels of high intensity. The binarized image is then segmented into connected blobs. The algorithm then looks for a blob that best matches the pattern of an expected spot of light in its dimensions and measure of roundness. These steps use techniques that are well known in the art. The end result is that either no blobs are determined to match the expected characteristics of a light spot, or a single blob is selected. In the former case processing is complete with negative results (no activation) and the system may change state accordingly.
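The threshold-segment-match sequence described above can be sketched with a flood-fill segmentation. The thresholds and the roundness measure (fill ratio of the bounding box) are illustrative choices, not values taken from the patent:

```python
# Sketch of the light-spot matcher: threshold, segment bright pixels
# into connected blobs, and accept the first blob whose size and
# roundness fit an expected spot.

from collections import deque

def find_spot(image, thresh=240, min_size=4, max_size=100, min_round=0.6):
    """image: 2D list of intensities. Returns the pixel list of a
    matching blob, or None if no blob fits."""
    rows, cols = len(image), len(image[0])
    seen = set()
    for r0 in range(rows):
        for c0 in range(cols):
            if image[r0][c0] < thresh or (r0, c0) in seen:
                continue
            # Flood-fill one 4-connected blob of bright pixels.
            blob, queue = [], deque([(r0, c0)])
            seen.add((r0, c0))
            while queue:
                r, c = queue.popleft()
                blob.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and (nr, nc) not in seen
                            and image[nr][nc] >= thresh):
                        seen.add((nr, nc))
                        queue.append((nr, nc))
            # Roundness proxy: how fully the blob fills its bounding box.
            rs = [p[0] for p in blob]
            cs = [p[1] for p in blob]
            box = (max(rs) - min(rs) + 1) * (max(cs) - min(cs) + 1)
            if min_size <= len(blob) <= max_size and len(blob) / box >= min_round:
                return blob
    return None
```

A production implementation would score all candidate blobs and keep the best match rather than the first acceptable one.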
- Verification is necessary to prevent false triggering caused by reflective objects such as watches and jewelry, or by saturation of small regions of the Viewed Image due to the limits of the Image Sensor's dynamic range. The latter is commonly seen when presenting the Image Sensor with a display containing high contrast (typically with the Automatic Gain Compensation settings used by most video cameras).
- a major drawback of similar devices introduced in the past was the lack of such verification procedures; they relied solely on an intensity threshold to conclude whether or not pointing was activated.
- Fiducial-based Verification This method is used to recognize an a priori known figure or pattern, called a "fiducial."
- the fiducial is the black foam ball attached to the laser-pointer device as shown in Figure 3.
- the ball is detected as follows. A small region of interest in the Viewed Image (about 60x60 pixels) surrounding the detected spot is analyzed. In this region the algorithm looks for the ball using a simple "correlation" with an idealized ball. If the correlation indeed finds a ball, the image is binarized by its intensity, using a threshold that is determined during the correlation check (utilizing the knowledge that the ball is black). The "blob" that defines the ball's pixels is checked to be round and of fitting dimensions and proportions. Once the system locates the fiducial the direction in which the user is pointing is easily determined.
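A heavily simplified sketch of the fiducial check follows. The mean-darkness test stands in for the idealized-ball correlation the text describes, and the threshold and size bounds are assumed values:

```python
# Hedged sketch of fiducial verification: inside a small region of
# interest around the detected spot, confirm that a dark area of
# plausible ball size is present. A real implementation would, as the
# text describes, correlate against an idealized ball template and
# check the resulting blob for roundness and proportions.

def verify_fiducial(region, dark_thresh=60, min_frac=0.05, max_frac=0.5):
    """region: 2D list of intensities around the spot (e.g. ~60x60
    pixels). Returns True if a ball-sized dark fraction is present."""
    total = sum(len(row) for row in region)
    dark = sum(1 for row in region for v in row if v <= dark_thresh)
    frac = dark / total
    # Too few dark pixels: no ball; too many: probably shadow/background.
    return min_frac <= frac <= max_frac
```

Once the ball is located, its position relative to the light spot gives the direction of pointing, as the text notes.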
- Skin-based Verification This method is used to verify that human skin is apparent in the Viewed Image in close proximity to the spot at which light was detected. This is based on the assumption that when activating the pointer the user's hand is near the light spot.
- the verification procedure is based on a "skin detector” function, which detects pixels that have a high likelihood of corresponding to human skin.
- the algorithm uses a statistical model of human-skin color and is based on current research techniques in this field (see “Visual Tracking for Multimodal Human Computer Interaction” - Jie Yang, Rainer Stiefelhagen, Uwe Meier, Alex Waibel at CHI98).
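The per-pixel skin test can be sketched as below. Detectors of the kind cited (Yang et al., CHI '98) fit a statistical model in a normalized color space; the fixed rule here is a crude stand-in that captures the idea of classifying each pixel by color alone:

```python
# Hedged sketch of skin-based verification. The RGB rule is a classic
# rule of thumb, not the patent's statistical model.

def is_skin(r, g, b):
    """Rule-of-thumb RGB skin test: skin is reddish with a moderate
    spread between channels."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_near(pixels, coords, radius=10):
    """True if any pixel within `radius` (Chebyshev distance) of
    `coords` looks like skin. pixels: dict (row, col) -> (r, g, b)."""
    cr, cc = coords
    return any(is_skin(*rgb)
               for (pr, pc), rgb in pixels.items()
               if abs(pr - cr) <= radius and abs(pc - cc) <= radius)
```

A positive result supports the assumption that the user's hand was near the spot when the pointer was activated.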
- Hot-spots are regions of the image that appear saturated, i.e. where pixels are assigned the highest possible intensity values (white) due to the relative contrast of that area with other areas of the viewed Surface. For example, the reflection of a projector's beam, which may be blinding to a viewer, will typically produce this effect in a digital image. Similarly, in some cases projecting a highly contrasting image (such as one with a dark background and a small amount of bright pixels) may produce hot-spots. In these areas differences of intensity cannot be distinguished, therefore the light-spot of a laser will not be detected within them. The main problem is that hot-spots may sometimes appear similar in characteristics (size, shape, intensity) to typical light spots, thus generating false triggering of this invention.
- the Hot-Spot Tracking module solves this problem by continuously tracking and "learning" which pixels belong to hot-spots. These pixels are remembered by saving a "mask” which represents locations to be ignored by the Preliminary Screening phase described above.
- the idea is to compute and maintain a running-average image of the last few seconds and threshold it to produce a binary mask. Pixels that are steadily saturated will quickly be marked in this mask and will be ignored in subsequent Preliminary Screening.
- This module is activated at a low frequency - approximately twice per second. This rate (along with the averaging coefficient used for computing the running average) is quick enough to prevent false triggering by hot-spots yet slow enough to have a negligible impact on performance.
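The running-average mask described above can be sketched as an exponential average that is thresholded into a set of ignored positions. The averaging coefficient and saturation threshold are assumed values; the text only specifies the idea and the roughly 2 Hz update rate:

```python
# Sketch of the Hot-Spot Tracker: maintain an exponential running
# average of recent frames and threshold it into a mask of pixels that
# are persistently saturated.

ALPHA = 0.3        # averaging coefficient (assumed)
SATURATED = 250    # near-maximum 8-bit intensity (assumed)

def update_average(avg, frame, alpha=ALPHA):
    """avg, frame: 2D lists of intensities. Returns the new running
    average (only briefly-bright pixels stay below the threshold)."""
    return [[(1 - alpha) * a + alpha * f for a, f in zip(ar, fr)]
            for ar, fr in zip(avg, frame)]

def hot_spot_mask(avg, thresh=SATURATED):
    """Set of (row, col) positions to be ignored by screening."""
    return {(r, c)
            for r, row in enumerate(avg)
            for c, v in enumerate(row)
            if v >= thresh}
```

Because the average adapts over a few seconds, a transient laser spot never accumulates enough weight to enter the mask, while a steady projector hot-spot does.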
- the Tegrity InterPointer can be used in place of a mouse, to draw elements or make menu selections directly on the white board, and as a pointing device that can be broadcast during WebLearner distance learning.
- the InterPointer Add-on software must be installed, and a serial number entered. This is done through the installation program on the Tegrity Digital Flipchart CD. See your Tegrity Digital Flipchart User's Guide, chapter 4, for installation instructions.
- a checkmark displays beside the option in the panel.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU38949/00A AU3894900A (en) | 1999-03-17 | 2000-03-17 | Method and apparatus for visual pointing and computer control |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12472899P | 1999-03-17 | 1999-03-17 | |
US60/124,728 | 1999-03-17 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2000058933A1 true WO2000058933A1 (en) | 2000-10-05 |
WO2000058933B1 WO2000058933B1 (en) | 2001-01-11 |
Family
ID=22416511
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2000/007118 WO2000058933A1 (en) | 1999-03-17 | 2000-03-17 | Method and apparatus for visual pointing and computer control |
Country Status (2)
Country | Link |
---|---|
AU (1) | AU3894900A (en) |
WO (1) | WO2000058933A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3885096A (en) * | 1972-07-15 | 1975-05-20 | Fuji Photo Film Co Ltd | Optical display device |
US5181015A (en) * | 1989-11-07 | 1993-01-19 | Proxima Corporation | Method and apparatus for calibrating an optical computer input system |
US5528263A (en) * | 1994-06-15 | 1996-06-18 | Daniel M. Platzker | Interactive projected video image display system |
US5712658A (en) * | 1993-12-28 | 1998-01-27 | Hitachi, Ltd. | Information presentation apparatus and information display apparatus |
US5793361A (en) * | 1994-06-09 | 1998-08-11 | Corporation For National Research Initiatives | Unconstrained pointing interface for natural human interaction with a display-based computer system |
2000
- 2000-03-17 AU AU38949/00A patent/AU3894900A/en not_active Abandoned
- 2000-03-17 WO PCT/US2000/007118 patent/WO2000058933A1/en active Application Filing
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006054207A1 (en) * | 2004-11-16 | 2006-05-26 | Koninklijke Philips Electronics N.V. | Touchless manipulation of images for regional enhancement |
US8473869B2 (en) | 2004-11-16 | 2013-06-25 | Koninklijke Philips Electronics N.V. | Touchless manipulation of images for regional enhancement |
WO2008156457A1 (en) * | 2007-06-20 | 2008-12-24 | Thomson Licensing | Interactive display with camera feedback |
US8276077B2 (en) | 2009-07-10 | 2012-09-25 | The Mcgraw-Hill Companies, Inc. | Method and apparatus for automatic annotation of recorded presentations |
CN104899361A (en) * | 2015-05-19 | 2015-09-09 | 华为技术有限公司 | Remote control method and apparatus |
EP3096489A1 (en) * | 2015-05-19 | 2016-11-23 | Huawei Technologies Co., Ltd. | Remote control method and apparatus |
US9785266B2 (en) | 2015-05-19 | 2017-10-10 | Huawei Technologies Co., Ltd. | Remote control method and apparatus |
CN104899361B (en) * | 2015-05-19 | 2018-01-16 | 华为技术有限公司 | A kind of remote control method and device |
Also Published As
Publication number | Publication date |
---|---|
WO2000058933B1 (en) | 2001-01-11 |
AU3894900A (en) | 2000-10-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Kirstein et al. | Interaction with a projection screen using a camera-tracked laser pointer | |
JP4323180B2 (en) | Interface method, apparatus, and program using self-image display | |
US8818027B2 (en) | Computing device interface | |
EP0771460B1 (en) | Interactive projected video image display system | |
EP2049979B1 (en) | Multi-user pointing apparatus and method | |
US10015402B2 (en) | Electronic apparatus | |
US6414672B2 (en) | Information input apparatus | |
JP3419050B2 (en) | Input device | |
EP0686935A1 (en) | Pointing interface | |
US20010030668A1 (en) | Method and system for interacting with a display | |
WO2019033957A1 (en) | Interaction position determination method and system, storage medium and smart terminal | |
US20140247216A1 (en) | Trigger and control method and system of human-computer interaction operation command and laser emission device | |
US20140145941A1 (en) | Computer vision gesture based control of a device | |
KR20040063153A (en) | Method and apparatus for a gesture-based user interface | |
US20090115971A1 (en) | Dual-mode projection apparatus and method for locating a light spot in a projected image | |
Cavens et al. | Interacting with the big screen: pointers to ponder | |
US20140053115A1 (en) | Computer vision gesture based control of a device | |
WO2000058933A1 (en) | Method and apparatus for visual pointing and computer control | |
JP6834197B2 (en) | Information processing equipment, display system, program | |
US20170357336A1 (en) | Remote computer mouse by camera and laser pointer | |
CN110620955A (en) | Live broadcasting system and live broadcasting method thereof | |
Kim et al. | Multi-touch interaction for table-top display | |
CN112462939A (en) | Interactive projection method and system | |
Kim et al. | New interface using palm and fingertip without marker for ubiquitous environment | |
WO2001046941A1 (en) | Method and apparatus for vision-based coupling between pointer actions and projected images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
AK | Designated states |
Kind code of ref document: B1 Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: B1 Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
B | Later publication of amended claims | ||
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 09936866 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |