WO2000058933A1 - Method and apparatus for visual pointing and computer control - Google Patents

Method and apparatus for visual pointing and computer control

Info

Publication number
WO2000058933A1
WO2000058933A1 (PCT/US2000/007118)
Authority
WO
WIPO (PCT)
Prior art keywords
image
pointing
computer
light
information
Prior art date
Application number
PCT/US2000/007118
Other languages
French (fr)
Other versions
WO2000058933B1 (en)
Inventor
Daniel M. Platzker
Segal Tsaki
Biran Liel
Gabriel Berelejis
Original Assignee
Tegrity, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tegrity, Inc.
Priority to AU38949/00A
Publication of WO2000058933A1
Publication of WO2000058933B1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected



Abstract

An apparatus and method for controlling the display, transfer and processing of information in computer systems (13), wherein a user aims a pointer (14) at a surface (11), such as a wall or whiteboard, which contains projected or written information, in order to activate predetermined operations related to that information. A stream of digital images from the image sensor (12) is analyzed by the computer (13), and the appearance of the visual input, along with its location, is interpreted in order to perform the operation that was intended by the user.

Description

Method and Apparatus for Visual Pointing and Computer Control
FIELD OF THE INVENTION
The present invention relates generally to the fields of processing images, pointing devices, computer-control devices, and remote presentation applications.
INCORPORATION BY REFERENCE
This application incorporates by reference the "Interactive Projected Video Image Display System" disclosed under United States Patent 5,528,263 (Platzker et al.) as if set forth at length herein. It also incorporates by reference the "Method and Apparatus for Processing, Displaying and Communicating Images" disclosed in a non-provisional application filed October 2, 1998 and assigned serial # 09/166,211 pursuant to a provisional application under the title "Remote Virtual Whiteboard," filed October 3, 1997 and assigned Serial No. 60/060942.
DEFINITIONS
Computer Display Image The display presented by a computer for human viewing on a device such as a monitor or Surface employing technologies such as VGA, SVGA, XGA and others.
Image Sensor An optical device such as a video camera, digital camera or other imaging technology capable of viewing a Surface.
Surface A flat surface upon which the Computer Display Image created by a computer-controlled projection may appear and/or upon which visible information may be written or placed.
Viewed Image The image acquired (or "seen") by the Image Sensor and made available as digital information to computational resources (software/hardware).
Warping A transformation performed on an image based on a mapping between two geometrical coordinate spaces; in the present invention Viewed Images (or portions thereof) are transformed in this manner to a predefined display coordinate space and projected/displayed images (or portions thereof) are transformed to the coordinate space of the Viewed Images (using both Warping and optional "scaling" to overcome differences in pixel-resolution); the geometric mapping is obtained through a process of calibration.
SUMMARY OF THE INVENTION
The present invention is an apparatus and method for controlling the display, transfer and processing of information in computer systems by means of pointing. A user of the invention points at a surface, such as a wall or whiteboard, which contains projected or written information, in order to activate predetermined operations related to that information. These operations may include, for example, those available with a commonplace computer-mouse device as well as many others. The typical configuration of components for the invention is depicted in Figure 1. The invention employs an image-sensing device [12] such as a video camera to view a surface [11], which may be used simultaneously for projecting the computer display. The user introduces visual input into the system by pointing, typically with a pointing device [14] such as a laser-pointer, at the desired spot within the viewed area of the surface. A stream of digital images from the image-sensor [12] is analyzed within the computer [13]. The appearance of the visual input is detected and its precise location on the viewed surface is determined. The appearance (or motion or disappearance) of the visual input, along with its location, is interpreted in order to perform the operation that was intended by the user. Depending on the type of pointing device, the system may also determine the direction from which the user pointed. Directional information may be of use in some applications as explained below.
Examples of how this invention may be used in practice include:
1. As a computer-mouse substitute using a light-generating pointer: If the computer display is projected onto the surface, the user may point the pointing-device at a location of choice and activate mouse functions by turning the light from the light-generating pen on and off. Turning the light on simulates pressing down on the mouse key while turning the light off simulates releasing the mouse key. This provides a simple means of generating the basic mouse commands: click, drag, drop and double-click.
2. As a remote pointing device: In remote presentations or distance education the device may be employed in order to show participants at remote sites where a presenter is currently pointing on a surface. An example of this is shown in Figure 2. Site A is configured as shown in Figure 1, including means of network communication [16] (to a local network and/or Internet, for example). An instructor at site A uses a surface, which contains written and/or projected information. Sites B and C show typical "student" configurations containing an ordinary computer system equipped with communication (e.g. modem) and display (e.g. VGA) capabilities. The written and/or projected information at site A may be transferred to the remote participants and displayed at sites B and C using other technologies such as the Tegrity WebLearner system (offered by Tegrity Inc. of San Jose, CA as part of the Digital Flipchart product). When the instructor points somewhere on the surface, for example at the apple, the invention causes the display of a "cursor" image, such as an arrow, at the appropriate spot in the display at sites B and C. When the instructor moves the pointing device, the displayed cursor moves accordingly and when the instructor stops pointing the cursor disappears. With certain types of pointing devices the direction of pointing is determined. This enables displaying different cursors. For example, the arrow shown at sites B and C may be displayed at an angle that approximately corresponds to the angle at which the instructor is holding the pointing device.
3. As a navigation and control user-interface: Many computer-game systems and VRML "worlds" are controlled by sequences of commands typically entered from a keyboard, mouse or joystick. These commands control various functions of the game as well as navigation throughout some "virtual reality" displayed to the user. The current invention may be used in this context by mapping some of these functions to the pointing interface described herein. For example, when the user points at a particular item in the displayed image the appropriate navigation command can be generated as if produced by the keyboard. Various sequences combining activation/deactivation of pointing as well as motion and direction make it possible to simulate a variety of navigation and control commands.
4. Multi-modal interaction with information: In conjunction with other technologies such as speech-recognition, the invention may be used in order to manipulate information objects. For example, the user may point her finger at a displayed icon that represents a data-file while saying "open this," and the system would react by displaying the contents of the file in a word-processing or editing program. In this example, the current invention provides the ability to identify the object to be acted upon and the impetus to do so, while another technology determines which specific action should be performed.
The prior art includes the "Cyclops Interactive Pointer" system sold by Proxima Corp., San Diego, CA, that is based on inventions disclosed under several United States Patents: 5,504,501 (Hauck et al.), 5,515,079 (Hauck), and 5,502,459 (Marshall et al.). This apparatus emulates computer-mouse operation using a hand-held light-generating device. The invention was designed for use with liquid crystal display (LCD) panels and some specific models of LCD projectors manufactured by Proxima Corp. It is a hardware solution that integrates various interconnected components, including a CCD image sensor, timing generator, analog comparator, digital-to-analog converter, a preprogrammed microprocessor, cabling and other components into a single unit that outputs mouse control commands to an attached computer.
As detailed in patent 5,515,079 (columns 7 through 10) the best mode for carrying out the Proxima invention uses a signal-processing unit that comprises an analog to digital (A/D) comparator network, a synchronizing network and a position detector. The A/D comparator network comprises a conventional video amplifier (emitter), analog comparator LM319A and digital to analog converter AD557. The synchronizing network is built from a Sanyo LC9904A timing generator, a Sanyo LB8904 high voltage drive unit and a conventional crystal resonator. The position detector comprises an Intel 8051 microprocessor, several logic devices including counters, inverters, and latches (74HC393, 74HC374, 74HC14) and connectors and cabling interfaces to a computer. The signal processing unit produces signals that control the image sensing operation of a Sanyo LC9943 charge coupled device (CCD) (see column 4, line 65 through column 5, line 9).
As the CCD scans the scene it passes each pixel's intensity value to the signal-processing unit for analysis. The latter determines on a pixel-by-pixel basis whether the pixel can be associated with a viewed light spot produced by the light-generating device (preferably a "Pocket Laser" sold by Navitar, see column 4, line 31). This determination is made solely based on the intensity of the pixel relative to a predetermined threshold value.
In contrast to the above description, the current invention uses off-the-shelf video camera and frame-grabber technology (required for some cameras) to reproduce full-frame digital video images inside the computer's RAM (random access memory). The innovation of the invention is in the software that performs the image processing analysis. Rather than operating on a sequential scan of pixels one at a time, the invention filters the entire digital image, or selected regions of it, at once: any desired subset of pixel values can be accessed directly from the memory buffer that stores all pertinent image information (including pixel intensities and color) of a "snapshot," or frame, of the visual scene. See Figure 4 for an example of such an image region (color information is not shown). Some information is retained over time, i.e. during multiple frames (in some cases using more abstracted representations) in additional memory buffers. The software may also utilize additional, related information available from within the computer system. For example, the pixel values of the currently projected Computer Display Image are easily obtained for analysis and comparison to the sensed values of the Viewed Image.
These structural differences from the prior art allow for much greater flexibility and power in the steps of analysis implemented. For example, a variety of pattern recognition algorithms are used to analyze and classify viewed shapes. Color values are used to help discriminate between the image of a targeted object or a human hand and the image background. Motion can be detected and isolated by analyzing information retained from a sequence of image frames. Thus, rather than basing the detection of observed events solely on intensity thresholds, the current invention uses much more information (spatial, temporal and color), filtered by a large number of processing methods, in order to determine what event, if any, is taking place.
An important point in this regard relates to the robustness and reliability of the invention. Whereas the prior art provides a subset of the functionality described herein (specifically mouse emulation), it does so with significant restrictions. Specifically, using less image information (i.e., intensity values only and/or one pixel at a time) implies, in general, reduced reliability. It is easy to produce scenarios in which the inventions described in the prior art will either not respond to the light stimulus as expected or will induce false triggering by generating events that were not intended by the user. Apparently, the prior art can be expected to be reliable (with a small rate of errors) only when used in certain environments and configurations. For example, the Proxima inventions work with LCD panels, preferably model A480SC from Computer Associates Corp. (column 4, line 57). The Cyclops product works with several LCD projector models manufactured by Proxima (with built-in CCD and signal processing hardware) such as the 2710 model. These projection devices produce a beam of light that has relatively weak intensity. LCD panels, a now outdated technology, work in conjunction with overhead projectors, which are also typically very weak. Because they are so weak, when employing such devices users typically dim the room lighting or turn it off completely so that details may be discerned in the projected images. Current projector technology produces beams of light that are many times brighter. This allows them to be used comfortably in well-lit rooms.
The significantly higher intensity of projected light and the addition of ambient room light and its reflections is incompatible with the restrictions of the prior art devices. Under these conditions it becomes impossible, in general, to determine threshold intensity values to discriminate spots of light because many areas of the projected image may exceed any preset threshold. Similarly, reflections from jewelry, belt-buckles or other objects may also produce transient spots of high intensity that could be falsely detected as mouse events. These problems can be partly solved in some cases by special adaptations made to the image-sensing device. For example, patent 5,504,501 discloses use of a "band pass filter disposed over the lens of the device" (column 4, line 30). This filter "limits the range of wavelengths of light permitted to be sensed by the device." In effect, the filter reduces the chances that other light sources will confuse the device since it is tuned to the (typically red) light of the light-generating device in use. This and similar solutions improve the robustness of the prior art devices, but they cannot completely overcome the problems described above. Furthermore, the disclosed inventions and the workaround solutions they employ reduce the possibility of using the components of the invention for other purposes. Specifically, the image sensor of these inventions cannot be utilized for other applications. In contrast, the image sensor of the current invention may be any conventional video conferencing camera and may be used as such while simultaneously serving this invention. A case in point is the way Tegrity's product uses the same image sensor to drive various functions of the product, including image capture, recognition of human touching of "virtual buttons," and other functions. Modern projector technology creates challenges that cannot be met by the prior art, but rather requires more extensive use of the available information to produce consistent results that can be reliably repeated in a wide variety of useful configurations.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT OF THE INVENTION
Components
A goal of the invention is to be flexible and inexpensive by utilizing commonly available, "off the shelf" hardware components as much as possible. These include:
1. A whiteboard, screen or other Surface [11].
2. An Image Sensor [12] such as a video-conferencing type video camera aimed at the Surface. For example, the Tegrity system is sold with an NCK41CV manufactured by Howard Enterprises Inc. of Camarillo, CA (a 450 TV-line, NTSC video camera). A frame-grabber card may be required to convert the video information to a digital image inside the computer. For example, the Tegrity system uses video capture card C210A from Tekram Technology, Taipei, Taiwan. Conventional cabling connects the image sensor to the computer.
3. A computer [13]. This is a conventional personal computer, such as an Intel Pentium series computer.
4. A pointing device [14]. The preferred embodiment of the invention employs a standard laser-pointer modified for the purposes of the invention as shown in Figure 3. For example the "Ultra Infinter" by Quarton Inc. is one of many available models that may be used. The modification entails adding a small, black foam ball (30 millimeter diameter) near the tip of the laser-pointer. The ball is useful in that it facilitates more accurate recognition of the pointer and provides a basis for determining the direction of pointing. Alternative embodiments of the invention may employ other types of pointing devices with or without light-emitting capabilities.
5. A projector [15] to project the Computer Display Image onto the Surface. The projector is not essential to the operation of the invention; however, the invention will typically be more useful when one is employed. Any conventional LCD, DLP or other type of projector may be used, for example the Proxima 2710 or 5800, the InFocus 425, 725 or 735, and many others.
6. Communications interface [16] connecting the computer to some communications infrastructure. This component is required only when the invention is employed for remote presentations or similar uses. This may comprise, for example, a local area network controller or a modem interface. Tegrity Inc. will market the laser-pointer device and computer-software that implement the preferred embodiment of the invention under the name InterPointer (see the product brochure).
Usage Models
In the preferred embodiment of the invention the spot of light generated by the laser-pen is detected as a "pointing event." Similarly, movement of the light to another location and its disappearance when it is turned off are each detected as unique events. In addition, rapid sequencing of the light on and off multiple times can be detected as yet another set of events. These capabilities are currently used to support two distinct usage models for the invention, although additional models and applications are easily adapted without significant modifications:
1. "Mouse Emulation" - the light spot simulates the left mouse button on a standard computer mouse device. Appearance of the spot simulates pressing of the left mouse button, moving it simulates moving the mouse and disappearance of the spot simulates releasing the mouse button. This enables emulating the common mouse commands "click" and "drag-and-drop" that act upon the location at which the spot appeared. When the light is turned on and off twice in succession the "double-click" command can be emulated.
Method and Apparatus for Visual Pointing and Computer Control
2. "Pointing" (e.g. for remote presentations) - when the light spot appears it indicates that a cursor should be displayed at the corresponding location (at remote sites). Similarly, when it moves, the cursor should be moved and when it disappears so should the cursor display. In this model directional information is used so that the cursor may indicate the direction in which the user is pointing.
Overview of System Operation
The system obtains images from the Image Sensor [12] at a rate of approximately 30 images per second. The large amount of information supplied by the Image Sensor (typically over 18 million bytes per second) and the need to consume a minimal amount of processing resources dictate a strategy of analysis that is not monolithic. Rather, the analysis is broken into several phases such that each phase performs more intensive processing on less information than the prior phase. In general, these phases include:
1. Preliminary Screening
2. Detection
3. Verification
Each processing phase comprises one or more functional modules as described below. An additional module, the "Hot-Spot Tracker," is executed at a reduced frequency (twice per second).
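The phase structure can be sketched as a driver loop in which each phase narrows what the next one sees: screening scans the whole frame coarsely, detection examines only the small regions of interest it yields, and verification runs on a single candidate. (The 18 MB/s figure is consistent with 640x480 pixels at 30 frames per second and roughly two bytes per pixel, an assumption about the pixel format.) All function names below are hypothetical stand-ins for the modules described in the following sections.

```python
import time

def process_cycle(frame, ctx):
    # Phase 1: coarse screening over the whole frame (cheap, sampled).
    regions = preliminary_screening(frame, ctx.hotspot_mask)
    for region in regions:
        # Phase 2: blob detection, but only inside ~60x60 regions of interest.
        blob = detect_light_spot(frame, region)
        if blob is None:
            continue
        # Phase 3: the most expensive checks run on a single candidate.
        if verify_fiducial(frame, blob) or verify_skin(frame, blob):
            return blob                      # a confirmed pointing event
    return None

def maybe_track_hotspots(frame, ctx, period=0.5):
    # The Hot-Spot Tracker runs at a reduced frequency (about twice per second).
    now = time.time()
    if now - ctx.last_hotspot_update >= period:
        ctx.hotspot_mask = update_hotspot_mask(frame, ctx)
        ctx.last_hotspot_update = now
```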
When the system determines that a pointing event has occurred at a specific position in the Viewed Image it sends a "system event" or activates a predetermined system function using techniques that are specific to the computer operating system in use (for example, the "mouse_event()" function in Microsoft Windows). When doing so, the display coordinates of the position at which the event should be activated are also provided. These coordinates are obtained by converting the Viewed Image coordinates to Computer Display Image coordinates by means of the Warping transformation. This transformation is initially computed during a calibration procedure, which is explained in detail in the documents incorporated herein by reference.
In order to use computing resources in an optimal manner the system operates as a "finite state machine." This means that at any given time the system will be in one of several "states." Specifically, the states currently used are "off," "suspected," and "on." Each state defines what analysis will be performed on the next image as well as the possible states that the system may enter as a result of this analysis. Specifically, the Preliminary Screening phase of processing is considered necessary only when the system is "off," i.e. when no pointing activity has been taking place. Once pointing is "suspected" (after Preliminary Screening determines that there may be light spots in the Viewed Image) this phase is not necessary because the image regions where light spots may occur will not differ from one processing cycle to the next. When the system has determined with certainty that a pointing event is taking place (after several successive cycles so as to verify this decision over time) it switches to "on" state. Similarly, when the system determines there is no activity it will switch back from "on" to "suspected" and from there to "off." We now describe the processing phases listed above in further detail.
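A compact rendering of the state logic just described, with hypothetical analysis helpers; the number of confirmations required before entering "on" is an assumed parameter, since the document says only "several successive cycles."

```python
OFF, SUSPECTED, ON = "off", "suspected", "on"

def step(state, frame, ctx):
    """One image cycle of the finite state machine."""
    if state == OFF:
        # Full-frame screening runs only while no activity is suspected.
        ctx.regions = preliminary_screening(frame, ctx.hotspot_mask)
        ctx.hits = 0
        return SUSPECTED if ctx.regions else OFF
    spot = detect_and_verify(frame, ctx.regions)   # focused analysis only
    if state == SUSPECTED:
        if spot is None:
            return OFF
        ctx.hits += 1
        # Confirm over several successive cycles before declaring "on".
        return ON if ctx.hits >= ctx.required_hits else SUSPECTED
    # state == ON: drop back to SUSPECTED when the spot is lost.
    return ON if spot is not None else SUSPECTED
```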
Preliminary Screening
The system must detect the activation of the pointing device in "real-time," i.e. fast enough to follow typical human activity. In the preferred embodiment of the invention this involves detecting the spot of light produced by a laser-pointer device and determining if it has been turned on and off once or even twice. The latter case would typically be useful for supporting the simulation of mouse "double-click" events. This requires the system to process a large amount of information (640x480 pixels for a typical camera) every 33 milliseconds. Full analysis of so much information is not practical using state-of-the-art personal computers given that this invention must coexist with other applications being driven by the computer without noticeably impacting their performance. The solution to this problem is to reduce the amount of information and complexity of processing in this preliminary phase. The system performs a fast preliminary screening to detect small areas ("regions of interest") that are suspected of containing a laser's spot of light. In later processing of this image-cycle, the system will "focus" only on these small regions of interest (typically 60x60 image-pixels). Figure 4 shows a small image region that demonstrates how Preliminary Screening works. Figure 4(a) shows a pixel image (without color) of a region in which the laser device [41] is activated. The foam ball used by this embodiment of the invention can be seen [42] as can the hand of the user [43]. The spot of light that the laser device produces is readily visible [44].
Preliminary screening works in the following manner. The Viewed Image is scanned for pixels that have an intensity value above a near-white threshold. Scanning all pixels is time-consuming, therefore the process skips over most pixels, sampling only 1 out of every 16 pixels. This may be done without missing light spots because the minimal size of the spot is 4x4 pixels for all practical applications (using a 640x480 pixel Viewed Image). Figures 4(b) and 4(c) demonstrate how this process operates on the image region of Figure 4(a). In Figure 4(b) each black dot represents a pixel coordinate position at which the screening process will sample the value of the corresponding pixel from the image of Figure 4(a). This selective sampling produces the image of Figure 4(c). This image contains 1/16th of the information contained in the image of Figure 4(a), however the spot of laser light is still readily apparent at [46].
The sampled pixels that are bright enough (high intensity) and are in close proximity to each other are subsequently treated as a single region of interest with a suspected light-spot. In the case of Figure 4, the bright spot of [46] will be identified as such a suspected light spot. Note that Figure 4(a) shows only a small image area, which is, in fact, a "suspected region." In practice, Preliminary Screening is used on the entire Viewed Image precisely in order to locate and identify such regions in the much larger image.
The process of preliminary screening ignores certain image-pixels and does not add them to suspected regions even if they exceed the intensity threshold. The set of ignored pixels is determined by the Hot-Spot Tracking module as described below. If no suspected regions are found, processing is complete for this cycle and the system may change its state accordingly.
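A NumPy sketch of this screening step, assuming an 8-bit grayscale frame and a boolean hot-spot mask; the near-white threshold and the grouping radius are illustrative values, not figures from the document.

```python
import numpy as np

def preliminary_screening(gray, hotspot_mask, thresh=240, step=4, radius=30):
    """Return centers of suspected regions, sampling 1 of every 16 pixels."""
    # Sampling every 4th row and column cannot miss a spot that is at
    # least 4x4 pixels in a 640x480 Viewed Image.
    sampled = gray[::step, ::step]
    ignored = hotspot_mask[::step, ::step]      # pixels learned as hot-spots
    ys, xs = np.nonzero((sampled >= thresh) & ~ignored)
    centers = []
    for y, x in zip(ys * step, xs * step):      # back to full-image coords
        for c in centers:
            if abs(c[0] - y) < radius and abs(c[1] - x) < radius:
                break                           # merge with a nearby region
        else:
            centers.append((y, x))              # new ~60x60 region of interest
    return centers
```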
Detection
This module recognizes the pattern of a spot of laser-light hitting the Surface. It can detect lasers of any kind in the 635nm to 650nm wavelength range. Surprisingly, a laser-dot as captured by a typical video camera is not red; it appears as a white spot due to the high intensity of the laser. This module operates on predefined regions of interest as follows: It thresholds the image with a high threshold, keeping only pixels of high intensity. The binarized image is then segmented into connected blobs. The algorithm then looks for the blob that best matches the pattern of an expected spot of light in its dimensions and measure of roundness. These steps use techniques that are well known in the art. The end result is that either no blobs are determined to match the expected characteristics of a light spot or a single blob is selected. In the former case processing is complete with negative results (no activation) and the system may change state accordingly.
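A sketch of this threshold-segment-match sequence using SciPy's connected-component labeling; the area bounds and roundness cutoff are assumed values chosen to match the 4x4-pixel minimum spot size mentioned earlier.

```python
import numpy as np
from scipy import ndimage

def detect_light_spot(gray_roi, thresh=240, min_area=16, max_area=400):
    binary = gray_roi >= thresh                 # keep only very bright pixels
    labels, n = ndimage.label(binary)           # segment into connected blobs
    best = None
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        area = ys.size
        if not (min_area <= area <= max_area):  # plausible spot dimensions
            continue
        # Crude roundness: filled area relative to the blob's bounding circle.
        diameter = max(ys.ptp(), xs.ptp()) + 1
        roundness = area / (np.pi * (diameter / 2.0) ** 2)
        if roundness > 0.6 and (best is None or area > best[0]):
            best = (area, ys.mean(), xs.mean())
    return None if best is None else (best[1], best[2])   # spot center (y, x)
```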
Verification
Verification is necessary to prevent false triggering caused by reflective objects like watches and jewelry or by saturation of small regions of the Viewed Image due to the Image Sensor's dynamic-range. The latter is commonly seen when presenting the Image Sensor with a display containing high contrast (typically with Automatic Gain Compensation settings used by most video cameras). A major drawback of similar devices introduced in the past was the lack of such verification procedures and their reliance solely on an intensity threshold to conclude whether or not pointing was activated.
Two distinct kinds of verification may be used in the preferred embodiment of the invention:
1. Fiducial-based Verification: This method is used to recognize an a priori known figure or pattern known as a "fiducial". In our case the fiducial is the black foam ball attached to the laser-pointer device as shown in Figure 3. The ball is detected as follows. A small region of interest in the Viewed Image (about 60x60 pixels) surrounding the detected spot is analyzed. In this region the algorithm looks for the ball using a simple "correlation" with an idealized ball. If the correlation indeed finds a ball, the image is binarized by its intensity, using a threshold that is determined during the correlation check (utilizing the knowledge that the ball is black). The "blob" that defines the ball's pixels is checked to be round and of fitting dimensions and proportions. Once the system locates the fiducial the direction in which the user is pointing is easily determined.
2. Skin-based Verification: This method is used to verify that human skin is apparent in the Viewed Image in close proximity to the spot at which light was detected. This is based on the assumption that when activating the pointer the user's hand is near the light spot and is visible in the Viewed Image. The verification procedure is based on a "skin detector" function, which detects pixels that have a high likelihood of corresponding to human skin. The algorithm uses a statistical model of human-skin color and is based on current research techniques in this field (see "Visual Tracking for Multimodal Human Computer Interaction" - Jie Yang, Rainer Stiefelhagen, Uwe Meier, Alex Waibel at CHI98).
Hot-Spot Tracking
One problem that the system must deal with is the presence of "hot-spots" in the Viewed Image. Hot-spots are regions of the image that appear saturated, i.e. where pixels are assigned the highest possible intensity values (white) due to the relative contrast of that area with other areas of the viewed Surface. For example, the reflection of a projector's beam, which may be blinding to a viewer, will typically produce this effect in a digital image. Similarly, in some cases projecting a highly contrasting image (such as one with a dark background and a small amount of bright pixels) may produce hot-spots. In these areas differences of intensity cannot be distinguished, therefore the light-spot of a laser will not be detected within them. The main problem is that hot-spots may sometimes appear similar in characteristics (size, shape, intensity) to typical light spots, thus generating false triggering of this invention.
The Hot-Spot Tracking module solves this problem by continuously tracking and "learning" which pixels belong to hot-spots. These pixels are remembered by saving a "mask" which represents locations to be ignored by the Preliminary Screening phase described above. The idea is to compute and maintain a running average image of the last few seconds and threshold it to produce a binary mask. Pixels that are steadily saturated will quickly be marked in this mask and will be ignored in
subsequent processing cycles. If and when such pixels cease to be saturated (e.g., the projected image changes or room lighting changes), the system will quickly adapt to the change and the pixels will no longer be marked in the mask, hence they will again participate in the analysis.
This module is activated at a low frequency - approximately twice per second. This rate (along with the averaging coefficient used for computing the running average) is quick enough to prevent false triggering by hot-spots yet slow enough to have a negligible impact on performance.
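A sketch of the tracker as an exponential running average; the averaging coefficient and saturation level are assumed values chosen so that, at two updates per second, the last few seconds of frames dominate the mask.

```python
import numpy as np

class HotSpotTracker:
    """Maintains a mask of steadily saturated pixels; updated ~twice/second."""

    def __init__(self, shape, alpha=0.3, sat_level=250):
        self.avg = np.zeros(shape, dtype=float)  # running average image
        self.alpha = alpha                       # averaging coefficient
        self.sat_level = sat_level               # near-white saturation level

    def update(self, gray):
        # Exponential moving average over roughly the last few seconds.
        self.avg = (1.0 - self.alpha) * self.avg + self.alpha * gray

    def mask(self):
        # Pixels whose average stays near saturation are hot-spots; they are
        # ignored by Preliminary Screening and rejoin the analysis once the
        # projected image or room lighting changes.
        return self.avg >= self.sat_level
```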
The Tegrity InterPointer
The Tegrity InterPointer can be used in place of a mouse, to draw elements or make menu selections directly on the whiteboard, and as a pointing device that can be broadcast during WebLearner distance learning.
To use the InterPointer, the InterPointer Add-on software must be installed, and a serial number entered. This is done through the installation program on the Tegrity Digital Flipchart CD. See your Tegrity Digital Flipchart User's Guide, chapter 4, for installation instructions.
Please remember to insert the two included batteries into the pointer according to the instructions.
Once the InterPointer software is installed, it must be enabled and activated.
Enabling the InterPointer for Mouse Emulation
1. Select Options from the Tegrity pull-down menu and bring the General tab forward. Click on the Enable InterPointer option.
2. Access the Tegrity Touch Panel.
3. Right-click on the Touch Panel and select InterPointer, or use the InterPointer to click on the enable/disable button at the top of the Touch Panel.
A checkmark displays beside the option in the panel.
You can use the InterPointer to control any buttons or menus from the whiteboard.
Enabling the InterPointer for the WebLearner
• Select Options from the Tegrity pull-down menu and bring the General tab forward. Click on the Enable InterPointer option.
From now on the InterPointer will be activated automatically each time you work with WebLearner.
Wherever you point with the InterPointer on the whiteboard, the student sees a small hand in the same location on the web class page. This hand is streamed with the live class and is saved with the recorded class file. This way you can show your students specific points on the whiteboard while you talk about them during your class.
Using the InterPointer
Warning: Avoid direct eye exposure! Never look into the ray emitted by the InterPointer.
1. Keep the InterPointer as close to the whiteboard as possible. The tip of the InterPointer should almost be touching the board. In order for the camera to see the InterPointer, the "ball" must be inside the projected area.
2. Press down on the Clip while pointing with the InterPointer, as shown in the figure beside.

Claims

WHAT IS CLAIMED IS:
1. A method of controlling the display, transfer and processing of information in computer systems, said method comprising the step of a user pointing at a wall surface which contains information to activate pre-determined operations relating to that information.
PCT/US2000/007118 1999-03-17 2000-03-17 Method and apparatus for visual pointing and computer control WO2000058933A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU38949/00A AU3894900A (en) 1999-03-17 2000-03-17 Method and apparatus for visual pointing and computer control

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12472899P 1999-03-17 1999-03-17
US60/124,728 1999-03-17

Publications (2)

Publication Number Publication Date
WO2000058933A1 true WO2000058933A1 (en) 2000-10-05
WO2000058933B1 WO2000058933B1 (en) 2001-01-11

Family

ID=22416511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/007118 WO2000058933A1 (en) 1999-03-17 2000-03-17 Method and apparatus for visual pointing and computer control

Country Status (2)

Country Link
AU (1) AU3894900A (en)
WO (1) WO2000058933A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3885096A (en) * 1972-07-15 1975-05-20 Fuji Photo Film Co Ltd Optical display device
US5181015A (en) * 1989-11-07 1993-01-19 Proxima Corporation Method and apparatus for calibrating an optical computer input system
US5712658A (en) * 1993-12-28 1998-01-27 Hitachi, Ltd. Information presentation apparatus and information display apparatus
US5793361A (en) * 1994-06-09 1998-08-11 Corporation For National Research Initiatives Unconstrained pointing interface for natural human interaction with a display-based computer system
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006054207A1 (en) * 2004-11-16 2006-05-26 Koninklijke Philips Electronics N.V. Touchless manipulation of images for regional enhancement
US8473869B2 (en) 2004-11-16 2013-06-25 Koninklijke Philips Electronics N.V. Touchless manipulation of images for regional enhancement
WO2008156457A1 (en) * 2007-06-20 2008-12-24 Thomson Licensing Interactive display with camera feedback
US8276077B2 (en) 2009-07-10 2012-09-25 The Mcgraw-Hill Companies, Inc. Method and apparatus for automatic annotation of recorded presentations
CN104899361A (en) * 2015-05-19 2015-09-09 华为技术有限公司 Remote control method and apparatus
EP3096489A1 (en) * 2015-05-19 2016-11-23 Huawei Technologies Co., Ltd. Remote control method and apparatus
US9785266B2 (en) 2015-05-19 2017-10-10 Huawei Technologies Co., Ltd. Remote control method and apparatus
CN104899361B (en) * 2015-05-19 2018-01-16 华为技术有限公司 A kind of remote control method and device

Also Published As

Publication number Publication date
WO2000058933B1 (en) 2001-01-11
AU3894900A (en) 2000-10-16

Similar Documents

Publication Publication Date Title
Kirstein et al. Interaction with a projection screen using a camera-tracked laser pointer
JP4323180B2 (en) Interface method, apparatus, and program using self-image display
US8818027B2 (en) Computing device interface
EP0771460B1 (en) Interactive projected video image display system
EP2049979B1 (en) Multi-user pointing apparatus and method
US10015402B2 (en) Electronic apparatus
US6414672B2 (en) Information input apparatus
JP3419050B2 (en) Input device
EP0686935A1 (en) Pointing interface
US20010030668A1 (en) Method and system for interacting with a display
WO2019033957A1 (en) Interaction position determination method and system, storage medium and smart terminal
US20140247216A1 (en) Trigger and control method and system of human-computer interaction operation command and laser emission device
US20140145941A1 (en) Computer vision gesture based control of a device
KR20040063153A (en) Method and apparatus for a gesture-based user interface
US20090115971A1 (en) Dual-mode projection apparatus and method for locating a light spot in a projected image
Cavens et al. Interacting with the big screen: pointers to ponder
US20140053115A1 (en) Computer vision gesture based control of a device
WO2000058933A1 (en) Method and apparatus for visual pointing and computer control
JP6834197B2 (en) Information processing equipment, display system, program
US20170357336A1 (en) Remote computer mouse by camera and laser pointer
CN110620955A (en) Live broadcasting system and live broadcasting method thereof
Kim et al. Multi-touch interaction for table-top display
CN112462939A (en) Interactive projection method and system
Kim et al. New interface using palm and fingertip without marker for ubiquitous environment
WO2001046941A1 (en) Method and apparatus for vision-based coupling between pointer actions and projected images

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: B1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: B1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

B Later publication of amended claims
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWE Wipo information: entry into national phase

Ref document number: 09936866

Country of ref document: US

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP