US20040046744A1 - Method and apparatus for entering data using a virtual input device - Google Patents

Method and apparatus for entering data using a virtual input device

Info

Publication number
US20040046744A1
Authority
US
Grant status
Application
Prior art keywords
user
input device
system
virtual input
controlled object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10651919
Inventor
Abbas Rafii
Cyrus Bamji
Nazim Kareemi
Shiraz Shivji
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canesta Inc
Microsoft Technology Licensing LLC
Original Assignee
Canesta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 1/1626 — Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1632 — External expansion units, e.g. docking stations
    • G06F 1/1673 — Arrangements for projecting a virtual keyboard
    • G06F 3/0221 — Arrangements for reducing keyboard size for transport or storage, e.g. foldable keyboards, keyboards with collapsible keys
    • G06F 3/0304 — Detection arrangements using opto-electronic means
    • G06F 3/0426 — Digitisers using a single imaging device (e.g. a video camera) for tracking the absolute position of one or more objects with respect to an imaged reference surface, tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • G06F 3/04886 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • G06K 9/224 — Image acquisition using hand-held instruments generating sequences of position coordinates corresponding to handwriting, in three dimensions

Abstract

A user, with hand or stylus, inputs information to a companion system such as a PDA, a cell telephone, an appliance, or other device using a virtual input device such as an image of a keyboard. A sensor captures data representing a single image at a given time, and from that data three-dimensional positional information is determined as to whether, when, and where on the virtual input device user interaction or contact occurred. The processed digital information is output to the companion system. In a virtual keyboard application, the companion system can display an image of a keyboard, including an image of a keyboard showing the user's fingers, and/or alphanumeric text as such data is input by the user on the virtual input device.

Description

    RELATION TO PREVIOUSLY FILED APPLICATION
  • This is a continuation of co-pending U.S. utility patent application Ser. No. 09/502,499, filed on Feb. 11, 2000, which will issue as U.S. Pat. No. 6,614,422 on Sep. 2, 2003. The '499 application claimed priority from U.S. provisional patent application, serial No. 60/163,445, filed on Nov. 4, 1999 entitled “Method and Device for 3D Sensing of Input Commands to Electronic Devices”, in which applicants herein were applicants therein. The '499 application also referenced applicant Bamji's then co-pending U.S. patent application Ser. No. 09/401,059 filed on Sep. 22, 1999, entitled “CMOS-COMPATIBLE THREE-DIMENSIONAL IMAGE SENSOR IC”, which '059 application issued as U.S. Pat. No. 6,323,942 on Nov. 27, 2001. Each of these applications and U.S. patents was assigned to common assignee herein Canesta, Inc.[0001]
  • FIELD OF THE INVENTION
  • The invention relates generally to inputting commands and/or data (collectively, referred to herein as “data”) to electronic systems including computer systems. More specifically, the invention relates to methods and apparatuses for inputting data when the form factor of the computing device precludes using normally sized input devices such as a keyboard, or when the distance between the computing device and the input device makes it inconvenient to use a conventional input device coupled by cable to the computing device. [0002]
  • BACKGROUND OF THE INVENTION
  • Computer systems that receive and process input data are well known in the art. Typically such systems include a central processing unit (CPU), persistent read only memory (ROM), random access memory (RAM), at least one bus interconnecting the CPU and the memory, at least one input port to which a device is coupled to input data and commands, and typically an output port to which a monitor is coupled to display results. Traditional techniques for inputting data have included use of a keyboard, mouse, joystick, remote control device, electronic pen, touch panel or pad or display screen, switches and knobs, and more recently handwriting recognition and voice recognition. [0003]
  • Computer systems and computer-type systems have recently found their way into a new generation of electronic devices including interactive TV, set-top boxes, electronic cash registers, synthetic music generators, handheld portable devices including so-called personal digital assistants (PDA), and wireless telephones. Conventional input methods and devices are not always appropriate or convenient when used with such systems. [0004]
  • For example, some portable computer systems have shrunk to the point where the entire system can fit in a user's hand or pocket. To combat the difficulty in viewing a tiny display, it is possible to use a commercially available virtual display accessory that clips onto an eyeglass frame worn by the user of the system. The user looks into the accessory, which may be a 1″ VGA display, and sees what appears to be a large display measuring perhaps 15″ diagonally. [0005]
  • Studies have shown that use of a keyboard and/or mouse-like input device is perhaps the most efficient technique for entering or editing data in a companion computer or computer-like system. Unfortunately it has been more difficult to combat the problems associated with a smaller size input device, as smaller sized input devices can substantially slow the rate with which data can be entered. For example, some PDA systems have a keyboard that measures about 3″×7″. Although data and commands may be entered into the PDA via the keyboard, the entry speed is reduced and the discomfort level is increased, relative to having used a full sized keyboard measuring perhaps 6″×12″. Other PDA systems simply eliminate the keyboard and provide a touch screen upon which the user writes alphanumeric characters with a stylus. Handwriting recognition software within the PDA then attempts to interpret and recognize alphanumeric characters drawn by the user with a stylus on a touch sensitive screen. Some PDAs can display an image of a keyboard on a touch sensitive screen and permit users to enter data by touching the images of various keys with a stylus. In other systems, the distance between the user and the computer system may preclude a convenient use of wire-coupled input devices, for example the distance between a user and a set-top box in a living room environment precludes use of a wire-coupled mouse to navigate. [0006]
  • Another method of data and command input to electronic devices is recognizing visual images of user actions and gestures that are then interpreted and converted to commands for an accompanying computer system. One such approach was described in U.S. Pat. No. 5,767,842 to Korth (1998) entitled “Method and Device for Optical Input of Commands or Data”. Korth proposed having a computer system user type on an imaginary or virtual keyboard, for example a keyboard-sized piece of paper bearing a template or a printed outline of keyboard keys. The template is used to guide the user's fingers in typing on the virtual keyboard keys. A conventional TV (two-dimensional) video camera focused upon the virtual keyboard was stated to somehow permit recognition of what virtual key (e.g., printed outline of a key) was being touched by the user's fingers at what time as the user “typed” upon the virtual keyboard. [0007]
  • But Korth's method is subject to inherent ambiguities arising from his reliance upon relative luminosity data, and indeed upon an adequate source of ambient lighting. While the video signal output by a conventional two-dimensional video camera is in a format that is appropriate for image recognition by a human eye, the signal output is not appropriate for computer recognition of viewed images. For example, in a Korth-type application, to track position of a user's fingers, computer-executable software must determine the contour of each finger using changes in luminosity of pixels in the video camera output signal. Such tracking and contour determination is a difficult task to accomplish when the background color or lighting cannot be accurately controlled, and indeed may resemble the user's fingers. Further, each frame of video acquired by Korth, typically at least 100 pixels×100 pixels, only has a grey scale or color scale code (typically referred to as RGB). Limited as he is to such RGB value data, a microprocessor or signal processor in a Korth system at best might detect the contour of the fingers against the background image, if ambient lighting conditions are optimal. [0008]
  • The attendant problems are substantial, as are the potential ambiguities in tracking the user's fingers. Ambiguities are inescapable with Korth's technique because traditional video cameras output two-dimensional image data, and do not provide unambiguous information about actual shape and distance of objects in a video scene. Indeed, from the vantage point of Korth's video camera, it would be very difficult to detect typing motions along the axis of the camera lens. Therefore, multiple cameras having different vantage points would be needed to adequately capture the complex keying motions. Also, as suggested by Korth's FIG. 1, it can be difficult merely to acquire an unobstructed view of each finger on a user's hands, e.g., acquiring an image of the right forefinger is precluded by the image-blocking presence of the right middle finger, and so forth. In short, even with good ambient lighting and a good vantage point for his camera, Korth's method still has many shortcomings, including ambiguity as to what row on a virtual keyboard a user's fingers are touching. [0009]
  • In an attempt to gain depth information, the Korth approach may be replicated using multiple two-dimensional video cameras, each aimed toward the subject of interest from a different viewing angle. Simple as this proposal sounds, it is not practical. The setup of the various cameras is cumbersome and potentially expensive as duplicate cameras are deployed. Each camera must be calibrated accurately relative to the object viewed, and relative to each other. To achieve adequate accuracy, the stereo cameras would likely have to be placed at the top left and right positions relative to the keyboard. Yet even with this configuration, the cameras would be plagued by fingers obstructing fingers within the view of at least one of the cameras. Further, the computation required to create three-dimensional information from the two-dimensional video image information output by the various cameras contributes to the processing overhead of the computer system used to process the image data. Understandably, using multiple cameras would substantially complicate Korth's signal processing requirements. Finally, it can be rather difficult to achieve the necessary camera-to-object distance resolution required to detect and recognize fine object movements such as a user's fingers while engaged in typing motion. [0010]
  • In short, it may not be realistic to use a Korth approach to examine two-dimensional luminosity-based video images of a user's hands engaged in typing, and accurately determine from the images what finger touched what key (virtual or otherwise) at what time. This shortcoming remains even when the acquired two-dimensional video information processing is augmented with computerized image pattern recognition as suggested by Korth. It is also seen that realistically Korth's technique does not lend itself to portability. For example, the image acquisition system and indeed an ambient light source will essentially be on at all times, and will consume sufficient operating power to preclude meaningful battery operation. Even if Korth could reduce or power down his frame rate of data acquisition to save some power, the Korth system still requires a source of adequate ambient lighting. [0011]
  • Power considerations aside, Korth's two-dimensional imaging system does not lend itself to portability with small companion devices such as cell phones because Korth's video camera (or perhaps cameras) requires a vantage point above the keyboard. This requirement imposes constraints on the practical size of Korth's system, both while the system is operating and while being stored in transit. [0012]
  • What is needed is a method and system by which a user may input data to a companion computing system using a virtual keyboard or other virtual input device that is not electrically connected to the computing system. The data input interface emulation implemented by such method and system should provide meaningful three-dimensionally acquired information as to what user's finger touched what key (or other symbol) on the virtual input device, in what time sequence, preferably without having to use multiple image-acquiring devices. Preferably such system should include signal processing such that system output can be in a scan-code or other format directly useable as input by the companion computing system. Finally, such system should be portable, and easy to set up and operate. [0013]
  • The present invention provides such a method and system. [0014]
  • SUMMARY OF THE INVENTION
  • The present invention enables a user to input commands and data (collectively, referred to as data) from a passive virtual emulation of a manual input device to a companion computer system, which may be a PDA, a wireless telephone, or indeed any electronic system or appliance adapted to receive digital input signals. The invention includes a three-dimensional sensor imaging system that functions even without ambient light to capture in real-time three-dimensional data as to placement of a user's fingers on a substrate bearing or displaying a template that is used to emulate an input device such as a keyboard, keypad, or digitized surface. The substrate preferably is passive and may be a foldable or rollable piece of paper or plastic containing printed images of keyboard keys, or simply indicia lines demarking where rows and columns for keyboard keys would be. The substrate may be defined as lying on a horizontal X-Z plane, where the Z-axis defines template key rows, the X-axis defines template key columns, and the Y-axis denotes vertical height above the substrate. If desired, in lieu of a substrate keyboard, the invention can include a projector that uses light to project a grid or perhaps an image of a keyboard onto the work surface in front of the companion device. The projected pattern would serve as a guide for the user in “typing” on this surface. The projection device preferably would be included in or attachable to the companion device. [0015]
  • Alternatively, the substrate can be eliminated as a typing guide. Instead the screen of the companion computer device may be used to display alphanumeric characters as they are “typed” by the user on a table top or other work surface in front of the companion device. For users who are not accomplished touch typists, the invention can instead (or in addition) provide a display image showing keyboard “keys” as they are “pressed” or “typed upon” by the user. “Keys” perceived to be directly below the user's fingers can be highlighted in the display in one color, whereas “keys” perceived to be actually activated can be highlighted in another color or contrast. This configuration would permit the user to type on the work surface in front of the companion device or perhaps on a virtual keyboard. Preferably as the user types on the work surface or the virtual keyboard, the corresponding text appears in a text field displayed on the companion device. [0016]
  • Thus, various forms of feedback can be used to guide the user in his or her virtual typing. What fingers of the user's hands have “typed” upon what virtual key or virtual key position in what time order is determined by the three-dimensional sensor system. Preferably the three-dimensional sensor system includes a signal processing unit comprising a central processor unit (CPU) and associated read only memory (ROM) and random access memory (RAM). Stored in ROM is a software routine executed by the signal processing unit CPU such that three-dimensional positional information is received and converted substantially in real-time into key-scan data or other format data directly compatible as device input to the companion computer system. Preferably the three-dimensional sensor emits light of a specific wavelength, and detects return energy time-of-flight from various surface regions of the object being scanned, e.g., a user's hands. [0017]
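The time-of-flight principle mentioned above can be sketched as follows. This is an illustrative calculation only, not the patent's implementation: emitted light reflects off a surface point (e.g., a fingertip), and the measured round-trip time yields the distance to that point. The function name is an assumption.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a reflecting surface, given the round-trip
    time-of-flight of an emitted light pulse. The light travels
    out and back, so the one-way distance is half the path."""
    return C * round_trip_seconds / 2.0

# A return pulse detected 2 ns after emission implies roughly 0.3 m
# to the surface, which illustrates why sub-nanosecond timing is
# needed to resolve centimeter-scale finger positions.
d = tof_distance(2e-9)
```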
  • At the start of a typing session, the user will put his or her fingers near or on the work surface or virtual keyboard (if present). Until the user or some other object comes within imaging range of the three-dimensional sensor, the present invention remains in a standby, low power consuming, mode. In standby mode, the repetition rate of emitted optical pulses is slowed to perhaps 1 to 10 pulses per second, to conserve operating power, an important consideration if the invention is battery powered. As such, the invention will emit relatively few pulses but can still acquire image data, albeit having crude or low Z-axis resolution. In alternate methods for three-dimensional capture, methods that reduce the acquisition frame rate and resolution to conserve power may be used. Nonetheless such low resolution information is sufficient to at least alert the present invention to the presence of an object within the imaging field of view. When an object does enter the imaging field of view, a CPU that governs operation of the present invention commands entry into a normal operating mode in which a high pulse rate is employed and system functions are operated at full power. To preserve operating power, when the user's fingers or other potentially relevant object is removed from the imaging field of view, the present invention will power down, returning to the standby power mode. Such powering down preferably also occurs when it is deemed that relevant objects have remained at rest for an extended period of time exceeding a time threshold. [0018]
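The standby/active behavior described above amounts to a small state machine. The sketch below is a minimal illustration under assumed names and thresholds (nothing here is from the patent itself): stay in low-power standby until an object enters the field of view, run at full power while it is present and moving, and drop back to standby when the scene empties or the object rests beyond a time threshold.

```python
from enum import Enum

class Mode(Enum):
    STANDBY = "standby"   # slow pulse rate, coarse Z resolution
    ACTIVE = "active"     # full pulse rate, full power

class PowerManager:
    """Illustrative sketch of the power-mode logic: transitions are
    driven per frame by whether an object is in view and moving."""

    def __init__(self, idle_threshold_frames: int = 300):
        self.mode = Mode.STANDBY
        self.idle_frames = 0
        self.idle_threshold = idle_threshold_frames

    def update(self, object_in_view: bool, object_moving: bool) -> Mode:
        if not object_in_view:
            # Scene is empty: return to (or stay in) standby.
            self.mode = Mode.STANDBY
            self.idle_frames = 0
        elif self.mode is Mode.STANDBY:
            # Object just entered the field of view: wake up.
            self.mode = Mode.ACTIVE
            self.idle_frames = 0
        elif not object_moving:
            # Object present but at rest: count idle frames and
            # power down once the rest period exceeds the threshold.
            self.idle_frames += 1
            if self.idle_frames >= self.idle_threshold:
                self.mode = Mode.STANDBY
        else:
            self.idle_frames = 0
        return self.mode
```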
  • Assume now that the user has put his or her fingers on all of the home row keys (e.g., A, S, D, F, J, K, L, :) of the virtual keyboard (or if no virtual keyboard is present, on a work space in front of the companion device with which the invention is practiced). The present invention, already in full power mode, will now preferably initiate a soft key calibration in which the computer assigns locations to keyboard keys based upon user input. The user's fingers are placed on certain (intended) keys and, based upon the exact location of those fingers, the software assigns locations to the keys on the keyboard. [0019]
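One plausible reading of the soft key calibration step is sketched below: the user rests fingers on the intended home-row keys, and the software translates the nominal key layout so it lines up with where the fingers actually are. The simple mean-offset model and all names here are assumptions for illustration, not the patent's method.

```python
def calibrate(nominal, measured):
    """Return a key layout shifted by the average finger offset.

    nominal  -- dict: key name -> nominal (x, z) center on the template
    measured -- dict: key name -> observed fingertip (x, z), for the
                subset of home-row keys the user's fingers rest on
    """
    # Average displacement between where the fingers are and where
    # the corresponding keys nominally sit.
    dx = sum(measured[k][0] - nominal[k][0] for k in measured) / len(measured)
    dz = sum(measured[k][1] - nominal[k][1] for k in measured) / len(measured)
    # Apply the same shift to every key in the layout.
    return {k: (x + dx, z + dz) for k, (x, z) in nominal.items()}
```

A more elaborate scheme could fit rotation and scale as well, but a pure translation already absorbs the most common misalignment of hands on a passive template.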
  • The three-dimensional sensor system views the user's fingers as the user “types” on the keys shown on the substrate template, or as the user types on a work space in front of the companion device, where “keys” would normally be if a real keyboard were present. The sensor system outputs data to the companion computer system in a format functionally indistinguishable from data output by a conventional input device such as a keyboard, a mouse, etc. Software preferably executable by the signal processing unit CPU (or by the CPU in the companion computer system) processes the incoming three-dimensional information and recognizes the location of the user's hands and fingers in three-dimensional space relative to the image of a keyboard on the substrate or work surface (if no virtual keyboard is present). [0020]
  • Preferably the software routine identifies the contours of the user's fingers in each frame by examining Z-axis discontinuities. When a finger “types” a key, or “types” in a region of a work surface where a key would be if a keyboard (real or virtual) were present, a physical interface between the user's finger and the virtual keyboard or work surface is detected. The software routine examines preferably optically acquired data to locate such an interface boundary in successive frames to compute Y-axis velocity of the finger. (In other embodiments, lower frequency energy such as ultrasound might instead be used.) When such vertical finger motion stops or, depending upon the routine, when the finger makes contact with the substrate, the virtual key being pressed is identified from the (Z, X) coordinates of the finger in question. An appropriate KEYDOWN event command may then be issued. The present invention performs a similar analysis on all fingers (including thumbs) to precisely determine the order in which different keys are contacted (e.g., are “pressed”). In this fashion, the software issues appropriate KEYUP, KEYDOWN, and scan code data commands to the companion computer system. [0021]
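The per-finger keystroke logic described above can be illustrated with the following sketch. The thresholds, names, and the `key_at` lookup are all assumptions: a fingertip's height Y is tracked across frames, its vertical velocity is computed from successive frames, and when the finger stops at the surface, the virtual key is identified from the finger's (X, Z) position so a KEYDOWN can be issued.

```python
CONTACT_Y = 2.0      # mm above the substrate treated as contact (assumed)
REST_VELOCITY = 0.5  # mm/frame; below this the finger has stopped (assumed)

def detect_keydown(prev_y, curr_y, x, z, key_at):
    """Return the key name if this frame completes a keystroke, else None.

    prev_y, curr_y -- fingertip height in the previous and current frame
    x, z           -- fingertip position on the template plane
    key_at         -- callable mapping (x, z) to a key name, e.g. built
                      from a calibrated key layout
    """
    velocity = prev_y - curr_y            # positive while moving downward
    stopped = abs(velocity) < REST_VELOCITY
    if curr_y <= CONTACT_Y and stopped:
        return key_at(x, z)               # caller issues KEYDOWN for this key
    return None
```

Running the same test for every finger, frame by frame, yields the time order in which keys are contacted, from which KEYDOWN/KEYUP events follow.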
  • The software routine preferably recognizes and corrects for errors in a drifting of the user's hands while typing, e.g., a displacement on the virtual keyboard. The software routine further provides some hysteresis to reduce error resulting from a user resting a finger on a virtual key without actually “pressing” the key. The measurement error is further reduced by observing that in a typing application, the frame rate requirement for tracking Z-values is lower than the frame rate requirement for tracking X-values and Y-values. That is, finger movement in the Z-direction is typically slower than finger movements in other axes. The present invention also differentiates between impact time among different competing fingers on the keyboard or other work surface. Preferably such differentiation is accomplished by observing X-axis and Y-axis data values at a sufficiently high frame rate, as it is Y-dimension timing that is to be differentiated. Z-axis observations need not discriminate between different fingers, and hence the frame rate can be governed by the speed with which a single finger can move between different keys in the Z-dimension. Preferably the software routine provided by the invention averages Z-axis acquired data over several frames to reduce noise or jitter. While the effective frame rate for Z-values is decreased relative to the effective frame rate for X-values and for Y-values, accuracy of Z-values is enhanced and a meaningful frame rate of data acquisition is still obtained. [0022]
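The Z-axis averaging described above can be sketched as a simple sliding-window mean, shown below. This is an illustrative sketch, not the patent's routine, and the window size is an assumption: Z readings change slowly during typing, so averaging them over several frames trades effective Z frame rate for reduced noise and jitter.

```python
from collections import deque

def make_z_smoother(window: int = 4):
    """Return a function that smooths successive Z readings by
    averaging over the most recent `window` frames."""
    history = deque(maxlen=window)  # oldest reading is dropped automatically

    def smooth(z: float) -> float:
        history.append(z)
        return sum(history) / len(history)

    return smooth
```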
  • The software routine can permit the user to toggle the companion computer system from, say, alphanumeric data input mode to graphics mode simply by “typing” on certain key combinations, perhaps simultaneously pressing the Control and Shift keys. In graphics mode, the template would emulate a digitizer tablet, and as the user dragged his or her finger across the template, the (Z, X) locus of points being contacted would be used to draw a line, a signature, or other graphic that is input into the companion computer system. [0023]
  • Preferably a display associated with the companion computer system can display alphanumeric or other data input by the user substantially in real-time. In addition to depicting images of keyboard keys and fingers, the companion computer system display can provide a block cursor that shows the alphanumeric character that is about to be entered. An additional form of input feedback is achieved by forming a resilient region under some or all of the keys to provide tactile feedback when a “key” is touched by the user's fingers. If a suitable companion device were employed, the companion device could even be employed to enunciate aloud the names of “typed” keys, letter-by-letter, e.g., enunciating the letters “c”-“a”-“t” as the word “cat” was typed by a user. A simpler form of acoustic feedback is provided by having the companion device emit electronic key-click sounds upon detecting a user's finger depressing a virtual key. [0024]
  • Other features and advantages of the invention will appear from the following description in which the preferred embodiments have been set forth in detail, in conjunction with the accompanying drawings.[0025]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A depicts a three-dimensional sensor system used with a passive substrate keyboard template, according to the present invention; [0026]
  • FIG. 1B depicts a three-dimensional sensor system that may be used without a substrate keyboard template, according to the present invention; [0027]
  • FIG. 1C depicts a companion device display of a virtual keyboard showing a user's finger contacting a virtual key, according to the present invention; [0028]
  • FIG. 1D depicts the display of FIG. 1C, showing additional text entered by the user on a virtual keyboard, according to the present invention; [0029]
  • FIG. 2A depicts a passive substrate in a partially folded disposition, according to the present invention; [0030]
  • FIG. 2B depicts a passive substrate, bearing a different character set, in a partially rolled-up disposition, according to the present invention; [0031]
  • FIG. 3 is a block diagram of an exemplary implementation of a three-dimensional signal processing and sensor system, with which the present invention may be practiced; [0032]
  • FIG. 4 is a block diagram of an exemplary single pixel detector with an associated photon pulse detector and high speed counter as may be used in a three-dimensional sensor system with which the present invention may be practiced; [0033]
  • FIG. 5 depicts contour recognition of a user's fingers, according to the present invention; [0034]
  • FIG. 6 depicts use of staggered key locations in identifying a pressed virtual key, according to the present invention; [0035]
  • FIGS. [0036] 7A-7O depict cluster matrices generated from optically acquired three-dimensional data for use in identifying user finger location, according to the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • FIG. 1A depicts a three-dimensional sensor system [0037] 10 comprising a three-dimensional sensor 20 focused essentially edge-on towards the fingers 30 of a user's hands 40, as the fingers “type” on a substrate 50, shown here atop a desk or other work surface 60. Substrate 50 preferably bears a printed or projected template 70 comprising lines or indicia representing a data input device, for example a keyboard. As such, template 70 may have printed images of keyboard keys, as shown, but it is understood the keys are electronically passive, and are merely representations of real keys. Substrate 50 is defined as lying in a Z-X plane in which various points along the X-axis relate to left-to-right column locations of keys, various points along the Z-axis relate to front-to-back row positions of keys, and Y-axis positions relate to vertical distances above the Z-X plane. It is understood that (X,Y,Z) locations are a continuum of vector positional points, and that the various axis positions are definable at substantially more points than the few indicated in FIG. 1A.
  • If desired, template [0038] 70 may simply contain row lines and column lines demarking where keys would be present. Substrate 50 with template 70 printed or otherwise appearing thereon is a virtual input device that in the example shown emulates a keyboard. As such substrate 50 and/or template 70 may be referred to herein as a virtual keyboard or virtual device for inputting digital data and/or commands. An advantage of such a virtual input device is that it may be printed on paper or flexible plastic and folded as shown in FIG. 2A, or rolled-up (or folded and rolled-up) as shown in FIG. 2B. It is understood that the arrangement of keys need not be in a rectangular matrix as shown for ease of illustration in several of the figures, but may be laid out in staggered or offset positions as in a real QWERTY keyboard. FIG. 2B also shows the device with an alternate keyset printed as template 70, here Cyrillic alphabet characters. If desired, one keyset could be printed on one side of the template, and a second keyset on the other, e.g., English and Russian characters.
  • As described with respect to FIGS. [0039] 1B-1D, an image of a virtual keyboard may alternatively be displayed on the screen associated with the companion device. In this embodiment, the substrate and even the work surface can be dispensed with, permitting the user to “type” in thin air, if desired. This embodiment is especially flexible in permitting on-the-fly changes in the “keyboard” being used, e.g., presenting an English language keyboard, a German language keyboard, or a Russian language keyboard, or emulating a digitizer sheet, etc. The various keyboards and keysets are simply displayed on screen 90, associated with companion device or appliance 80. Understandably, great flexibility is achieved by presenting alternative key sets as displayed images of virtual keys bearing the various character sets on the display of the companion device with which the present invention is used. Thus, in FIG. 1B, the virtual keyboard has been eliminated as a guide, further promoting portability and flexibility.
  • In the various embodiments, data (and/or commands) to be input by a user from a virtual keyboard [0040] 50 (as shown in FIG. 1A), or from a work surface 60 devoid of even a virtual keyboard (as shown in FIG. 1B), will be coupled to a companion computer or other system 80. Without limitation, the companion computer system or computer-like system may be a PDA, a wireless telephone, a laptop PC, a pen-based computer, or indeed any other electronic system to which it is desired to input data. If a virtual keyboard is used, it preferably may be folded or rolled when not in use. The folded or rolled size may be made sufficiently small to be stored with the PDA or other companion computer system 80, with which it will be used to input data and commands. For example, when folded a keyboard may measure perhaps 2.5″×3″, and preferably will be smaller than say 8″×8″. A virtual keyboard for a PDA might have a folded form factor sized to fit within a pocket at the rear of the PDA. However, when in use, the virtual keyboard is unfolded or unrolled to become an essentially full-sized albeit virtual keyboard.
  • As the user inputs data into companion system [0041] 80, the display 90 that typically is present on system 80 can display in real-time the data being input 100 from the virtual keyboard, for example, text that might be input to a PDA, e-mail that might be input to a wireless telephone, etc. In one embodiment, a block cursor 102 surrounds a display of the individual alphanumeric character that the invention perceives is about to be typed, the letter “d” in FIG. 1A, for example. This visual feedback feature can help a user confirm accuracy of data entry and perhaps provide guidance in repositioning the user's fingers to ensure the desired character will be typed. Acoustic feedback such as “key clicks” can be emitted by system 80 as each virtual key is pressed to provide further feedback to the user. If desired, passive bumps 107 may be formed in the virtual keyboard to give the user tactile feedback. By way of example, such bumps may be hemispheres formed under each “key” in a virtual keyboard fabricated from a resilient plastic, for example.
  • As noted, visual feedback may also, or instead, be provided by displaying an image of the virtual keyboard (be it a substrate or an empty work surface in front of the companion device) on the screen of the companion device. As the user types, he or she is guided by an image of a keyboard showing the user's fingers as they move relative to the virtual keyboard. This image can include highlighting the keys directly under the user's fingers, and if a key is actually pressed, such key can be highlighted in a different color or contrast. If desired, the screen of the companion device can be “split” such that actual alphanumeric characters appear on the top portion of the screen as they are “typed”, and an image of virtual keys with the user's fingers superimposed appears on the bottom portion of the screen (or vice versa). [0042]
  • In FIG. 1A and FIG. 1B, the companion system [0043] 80 is shown mounted in a cradle 110, to which the three-dimensional sensor 20 may be permanently attached. Alternatively, sensor 20 could be permanently mounted within a preferably lower portion of companion device 80. Output from sensor 20 is coupled via path 120 to a data input port 130 on companion device 80. If a cradle or the like is used, insertion of device 80 into cradle 110 may be used to automatically make the connection between the output of sensor 20 and the input to device 80.
  • As described herein, the configuration of FIG. 1B advantageously permits a user to input data (e.g., text, graphics, commands) to companion device [0044] 80, even without a printed virtual keyboard, such as was shown in FIG. 1A. For ease of understanding, grid lines along the X-axis and Y-axis are shown on a work surface region 60 in front of the companion device 80. Various software mapping techniques, described herein, permit the present invention to discern what virtual keys (if keys were present) the user's fingers intended to strike. Whereas the embodiment of FIG. 1A allowed tactile feedback from a virtual keyboard, the embodiment of FIG. 1B does not. Accordingly it is preferred that screen 90 of device 80 display imagery to assist the user in typing. Of course, as in the embodiment of FIG. 1A, device 80 may emit acoustic key click sounds as the user's fingers press against surface 60 while “typing”.
  • FIG. 1C depicts one sort of visual assistance available from an appropriate device [0045] 80, which assistance may of course be used with the embodiment of FIG. 1A. In FIG. 1C, screen 90 displays at least part of an image of a keyboard 115 and an outline or other representation 40′ of the user's hands, showing hand and finger location relative to where keys would be on an actual or a virtual keyboard. For ease of illustration, FIG. 1C depicts only the location of the user's left hand. As a key is “touched” or the user's finger is sufficiently close to “touching” a key (e.g., location on surface 60 at which such key would be present if a keyboard were present), device 80 can highlight the image of that key (e.g., display the relevant “softkey”), and as the key is “pressed” or “typed upon”, device 80 can highlight the key using a different color or contrast. For example in FIG. 1C, the “Y” key is shown highlighted or contrasted, which can indicate it is being touched or is about to be touched, or it is being pressed by the user's left forefinger. As shown in FIG. 1D, a split screen display can be provided by device 80 in which part of the screen depicts imagery to guide the user's finger placement on a non-existent keyboard, whereas another part of the screen shows data or commands 100 input by the user to device 80. Although FIG. 1D shows text that corresponds to what is being typed, e.g., the letter “Y” in the word “key” is highlighted as spelling of the word “key” on screen 90 is completed, data 100 could instead be a graphic. For example, the user can command device 80 to enter a graphics mode whereupon finger movement across surface 60 (or across a virtual keyboard 70) will produce a graphic, for example, the user's signature “written” with a forefinger or a stylus on surface 60. Collectively, user finger(s) or a stylus may be referred to as a “user digit”.
  • Optionally software associated with the invention (e.g., software [0046] 285 in FIG. 3) can use word context to help reduce “typing” error. Assume the vocabulary of the text in a language being input is known in advance, English for example. Memory in the companion device will store a dictionary containing most frequently used words in the language and as the user “types” a word on a virtual keyboard or indeed in thin air, the companion device software will match letters thus far typed with candidate words from the dictionary. For instance, if the user enters “S”, all words starting with letter “S” are candidates; if the user enters “SU”, all words starting with “SU” are candidates. If the user types “SZ” then, at least in English, there will be no matching candidate word(s). As the user types more letters, the set of candidate words that can match the word being typed reduces to a manageable size. At some threshold point, for instance when the size of the candidate words reduces to 5-10 words, the software can assign a probability to the next letter to be typed by the user. For instance, if the user has entered “SUBJ”, there is a higher probability that the next letter is the letter “E”, rather than say the letter “W”. But since letters “E” and “W” are neighbors on a real or virtual keyboard, it is possible that the user might press the region near the key for the letter “W”. In this example, companion device software can be used to correct the key entry and to assume that the user meant to enter the letter “E”.
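The dictionary-based correction described above can be sketched as follows. The word list, the candidate-count threshold, and the partial table of adjacent QWERTY keys are illustrative assumptions, not data from the specification.

```python
# Sketch of the prefix-based candidate-word correction described above.
# DICTIONARY, CANDIDATE_THRESHOLD, and NEIGHBORS are hypothetical values.

DICTIONARY = ["SUBJECT", "SUBJECTS", "SUBJECTIVE", "SUNDAY", "SUPPER", "SWIM"]
CANDIDATE_THRESHOLD = 10                  # start predicting once few words remain
NEIGHBORS = {"W": "QESAD", "E": "WRSDF"}  # adjacent QWERTY keys (partial table)

def candidates(prefix):
    """All dictionary words that could complete the letters typed so far."""
    return [w for w in DICTIONARY if w.startswith(prefix)]

def correct_next_letter(prefix, detected):
    """If the detected key matches no candidate word but a neighboring key
    does (e.g. 'SUBJ' followed by 'W'), assume the user meant the neighbor."""
    cands = candidates(prefix)
    if not cands or len(cands) > CANDIDATE_THRESHOLD:
        return detected                   # too early to predict; accept as typed
    likely = {w[len(prefix)] for w in cands if len(w) > len(prefix)}
    if detected in likely:
        return detected                   # detected key is already plausible
    for n in NEIGHBORS.get(detected, ""):
        if n in likely:
            return n                      # substitute the likely adjacent key
    return detected
```

With this sketch, a press detected near “W” after the prefix “SUBJ” is corrected to “E”, matching the example in the text.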
  • Turning now to operation of three-dimensional sensor [0047] 20, the sensor emits radiation of a known frequency and detects energy returned by surfaces of objects within the optical field of view. Emitted radiation is shown in FIGS. 1A and 1B as rays 140. Sensor 20 is aimed along the Z-axis to determine which of the user's finger tips 30 touch what portions of template 70, e.g., touch which virtual keys, in what time order. As shown in FIG. 1B, even if template 70 were absent and the user simply typed on the work space in front of device 80, sensor 20 would still function to output meaningful data. In such an embodiment, screen 90 of companion device 80 could display an image 100′ of a keyboard 105 in which “pressed” or underlying “keys” are highlighted, such as key 107, for the letter “T”.
  • As shown in FIGS. 1A and 1B, if desired a light or other projector [0048] 145 that emits visual light beams 147 could be used to project an image of a virtual keyboard to guide the user in typing. For example, a source of visible light (perhaps laser light in a visible wavelength) may be used with diffraction type lenses to project an image to guide the user in typing. In such embodiments, the image of a keyboard, perhaps rendered in a common graphics file format (e.g., GIF) is used to “etch” a diffractive pattern on the lens. Although portions of the projected image would at times fall on the surface of the user's fingers, nonetheless in the absence of a substrate to type upon, such a projected guide can be useful. The use of diffractive optics including such optics as are commercially available from MEMS Optical, LLC of Huntsville, Ala. 35806 may find application in implementing such a projection embodiment.
  • FIG. 3 is a block diagram depicting an exemplary three-dimensional image sensor system [0049] 200 that preferably is fabricated on a single CMOS IC 210. System 200 may be disposed in the same housing as three-dimensional sensor 20, and is used to implement the present invention. As described in greater detail in co-pending U.S. application Ser. No. 09/401,059, incorporated herein by reference, such a system advantageously requires no moving parts and relatively few off-chip components, primarily a light emitting diode (LED) or laser source 220 and an associated optical focusing system; if suitable shielding were provided, one might bond laser source 220 onto the common substrate upon which IC 210 is fabricated. It is to be understood that while the present invention is described with respect to a three-dimensional sensor 20 as disclosed in the above-referenced co-pending U.S. utility patent application, the invention may be practiced with other three-dimensional sensors.
  • System [0050] 200 includes an array 230 of pixel detectors 240, each of which has dedicated circuitry 250 for processing detection charge output by the associated detector. In a virtual keyboard recognition application, array 230 might include 15×100 pixels and a corresponding 15×100 array of processing circuits 250. Note that the array size is substantially less than that required by prior art two-dimensional video systems such as described by Korth. Whereas Korth requires a 4:3 aspect ratio, or perhaps in some cases 2:1, the present invention obtains and processes data using an aspect ratio substantially less than 3:1, and preferably about 2:15 or even 1:15. Referring to FIGS. 1A and 1B, it is appreciated that while a relatively large X-axis range must be encompassed, the edge-on disposition of sensor 20 relative to substrate 50 means that only a relatively small Y-axis distance need be encompassed.
  • During user typing, a high frame rate is required to distinguish between the user's various fingers along a row of virtual keys. However, the back and forth movement of a given typing finger is less rapid in practice. Accordingly the rate of acquisition of Z-axis data may be less than that of X-axis and Y-axis data, for example 10 frames/second for Z-axis data, and 30 frames/second for X-axis and Y-axis data. [0051]
  • A practical advantage of a decreased Z-axis frame rate is that less electrical current is required by the present invention in obtaining keyboard finger position information. Indeed, in signal processing acquired information, the present invention can average Z-axis information over frames, for example examining one-third of the frames for Z-axis position information. Acquired Z-axis values will have noise or jitter that can be reduced by averaging. For example, Z-values may be averaged over three successive frames acquired at thirty frames/second, such that three consecutive image frames will share the same processed Z-values. While the effective frame rate for Z-values is lowered to one-third the acquisition rate for X-axis and Y-axis data acquisition, accuracy of the Z data is improved by averaging out the noise or jitter. The resultant decreased Z-axis frame rate is still sufficiently rapid to acquire meaningful information. This use of different frame rates for X-values and Y-values, versus Z-values, is useful to the present invention. For example, a reduced acquisition rate of Z-axis data relative to X-axis and Y-axis data minimizes electrical current drain, and avoids taxing the signal processor (CPU [0052] 260) with redundant signal processing.
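A minimal sketch of this mixed-rate scheme, assuming 30 frames/second acquisition with Z averaged in groups of three consecutive frames (so the effective Z rate is 10 frames/second):

```python
# Sketch of the reduced Z frame rate with three-frame averaging described
# above: X and Y are taken from every frame, while Z is averaged over each
# group of three consecutive frames, so three consecutive output frames
# share one (less noisy) Z-value.

def merge_frames(frames):
    """frames: list of (x, y, z) tuples, one per 1/30 s acquisition.
    Returns (x, y, z_avg) tuples at the full rate, with z averaged per
    group of three frames (effective Z rate: one-third the X/Y rate)."""
    out = []
    for i in range(0, len(frames) - len(frames) % 3, 3):
        group = frames[i:i + 3]
        z_avg = sum(f[2] for f in group) / 3.0   # noise/jitter averaged out
        out.extend((x, y, z_avg) for (x, y, _) in group)
    return out
```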
  • Thus, the present invention acquires three-dimensional image data without requiring ambient light, whereas prior art Korth-like systems acquire two-dimensional luminosity data in the presence of ambient light. In essence, the present invention can sense three-dimensionally objects, e.g., fingers and substrate, analogously to a human's feeling an object by touching. Advantageously, this can be accomplished using relatively small operating power, e.g., perhaps 3.3 VDC at 10 mW, which permits the present invention to be battery operated and fabricated in a relatively small and mobile form factor. [0053]
  • Multiple frames per second of three-dimensional image data of the user's hands and fingers and the substrate are available from array [0054] 230. Using this data the present invention constructs a three-dimensional image of the hands and fingers relative to the substrate, or if the substrate is absent, relative to where virtual keys would be if a keyboard were on the work surface in front of the companion device 80. Exemplary techniques for doing so are described in applicant Bamji's earlier referenced co-pending U.S. patent application. Constructing such a three-dimensional image from time-of-flight data is superior to prior art methods that attempt to guess at spatial relationships using two-dimensional luminosity based data, e.g., as suggested by Korth. It should be noted that time of flight methods may include return pulse time measurement, phase or frequency detection, or a high speed shutter method, as described in the Bamji patent application. Other methods that do not rely on time-of-flight can capture three-dimensional data, including stereo imagery, and luminosity-based techniques that discern depth from reflective intensity.
  • In practice, array [0055] 230 can acquire and generate data at 30 frames/second, a frame rate sufficient to process virtual typing of 5 characters/second, which is about 60 words/minute. If array 230 is rectangular, e.g., comprising a number n of X-axis pixels and a number m of Y-axis pixels, then with n=100 and m=15 a grid comprising 1,500 pixels is formed. For each frame of data, each pixel in array 230 will have a value representing the vector distance from sensor 20 to the surface of the object (e.g., a portion of a user's finger, a portion of the substrate, etc.) captured by that pixel, e.g., a vector or Z-value. This data is far more useful than Korth's luminosity-based image data, which at best provided video frames with RGB grey or color scale values for determining, in two dimensions, the contour of a user's fingers and their location on a virtual keyboard.
  • Use of acquired three-dimensional data permits software [0056] 285 to determine the actual shape of the user's fingers (nominally assumed to be somewhat cylindrical), and thus relative finger position with respect to other fingers, to location over or on the substrate, and relative to three-dimensional sensor 20. In FIG. 1A, for example, as a finger is sensed to be moving to a Y=0 position, it can be determined that the finger is probably preparing to type a virtual key. If that finger is also sensed to be approaching the Z=Z1 region, then that finger is probably prepared to type a virtual key in the first row of keys on the virtual keyboard. Determination of whether a virtual key is about to be pressed also takes into account velocity data. For example, a user finger detected to be moving rapidly downward toward Y=0 is probably getting ready to strike a virtual key.
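The position-plus-velocity test described above might be sketched as follows; the height and velocity thresholds and the first-row boundary Z1 are hypothetical values chosen only for illustration.

```python
# Illustrative check for "finger probably about to strike a key": the finger
# is near the surface (Y approaching 0) and moving rapidly downward, and its
# Z position indicates which key row it is approaching. All constants here
# are hypothetical, not values from the specification.

FRAME_DT = 1.0 / 30.0     # seconds per frame at 30 frames/second
Y_NEAR = 8.0              # mm: considered close to the work surface
VY_FAST = -50.0           # mm/s: rapid downward motion threshold
Z1 = 60.0                 # mm: far edge of the first virtual key row

def about_to_strike(y_prev, y_now, z_now):
    """Given Y heights from two successive frames and the current Z position,
    return (probably_striking, approaching_first_row)."""
    vy = (y_now - y_prev) / FRAME_DT            # Y velocity from two frames
    probably_striking = y_now < Y_NEAR and vy < VY_FAST
    approaching_first_row = z_now < Z1
    return probably_striking, approaching_first_row
```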
  • In FIG. 3, IC [0057] 210 will also include a microprocessor or microcontroller unit 260 (denoted CPU), random access memory 270 (RAM) and read-only memory 280 (ROM), a portion of which ROM preferably holds a software routine 285 executable by the CPU to implement the present invention. Controller unit 260 preferably is a 16-bit RISC microprocessor operating at perhaps 50 MHz. Among other functions, CPU 260 performs vector distance to object and object velocity calculations, where the object is the substrate and user's hands. IC 210 further includes a high speed distributable clock 290, and various computing, optical drive input/output (I/O) circuitry 300, and interface data/command input/output (I/O) circuitry 310. Digital keyboard scan type data or digitizer tablet/mouse type data is output from I/O 310, for example from COM and/or USB type ports associated with system 200.
  • Preferably the two-dimensional array [0058] 230 of pixel sensing detectors is fabricated using standard commercial silicon technology, which advantageously permits fabricating circuits 250, 260, 270, 280, 290, and 300 on the same IC 210. Understandably, the ability to fabricate such circuits on the same IC with the array of pixel detectors can shorten processing and delay times, due to shorter signal paths.
  • Each pixel detector may be represented as a parallel combination of a current source, an ideal diode, and shunt impedance and noise current source. Each pixel detector will output a current proportional to the amount of incoming photon light energy falling upon it. Preferably CMOS fabrication is used to implement the array of CMOS pixel diodes or photogate detector devices. For example photodiodes may be fabricated using a diffusion-to-well, or a well-to-substrate junction. Well-to-substrate photodiodes are more sensitive to infrared (IR) light, exhibit less capacitance, and are thus preferred. [0059]
  • As shown in FIGS. 3 and 4, a circuit [0060] 250 is associated with each pixel detector 240. Each circuit 250 preferably includes a pulse peak detector 310, a high speed counter 320, and has access to the high speed clock 290. Preferably formed on IC 210, high speed clock 290 outputs a continuous train of high frequency clock pulses, preferably at a fixed frequency of perhaps 500 MHz, preferably with a low duty cycle as the pulses are output. Of course, other high speed clock parameters could instead be used. This pulse train is coupled to the input port of each high speed interpolating counter 320. Counter 320 preferably can sub-count, as described in the Bamji pending patent application, and can resolve times on the order of 70 ps. Preferably each counter 320 also has a port to receive a START signal (e.g., start counting now), a port to receive a STOP signal (e.g., stop counting now), and a port to receive a CONTROL signal (e.g., reset accumulated count now). The CONTROL and START signals are available from controller 260, the CLOCK signal is available from clock unit 290, and the STOP signal is available from pulse peak detector 310.
  • Virtual keyboard [0061] 50 will be placed perhaps 20 cm distant from three-dimensional sensor 20, substantially in the same plane as the sensor lens. Since a typical sensor lens angle is perhaps 60°, a 20 cm distance ensures optical coverage of the virtual keyboard. In FIG. 3, for ease of illustration the distance between sensor 20 light emissions and collected light has been exaggerated.
  • In overview, system [0062] 200 operates as follows. At time t0, microprocessor 260 commands light source 220 to emit a pulse of light of known wavelength, which passes through focus lens 288′ and travels at the speed of light (C), 300,000 km/sec, toward objects of interest, e.g., substrate 50 and the user's fingers 30. If light source 220 is sufficiently powerful, lens 288′ may be dispensed with. At the surface of the object being imaged, at least some of the light may be reflected back toward system 200 to be sensed by the detector array. In FIG. 3, the objects of interest are the fingers 30 of a user's hand, and, if present, substrate 50, which as noted may include viewable indicia such as keyboard keys 70 or perhaps projected grid lines, to guide the user in finger placement while “typing”.
  • As was indicated by FIG. 1A, the position of virtual keys [0063] 70 (or other user available indicia) on substrate 50 is known in two dimensions on the X-Z plane relative to the position of other such keys on the substrate. As the user's fingers move back and forth over substrate 50, touching virtual keys 70 while “typing”, it is a function of CPU 260 and software routine 285 to examine return optical energy to identify which, if any, virtual keys are being touched by the user's fingers at what times. Once this information is obtained, appropriate KEYUP, KEYDOWN, and key scan code or other output signals may be provided to input port 130 of the companion device 80, just as though the data or commands being provided were generated by an actual keyboard or other input device.
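The mapping from a sensed contact point (X, Z) on the substrate to a key, and from there to KEYUP/KEYDOWN-style events, can be sketched as below. The key pitch, row stagger, and row layout are illustrative assumptions; an actual template could differ.

```python
# Sketch of mapping a detected fingertip contact at (x, z) on the work
# surface to a key and a scan-style event. The geometry and layout here are
# hypothetical; rows are staggered as on a real QWERTY keyboard.

KEY_PITCH = 19.0    # mm between key centers (typical full-size keyboard)
ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
ROW_STAGGER = [0.0, 0.25 * KEY_PITCH, 0.75 * KEY_PITCH]  # X offset per row

def key_at(x, z):
    """Return the key character under surface point (x, z) in mm, or None."""
    row = int(z // KEY_PITCH)                     # Z selects the key row
    if not 0 <= row < len(ROWS):
        return None
    col = int((x - ROW_STAGGER[row]) // KEY_PITCH)  # X selects the column
    if not 0 <= col < len(ROWS[row]):
        return None
    return ROWS[row][col]

def key_event(key, pressed):
    """Emit a KEYDOWN/KEYUP-style event, as a real keyboard driver would."""
    return ("KEYDOWN" if pressed else "KEYUP", key)
```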
  • At or before time t0, each pixel counter [0064] 320 in array 230 receives a CONTROL signal from controller 260, which resets any count previously held in the counter. At time t0, controller 260 issues a START command to each counter, whereupon each counter begins to count and accumulate CLOCK pulses from clock 290. During the roundtrip time of flight (TOF) of a light pulse, each counter accumulates CLOCK pulses, with a larger number of accumulated clock pulses representing longer TOF, which is to say, greater distance between a light reflecting point on the imaged object and system 200.
  • The fundamental nature of focus lens [0065] 288 associated with system 200 is such that reflected light from a point on the surface of the imaged object will only fall upon the pixel in the array focused upon such point. Thus, at time t1, photon light energy reflected from the closest point on the surface of the imaged object will pass through a lens/filter 288 and will fall upon the pixel detector 240 in array 230 focused upon that point. A filter associated with lens 288 ensures that only incoming light having the wavelength emitted by light source 220 falls upon the detector array unattenuated.
  • Assume that one particular pixel detector [0066] 240 within array 230 is focused upon a nearest surface point on the tip 30 of the nearest user's finger. The associated pulse peak detector 310 will detect the voltage that is output by the pixel detector in response to the incoming photon energy from such object point. Preferably pulse detector 310 is implemented as an amplifying peak detector that senses a small but rapid change in pixel output current or voltage. When the rapidly changing output voltage is sufficiently large to be detected, logic within detector 310 (e.g., an SR flipflop) toggles to latch the output pulse, which is provided as the STOP signal to the associated counter 320. Thus, the number of counts accumulated within the associated counter 320 will be indicative of roundtrip TOF to the near portion of the fingertip in question, a calculable distance Z1 away.
  • Distance Z1 may be determined from the following relationship in which C is the velocity of light: [0067]
  • Z1=C·(t1−t0)/2
  • At some later time t2 photon energy will arrive at lens [0068] 288 from a somewhat more distant portion of the user's fingertip, 30, and will fall upon array 230 and be detected by another pixel detector. Hitherto the counter associated with this other detector has continued to count CLOCK pulses starting from time t0, as indeed have all counters except for the counter that stopped counting at time t1. At time t2, the pulse detector associated with the pixel just now receiving and detecting incoming photon energy will issue a STOP command to the associated counter. The accumulated count in this counter will reflect roundtrip TOF to the intermediate point on the fingertip, a distance Z2 away. Within IC 210, controller 260 executing software routine 285 stored in memory 280 can calculate distance associated with the TOF data for each light reflecting point on the object surface. Velocity can be calculated by examining successive frames of acquired data.
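The arithmetic implied by the relationship above and by the 500 MHz sub-counting clock can be checked with a short sketch. One whole clock count corresponds to 2 ns of round-trip time, or roughly 30 cm of one-way distance, so it is the ~70 ps sub-count resolution that brings the one-way distance resolution down to roughly 1 cm.

```python
# Worked sketch of converting an accumulated clock count to distance using
# Z = C * (t1 - t0) / 2, with the 500 MHz counter clock described above.

C = 299_792_458.0        # speed of light, m/s
F_CLK = 500e6            # counter clock frequency, Hz

def distance_from_count(count):
    """count: accumulated clock periods (may be fractional with sub-counting).
    Returns the one-way distance to the reflecting point, in meters."""
    tof = count / F_CLK          # round-trip time of flight, seconds
    return C * tof / 2.0

def distance_resolution(dt):
    """One-way distance corresponding to a time resolution dt (seconds)."""
    return C * dt / 2.0
```

For example, `distance_from_count(0.667)` returns roughly 0.2 m, consistent with a virtual keyboard placed about 20 cm from the sensor, and `distance_resolution(70e-12)` returns roughly 0.01 m.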
  • In similar fashion, at time t3 yet another pixel detector in the array will detect sufficient just-arriving photon energy for its associated pulse detector [0069] 310 to issue a STOP command to the associated counter. The accumulated count in this counter represents TOF data for a still farther distance Z3 to the imaged object. Although for ease of illustration FIG. 3 shows but three emitted light rays and light reflections, all falling near one fingertip, in practice substantially all of the substrate and the user's fingers and thumbs will be subjected to illumination from light source 220, and will reflect at least some energy into lens 288 associated with three-dimensional sensor 20.
  • Some pixels in the array may of course not receive sufficient reflected light from the object point upon which they are focused. Thus, after a predetermined amount of time (that may be programmed into controller [0070] 260), the counter associated with each pixel in the sensor array will have been stopped due to pulse detection (or will be assumed to hold a count corresponding to a target at distance Z=infinity).
  • As noted, in the present application it suffices if system [0071] 200 can accurately image objects within a range of perhaps 20 cm to 30 cm, e.g., about 20 cm plus the distance separating the top and the bottom “row” of virtual keys on substrate 50. With each detected reflected light pulse, the counter-calculated TOF distance value for each pixel in the array is determined and preferably stored in a frame buffer in RAM associated with unit 270. Preferably microprocessor 260 examines consecutive frames stored in RAM to identify objects and object location in the field of view. Microprocessor 260 can then compute object, e.g., finger movement velocity. In addition to calculating distance and velocity, the microprocessor and associated on-chip circuitry preferably are programmed to recognize the outline or contours of the user's fingers, and to distinguish the finger surfaces from the substrate surface. Once the finger contours are identified, system 200 can output via a COM or USB or other port relevant digital data and commands to the companion computer system.
  • The above example described how three pixel detectors receiving photon energies at three separate times t1, t2, t3 turn off associated counters whose accumulated counts could be used to calculate distances Z1, Z2, Z3 to finger surfaces and the substrate in the field of view. In practice, the present invention will process not three but thousands or tens of thousands of such calculations for each light pulse, depending upon the size of the array. Such processing can occur on IC chip [0072] 210, for example using microprocessor 260 to execute routine 285 stored (or storable) in ROM 280. Each of the pixel detectors in the array has a unique position location on the detection array, and the count output from the high speed counter associated with each pixel detector can be uniquely identified. Thus, TOF data gathered by two-dimensional detection array 230 may be signal processed to provide accurate distances to three-dimensional object surfaces, such as a user's fingers and a substrate. It will be appreciated that output from CMOS-compatible detectors 240 may be accessed in a random manner if desired, which permits outputting TOF data in any order.
  • Light source [0073] 220 is preferably an LED or a laser that emits energy with a wavelength of perhaps 800 nm, although other wavelengths could instead be used. Below 800 nm, emitted light starts to become visible and laser efficiency is reduced. Above 900 nm CMOS sensor efficiency drops off rapidly, and in any event 1100 nm is the upper wavelength limit for a device fabricated on a silicon substrate, such as IC 210. As noted, by emitting light pulses having a specific wavelength, and by filtering out incoming light of a different wavelength, system 200 is operable with or without ambient light. If substrate 50 contained, for example, raised ridges defining the outlines of virtual keys, a user could literally type in the dark and system 200 would still function properly. This ability to function without dependence upon ambient light is in stark contrast to prior art schemes such as described by Korth. As noted, even for users who are not accomplished touch typists, the present invention may be used in the dark by providing an image of a virtual keyboard on the display of companion device 80.
  • As noted, lens [0074] 288 preferably focuses filtered incoming light energy onto sensor array 230 such that each pixel in the array receives light from only one particular point (e.g., an object surface point) in the field of view. The properties of light wave propagation allow an ordinary lens 288 to be used to focus the light onto the sensor array. If a lens is required to focus the emitted light, a single lens could be used for 288, 288′ if a mirror-type arrangement were used.
  • In practical applications, sensor array [0075] 230 preferably has sufficient resolution to differentiate target distances on the order of about 1 cm, which implies each pixel must be able to resolve time differences on the order of about 70 ps (e.g., the round-trip travel time over 1 cm of target distance, or 2 cm/C). In terms of a CMOS-implemented system specification, high speed counters 320 must be able to resolve time to within about 70 ps, and peak pulse detectors 310 must be low-noise high speed units also able to resolve about 70 ps (after averaging about 100 samples) with a detection sensitivity on the order of perhaps a few hundred microvolts (μV). Accurate distance measurements will require that the pulse detector response time be removed from the total elapsed time. Finally, the CLOCK signal output by circuit 280 should have a period on the order of about 2 ns.
  • As noted above, each interpolating counter [0076] 320 preferably can resolve distances on the order of 1 cm, which implies resolving time to the order of about 70 ps. Using a 10-bit counter with an effective 70 ps cycle time would yield a maximum system detection distance of about 10 m (e.g., 1,024 cm). Implementing an ordinary 10-bit counter would typically require a worst case path of perhaps 40 gates, each of which would require typically 200 ps, for a total propagation time of perhaps about 8 ns. This in turn would limit the fastest system clock cycle time to about 10 ns. Using carry look-ahead hardware might, at a cost, reduce counter propagation time, but nonetheless a 2 ns system cycle time would be quite difficult to implement.
  • To achieve the required cycle time, a so-called pseudo random sequence counter (PRSC), sometimes termed a linear shift register (LSR), may be used. Details for implementing high speed counters including PRSC units may be found in applicant's earlier-referenced co-pending utility patent application. [0077]
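A PRSC avoids the ripple-carry chain of a binary counter: each clock performs a single XOR-and-shift, so the per-cycle logic depth is one gate level rather than forty. A toy 10-bit software model follows; the feedback taps chosen here (polynomial x^10 + x^7 + 1) are one maximal-length option and are not necessarily those of the referenced co-pending application:

```python
def prsc_step(state):
    """One clock of a 10-bit pseudo random sequence counter (a Fibonacci
    linear feedback shift register, polynomial x^10 + x^7 + 1).
    Only a single XOR and a shift occur per cycle -- no carry chain."""
    feedback = ((state >> 9) ^ (state >> 6)) & 1
    return ((state << 1) | feedback) & 0x3FF

def prsc_period(seed=1):
    """Count the clocks until the state sequence repeats."""
    state = prsc_step(seed)
    n = 1
    while state != seed:
        state = prsc_step(state)
        n += 1
    return n
```

Because the polynomial is maximal-length, the counter cycles through all 1,023 nonzero states, matching the roughly 10 m span of a 10-bit count at about 1 cm per count; the pseudo random count order is decoded back to a binary elapsed-time value after capture.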
  • Considerations involved in recognizing contour of the user's fingers within the optical field of view will now be described with reference to FIG. 5, which depicts a cross-section of two of the user's fingers. The + symbols show sub-frame (intra-frame) samples of vector distance values for each pixel sensor in array [0078] 210 imaging the fingers. Inherent noise associated with the pixel sensors produces varying vector distances to the same point of the imaged finger object in each acquired sample. To reduce noise and improve signal/noise, the sensor averages out measurements for each pixel to produce average values for the frame, shown by the ◯ symbol in FIG. 5. The □ symbol in FIG. 5 represents the corrected average when a template, or set of stored exemplary finger-shaped cross-sections, is used by routine 285 to interpret the average values. This method enhances distance measurement accuracy, and reduces ambiguity in recognizing the user's fingers.
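The intra-frame averaging step (the ◯ values in FIG. 5) amounts to a per-pixel mean over the sub-frame captures; a minimal sketch, with a list of lists standing in for the pixel array:

```python
def average_frame(captures):
    """captures: one list of per-pixel distance samples per sub-frame
    capture, all lists the same length. Returns the per-pixel mean,
    i.e., the averaged frame values."""
    n = len(captures)
    return [sum(samples) / n for samples in zip(*captures)]

# Three noisy captures of a two-pixel scene:
avg = average_frame([[10.2, 20.1], [9.8, 19.9], [10.0, 20.0]])
```

For uncorrelated noise, averaging N captures improves signal/noise by about sqrt(N), so averaging on the order of 100 samples (as mentioned above) yields roughly a tenfold improvement.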
  • Data capture noise can affect the minimum frame rate needed to recognize the user's fingers and determine finger motion and velocity. In TOF-based imagery, as used in the present invention, pixel-level noise manifests itself as variations in distance values for a given pixel, from one frame to another frame, even if the imaged object remains stationary. [0079]
  • For ease of illustration, the keyboard images depicted in FIGS. 1A and 2A, [0080] 2B were drawn as a matrix, e.g., uniform rows and columns. But in practice, as shown partially in FIG. 6, standard QWERTY-type keyboards (and indeed keyboards with other key configurations) are laid out in an offset or staggered configuration. The present invention advantageously reduces the requirement for Z-axis resolution by taking into account the staggering of actual keyboard layouts. Thus, the second row from the top of a keyboard is shifted slightly to the right, the third row (from the top) is shifted further to the right, and so on. This staggering places the keys in each row at an offset position with respect to the keys in the adjacent row. By way of example, note the keyboard letter “G” in FIG. 6. Dotted rectangle 400 indicates the allowable latitude given a user in striking the letter “G”, e.g., any virtual contact within the rectangle area will unambiguously be interpreted as user finger contact on the letter “G”. The height of this rectangle, denoted by ΔZ, is the maximum error margin allowed in detecting a Z-axis coordinate. Note that this margin is greater than the height of a single row R in a QWERTY keyboard. It is also noted that the region of recognition for a key need not be rectangular, and may be of any reasonable shape, for example an ellipse centered at the key.
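The per-key tolerance region described above can be modeled as a simple containment test. In this sketch the rectangle coordinates are invented for illustration, and the region could equally well be an ellipse:

```python
def hit_key(x, z, key_regions):
    """key_regions: dict mapping a key name to its tolerance rectangle
    (x0, z0, x1, z1) in keyboard-plane coordinates. Returns the key
    whose rectangle contains the contact point, or None."""
    for key, (x0, z0, x1, z1) in key_regions.items():
        if x0 <= x <= x1 and z0 <= z <= z1:
            return key
    return None

# Staggering lets each rectangle be taller than one row height without
# overlapping the keys of adjacent rows (illustrative coordinates):
regions = {"G": (4.5, 1.0, 6.4, 3.5), "H": (6.6, 1.0, 8.5, 3.5)}
```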
  • As acquired frames of three-dimensional data become available to CPU [0081] 270 and to routine 285, recognition of the user's fingers from the acquired data proceeds. This task is simplified in that the data indeed includes a three-dimensional representation of the user's fingers, and the fingers will have a reasonably well known shape, e.g., when viewed edge-on, they are somewhat cylindrical in shape. As noted, storing exemplary templates of finger shapes and finger and hand heuristics in memory 280 expedites finger recognition by reducing the CPU time needed to recognize and track finger positions. Such signal processing can quickly reduce data capture noise and more readily discern the user's fingers from among the three-dimensional data acquired. Signal to noise ratio can also be improved in intra-frame states in that the scene being imaged is known a priori, e.g., the scene comprises a virtual keyboard and the user's hands. Preferably a few hundred data captures are averaged or otherwise used to construct a frame of acquired data.
  • Once the user's fingers are recognized, software routine [0082] 285 (or an equivalent routine, perhaps executed by other than CPU 260) can next determine position and motion (e.g., relative change of position per unit time) of the fingers. Since data representing the fingers are in three dimensions, routine 285 can readily eliminate background images and focus only on the user's hands. In a Korth two-dimensional imaging scheme, this task is very difficult, as the shape and movement of background objects (e.g., a user's sleeve, arm, body, chair contour, etc.) can confuse object tracking and recognition software routines.
  • Using the contour of the fingertips, routine [0083] 285 uses Z-axis distance measurements to determine the position of the fingers with respect to the rows of the virtual keyboard, e.g., distance Z1 or Z2 in FIG. 1A. As noted, the granularity of such axis measurements is substantially finer than what is depicted in FIG. 1A. X-axis distance measurements provide data as to fingertip position with respect to the columns of the virtual keyboard. Using row and column co-ordinate numbers, software 285 can determine the actual virtual key touched by each finger, e.g., key “T” by the left forefinger in FIG. 1A.
  • To help the user orient the fingers on a particular virtual input device such as a keyboard, numeric pad, telephone pad, etc., software within the companion device [0084] 80 can be used to display a soft keyboard on a screen 90 associated with the device (e.g., a PDA or cellular telephone screen), or on a display terminal coupled to device 80. The soft keyboard image will show user finger positions for all keys on (or close to) virtual keyboard 50, for example by highlighting keys directly under the user's fingers. When a key is actually struck (as perceived by the user's finger movement), the struck key may be highlighted using a different color or contrast. If the virtual keys are not in a correct rest position, the user can command the companion device to position the virtual keyboard or other input device in the proper starting position. For instance, if the user typically begins to key by placing the right hand fingers on home row J, K, L, and “:” keys, and the left fingers on F, D, S and A keys, the software will move the keys of the virtual keyboard to such a position.
  • Vertical Y-axis motion of the user's fingers is sensed to determine what virtual keys on device [0085] 50 are being typed upon, or struck. While typing on a mechanical keyboard, several fingers may be in motion simultaneously, but normally only one finger strikes a key, absent double key entries such as pressing the CONTROL key and perhaps the “P” key, or absent a typographical error. In the present invention, software routine 285 determines finger motion information from successive frames of acquired information.
  • Advantageously the human hand imposes certain restrictions upon finger motion, which restrictions are adopted in modeling an image of the user's hands and fingers. For example, a connectiveness property of the fingers imposes certain coupling between movement of the fingers. The degree of freedom at the finger joints gives certain freedom to each finger to move, for example to move nearer or further from other fingers. Routine [0086] 285 advantageously can employ several heuristics to determine what virtual key is actually being struck. For instance, a keystroke can be sensed as commencing with a detected finger up movement followed by a quick finger down motion. A user's finger having the smallest Y-axis position or the greatest downward velocity is selected as the key entry finger, e.g., the finger that will strike one of the virtual keys on the virtual data input device.
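The key-entry-finger heuristic above, namely the lowest fingertip with ties broken by the fastest downward motion, might be sketched as follows (the field names are assumptions for this illustration):

```python
def select_striking_finger(fingers):
    """fingers: list of dicts with 'id', 'y' (tip height above the work
    surface) and 'vy' (vertical velocity; negative means moving down).
    Returns the id of the most likely key-entry finger: smallest Y,
    ties broken by the most negative (fastest downward) velocity."""
    return min(fingers, key=lambda f: (f["y"], f["vy"]))["id"]

fingers = [
    {"id": 2, "y": 0.1, "vy": -3.0},  # low and descending fast
    {"id": 3, "y": 0.8, "vy": -0.5},
    {"id": 4, "y": 0.1, "vy": -1.0},  # equally low, slower descent
]
```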
  • Unintended key entry by a user is discerned by intelligently monitoring movement of the user's fingers. For example, the user may rest the fingers on a surface of substrate [0087] 50 without triggering unintended key entries. This is analogous to a condition where a typist using a mechanical keyboard rests his or her fingers on the keys without pressing any key sufficiently hard to type. A user of the present invention is also permitted to move his or her fingers gently over the virtual keyboard without unintentionally triggering any key. Software 285 can calibrate its operation such that only intentional gestures are admitted as valid key entries to input data or commands to the companion computer device 80.
  • Software [0088] 285 upon execution by a CPU such as CPU 270 may be used to implement an algorithm or routine to recognize what virtual keys are being typed upon by a user of the present invention. Input data for the algorithm is three-dimensional optical information obtained from sensor 20. An exemplary algorithm may be considered as having three phases: building and personalizing templates, calibration, and actually tracking user typing on a virtual keyboard or work surface. In the description that follows it will be assumed that normal typing is undertaken in which all fingers are used. For instances where one or two fingers only are used, a special case of the algorithm will apply.
  • Templates are understood to be predefined models of different typing postures for different users. This class of templates is based upon analysis of a population of system users, whose various typing styles will have been classified. It is to be noted that the templates may be derived from examples of input data (e.g., examples of data collected by observing fingers in typing position) or from a preprogrammed mathematical description of the geometrical properties of the objects to be tracked (e.g., a cylindrical description for fingers). The resultant templates may be created at the time ROM [0089] 280 and especially routine 285 is fabricated. Since the position and shape of keyboard keys imposes certain commonalities of style upon users, it will be appreciated that the number of predefined templates need not be excessively large.
  • Preferably individual users of the present invention can also construct their own dedicated templates using a training tool that guides the user through the steps needed to build a template. For instance, a training program portion of software [0090] 285 can present on display 90 commands telling the user to place his or her fingers in typing position on the virtual keyboard, if present, or on the work surface in front of the companion device 80. The training program will then tell the user to repeatedly press a virtual key under each finger. Optically capturing thumb movement can be treated as a special case, since thumb movement differs from finger movement and typically is restricted to depressing the space bar region of a virtual keyboard or work surface.
  • In building the template, it is desired to construct a classification of the objects in the template image as being the different fingers of the user's hands. As described in further detail following, this method step collects information for the classifier or algorithm routine as to the physical properties of the user's hand. Later, during actual typing, the classifier uses this template to quickly map images in acquired frames to each of the user's fingers. As part of the template construction, preferably a mapping of the positions of the user's fingers to specific keyboard keys at a rest position is defined. For instance, routine [0091] 285 and CPU 270 can instruct the companion device 80 that, at rest, the user's left hand fingers touch the “A”, “S”, “D” and “F” keys, and the user's right hand fingers touch the “J”, “K”, “L”, and “:” keys. Such method step personalizes the virtual keyboard to the style of a particular user. This personalization process is carried out once and need not be repeated unless the user's typing posture changes substantially, to where too many wrong keys are being identified as having been typed upon. A calibration process according to the present invention may be carried out as follows. At the start of a typing session, the user will so signal the companion device 80 by putting the application being run by device 80 in a text input mode. For example, if device 80 is a PDA, the user can touch a text field displayed on screen 90 with a stylus or finger, thereby setting the input focus of the companion device 80 application to a text field. Other companion devices may be set to the appropriate text input mode using procedures associated with such devices.
  • Next the user's fingers are placed in a typing position on the work surface in front of three-dimensional sensor [0092] 20, either on a virtual keyboard or simply on the work surface. This step is used to map the user's fingers to the elements of the template and to calibrate the user's fingers to the keys of the virtual keyboard (or work surface) before a typing session starts.
  • At this juncture, three-dimensional sensor [0093] 20 will be repeatedly capturing the contour map of the user's fingers. The data thus captured will be placed, e.g., by software 285 in a table or matrix such as shown in FIGS. 7A-7O.
  • FIG. 7A depicts a user's left hand typing on an actual keyboard, as imaged by sensor [0094] 20. The field of view (FOV) of sensor 20 is intentionally directed toward the upper work surface, which in this example was an actual keyboard. Five fingers of the left hand are shown, and may be identified as fingers 1 (thumb), 2, 3, 4, and 5 (little finger). The cross-hatched region behind and between the fingers indicates regions too dark to be considered part of the user's fingers by the present invention. In an actual setting, there would of course be varying degrees of darkness, rather than the uniform dark region shown here for ease of understanding, and of depiction.
  • An overlay grid-like matrix or table is shown in FIG. 7A, in which various regions have quantized digits representing a normalized vector distance between the relevant surface portion of a user's finger and sensor [0095] 20. It is understood that these quantized distance values are dynamically calculated by the present invention, for example by software 285. In the mapping shown in FIG. 7A, low digit values such as 1, 2 represent close distances, and higher values such as 7, 8 represent large distances. The “d” values represent perceived discontinuities. Depending on the technology associated with sensor 20, values of “d” may oscillate widely and can indicate the absence of a foreground object. In FIG. 7A, the quantized distance values indicate that the user's left thumb is farther away from sensor 20 (as indicated by relatively high distance values of 7 and 8) than is the user's left forefinger, whose distance values are relatively low, e.g., 1. It is also seen that the user's left little finger is generally farther from sensor 20 than is the user's forefinger.
  • The central portion of FIG. 7A is a table or matrix showing the normalized distance values and, where applicable, “d” entries. A similar table is also shown in FIGS. [0096] 7B-7O. The table entries can represent contours of user fingers, and shading has been added to these tables to assist in showing potential mapping of distance data to an outline of the user's fingers. Arrows from the FOV portion of FIG. 7A pointing to columns in the table indicate how various columns of data can indeed represent contours of user finger position. In the tables shown in FIGS. 7A-7O, circled numbers “1”, “2” . . . “5” depict contours corresponding to perceived location of the users left thumb (finger “1”), forefinger, middle finger, ring finger, and little finger (finger “5”) respectively.
  • As described earlier, templates preferably are used in the present invention to help identify user finger positions from data obtained from sensor [0097] 20. Templates can assist the classification algorithm (or classifier) 285 in distinguishing boundaries between fingers when discontinuities are not necessarily apparent. For example, in FIG. 7A, the user's third and fourth fingers (fingers 3 and 4) are relatively close together.
  • Shown at the bottom of FIG. 7A is a dynamic display of what the user is typing, based upon analysis by the present invention of the sensor-perceived distance values, dynamic velocity values, as well as heuristics associated with the overall task of recognizing what keys (real or virtual) are being pressed at what time. Thus, at the moment captured in FIG. 7A, the user's left forefinger (finger [0098] 2) appears to have just typed the letter “f”, perhaps in the sentence “The quick brown fox jumped over the lazy dog”, as the partially typed phrase 100 might appear on display 90 of a companion device 80.
  • Preferably the calibration phase of software routine [0099] 285 is user-friendly. Accordingly, routine 285 in essence moves or relocates the virtual keyboard to under the user's fingers. Such procedure may be carried out by mapping the image obtained from sensor 20 to the fingers of the template, and then mapping the touched keys to the natural position for the user, which natural position was determined during the template construction phase.
  • The calibration step defines an initial state or rest position, and maps the user's fingers at rest position to specific keys on the keyboard. As shown in FIG. 1B, the “keys” [0100] 107 that are touched or very nearby (but not pressed) preferably are highlighted on a soft-keyboard 105 displayed on screen 90 of companion device 80, assuming of course that a screen 90 is available. This rest position will also be the position that the user's fingers assume at the end of a typing burst.
  • During actual typing, routine [0101] 285 senses the user's fingers and maps finger movements to correct keys on a virtual keyboard. Before starting this phase of the algorithm, the relevant companion device 80 application will have been put into text input mode and will be ready to accept keyboard events (e.g. KEYUP and KEYDOWN).
  • Routine [0102] 285 (or equivalent) may be implemented in many ways. In the preferred embodiment, routine 285 will use three modules. A “classifier” module is used to map clusters in each frame to user fingers. A “tracker” module is used to track movement of active fingers by searching for a key stroke finger motion and by determining coordinates of the point of impact between the user's finger and a location on a virtual keyboard or other work surface. A third “mapper” module maps the impact point of a user finger to a specific key on the virtual keyboard and sends a key event to the companion device 80. These exemplary modules will now be described in further detail.
  • The role of the classifier module is to make sense of the contour map of the scene generated by sensor [0103] 20 at each frame of optically acquired data. The classifier module will identify clusters that have certain common properties such as being part of the same surface. Importantly, the classifier will label each cluster so that the same cluster can be identified from other clusters in successive frames of acquired data. The classifier also determines the boundaries of each cluster, and specifically determines the tip of each cluster, which tip maps to the tips of the user's fingers. The goal is not recognition of user fingers per se, in that for all intents and purposes the user could be holding a stick or stylus that is used to press virtual keys or virtual locations of keys. Thus the above-described template is used primarily to give meaning to these clusters and to assist in forming the clusters.
  • One method of clustering or locating clusters is to use a nearest neighbor condition to form nearest neighbor partitions, in which each partition maps to each finger of the user. Such mapping would result in five partitions for the user's left hand, and five partitions for the user's right hand, in which left hand and right hand partitions can be treated separately. [0104]
  • One method of partition formation is based on Lloyd's algorithm. Details of this algorithm, which is well known in the field of image processing, may be found in the text [0105] Vector Quantization and Signal Compression by Allen Gersho and Robert Gray, see page 362. By way of example, let Ct = {ci; i=1, …, 5} be the set of partition centers for one hand. In each partition a set of points Pi,t = {r: d(r, ci) < d(r, cj) for all j ≠ i} is defined, in which the function d( ) is a measure of the distance between two points in the set. If d(r, ci) = d(r, cj), the “tie” can be broken by placing the point in the set with the lower index. For two points a and b, d(a,b) can be defined as (xa−xb)^2 + (ya−yb)^2 + (za−zb)^2, where x, y and z are the axis measurements obtained from sensor 20. A function center(Pi,t) can be defined as the center of gravity or centroid of the points in Pi,t. Next define Ct+1 = {center(Pi,t); i=1, …, 5}. Using the new centroids, Pi,t+1 can be found, as above. Iteration is continued (e.g., by routine 285 or equivalent) until the membership of two successive Pi sets remains unchanged. Typically, the iteration converges in 3-4 iterations, and the points in the final sets Pi are the clusters of points for each user finger. In this method, the ultimate goal of the classifier is not recognition of user fingers per se, but rather to determine which key was struck by a user finger. This observation enables the classifier to tolerate clustering inaccuracies in the periphery of the typing region that do not impact the performance of the system.
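The iteration just described is Lloyd's algorithm specialized to five clusters per hand; a compact pure-Python sketch, using the squared-distance metric and lower-index tie-breaking defined above:

```python
def d(a, b):
    """Squared Euclidean distance between two 3-D points."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def centroid(points):
    """Center of gravity of a non-empty set of 3-D points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def lloyd(points, centers, max_iter=20):
    """Iterate nearest-neighbor partitioning and centroid updates until
    cluster membership stops changing (ties go to the lower index)."""
    prev = None
    for _ in range(max_iter):
        parts = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda j: (d(p, centers[j]), j))
            parts[i].append(p)
        if parts == prev:  # membership unchanged: converged
            break
        prev = parts
        centers = [centroid(pts) if pts else c
                   for pts, c in zip(parts, centers)]
    return parts, centers
```

As in the text, nothing here recognizes fingers as such; the iteration merely stabilizes the point clusters whose tips are later mapped to keys.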
  • The tracker module will now be more fully described with respect to the matrices shown in FIGS. [0106] 7A-7O, in which the clusters are shaded as an aid to visually understanding the data. Perceived clusters are preferably input to a tracker module that will keep track of the movement of each cluster. The tracker module is especially alert for relatively rapid up and down movements, and will compute velocities and directions of the clusters.
  • FIGS. [0107] 7D-7K depict matrix tables showing a sequence of images obtained as the user's second finger rises upward and then moves downward to strike at a (virtual) key beneath the end of the finger. The tip of each cluster, which is closely monitored by the tracker module, preferably will have been identified by the classifier module. In actual images, other user fingers may also move slightly, but in the example being described, the classifier determines that the rate of acceleration of the left forefinger (finger 2) is noticeably higher than the movements of the other fingers.
  • In FIGS. [0108] 7D-7E, a pointing arrow is added to show the direction and the tip of the perceived cluster (e.g., user finger). Cluster or finger movement is upward in FIGS. 7D-7F, with FIG. 7F representing a maximum upward position of the user's finger, e.g., a maximum Y-axis location as determined by sensor 20 acquired data. In FIGS. 7G-7H, the cluster or finger is now moving downward, e.g., toward the virtual keyboard 50 or work surface 60. In FIG. 7I, contact of the user's finger with a virtual key or key location on a work surface is perceived.
  • Vertical velocity of a finger tip may be computed by routine [0109] 285 (or other routine) in several ways. In a preferred embodiment, the tracker module computes the vertical velocity of a user's fingertip (identified by the classifier) by dividing the difference between the highest and the lowest position of the fingertip by the number of frames acquired during the sequence. The velocity is thus computed in terms of Y-axis resolution units per frame, which is independent of the frame rate per second. To register a key strike, this computed Y-axis velocity must be equal to or higher than a threshold velocity. The threshold velocity is a parameter used by software 285, and preferably is user-adjustable during the personalization step.
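The velocity computation reduces to a few lines; the threshold value here is an arbitrary placeholder for the user-adjustable parameter:

```python
def stroke_velocity(y_positions):
    """Y positions of one fingertip over a candidate-stroke sequence of
    frames. Returns velocity in Y-resolution units per frame, which is
    independent of the frame rate per second."""
    return (max(y_positions) - min(y_positions)) / len(y_positions)

def is_keystroke(y_positions, threshold=0.5):
    """True when the peak-to-trough excursion is fast enough to count
    as an intentional key strike rather than a gentle finger drift."""
    return stroke_velocity(y_positions) >= threshold

ys = [2.0, 4.0, 6.0, 3.0, 0.0]  # rise, then fall onto the key
```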
  • FIGS. [0110] 7J-7O depict matrix tables showing a more complex sequence, in which the user's left forefinger (finger 2) moves in a down-and-back direction. In FIG. 7O, this finger motion is shown culminating in a key stroke on a key in the first row of the virtual keyboard (or the location on a work surface in front of device 80 where such a virtual key would otherwise be found).
  • Referring now to the mapper module: the tracker module will signal the mapper module when it determines that a keystroke has been detected, and will pass along the (X,Y,Z) coordinates of the cluster tip. The mapper module uses the Z-axis value to determine the row location on the virtual keyboard, and uses the X-axis and Y-axis values to determine the key within the row. Referring for example to FIG. 1A, a coordinate (X,Y,Z) location (7,0,3) might signify the letter “T” on a virtual keyboard. Again it is understood that the various modules preferably comprise portions of software routine [0111] 285, although other routines, including routines executed by other than CPU 270, may instead be used.
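The mapper's row-then-column lookup might be sketched as follows; the layout constants (rows, stagger offsets, row depth, key width) are illustrative assumptions for this sketch, not the patent's geometry:

```python
# Illustrative three-row staggered layout:
ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
ROW_OFFSET = [0.0, 0.25, 0.75]  # per-row stagger, in key widths

def map_to_key(x, z, row_depth=1.0, key_width=1.0):
    """Map a contact point to a virtual key: Z selects the row, and X
    (minus that row's stagger offset) selects the key within the row.
    Returns None for contact outside the keyboard area."""
    row = int(z // row_depth)
    if not 0 <= row < len(ROWS):
        return None
    col = int((x - ROW_OFFSET[row]) // key_width)
    if not 0 <= col < len(ROWS[row]):
        return None
    return ROWS[row][col]
```

With this layout a contact at (x=4.5, z=0.5) lands on “T”, and one at (x=4.5, z=1.5) lands on “G” in the staggered second row; the mapper would then send the corresponding key event to the companion device 80.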
  • Modifications and variations may be made to the disclosed embodiments without departing from the subject and spirit of the invention as defined by the following claims. For example, if desired more than one sensor may be employed to acquire three-dimensional position information. [0112]

Claims (42)

    What is claimed is:
  1. A system to enable a user to interact with a virtual input device using a user-controlled object, the system comprising:
    a single sensor system that acquires data representing a single image at a given time, from which data three-dimensional coordinate information of a relevant position of at least a portion of said user-controlled object may be determined such that a location defined on said virtual input device contacted by said user-controlled object is identifiable; and
    a processor system to determine whether a portion of said user-controlled object contacted a location defined on said virtual input device, and if contacted to determine what function of said virtual input device is associated with said location;
    wherein said system determines if, when in time, and where interaction between said user-controlled object and said virtual input device occurs.
  2. The system of claim 1, further including:
    means for making available to a companion system information commensurate with contact location determined by said processor system, said companion system including at least one device selected from a group consisting of (i) a PDA, (ii) a wireless telephone, (iii) a cellular telephone, (iv) a set-top box, (v) a mobile electronic device, (vi) an electronic device, (vii) a computer, (viii) an appliance adapted to accept input information, and (ix) an electronic system;
    wherein by controlling said user-controlled object a user interacts with said virtual input device to provide information to said companion system.
  3. The system of claim 1, wherein said single sensor system acquires said data using time-of-flight from said single sensor system to a portion of said user-controlled object.
  4. The system of claim 1, further including feedback to guide said user in positioning said user-controlled object with respect to said virtual input device, said feedback including at least one type of feedback selected from a group consisting of (i) audible feedback, (ii) audible feedback representing information input by said user-controlled object, (iii) audible feedback representing proximity of said user-controlled object to said virtual input device, (iv) audible feedback representing contact location of said user-controlled object on said virtual input device, (v) visual feedback, (vi) visual feedback representing information input by said user-controlled object, (vii) visual feedback including a display representing proximity of said user-controlled object to said virtual input device, and (viii) visual feedback including a display representing contact location of said user-controlled object with said virtual input device.
  5. 5. The system of claim 1, wherein said virtual input device is a keyboard, and further including feedback to guide said user in positioning said user-controlled object with respect to said keyboard, said feedback including at least one type of feedback selected from a group consisting of (i) audible feedback, (ii) audible enunciation of each virtual key's name when said virtual key contacted by said user-controlled object, (iii) an audible key click sound when a virtual key is contacted by said user-controlled object, (iv) an audible key click sound whose sound varies with mode of operation of a virtual key contacted by said user-controlled object, (v) a display of visual feedback, (vi) a display of visual feedback representing at least one key on said keyboard, (vii) a display of visual feedback representing at least one key on said keyboard and at least a portion of said user-controlled object, (viii) a display of visual feedback representing at least two keys on said keyboard keys wherein a key on said keyboard contacted by said user-controlled object is visually distinguishable from adjacent keys on said keyboard, (ix) a display of visual feedback representing information input by said user-controlled object, and (vii) a display of visual feedback representing an image whose position signifies position of said user-object relative to a virtual key when said virtual input device is a virtual keyboard, and wherein size of said image signifies distance from a lower surface of said user-object to said virtual keyboard.
  6. 6. The system of claim 1, wherein said virtual input device is a keyboard, and further including a language routine that selects most likely user-intended keystrokes as said user interacts with said keyboard based upon knowledge of language used by said user, based upon recent history of key characters on said keyboard already contacted by said user-controlled object, and based upon knowledge of approximate current proximity of said user-controlled to said keyboard.
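One way to realize the "language routine" of claim 6 is to score the candidate keys near an ambiguous contact point by combining spatial proximity with the likelihood of each character given the recent keystroke history. The sketch below is a hypothetical illustration, not the patent's disclosed method; the toy layout and bigram table are assumptions.

```python
import math

# Toy one-row key layout: key name -> (x, y) center on the virtual keyboard.
KEY_CENTERS = {"q": (0.0, 0.0), "w": (1.0, 0.0), "e": (2.0, 0.0)}

# Toy bigram model: probability of the next character given the previous one.
BIGRAM_PROB = {("h", "e"): 0.6, ("h", "w"): 0.05, ("h", "q"): 0.01}

def most_likely_key(contact_xy, history):
    """Pick the key best explaining a sensed contact point.

    Score = log(language likelihood) - distance penalty, so nearer keys
    and linguistically probable keys both raise the score.
    """
    prev = history[-1] if history else None
    best_key, best_score = None, float("-inf")
    for key, (kx, ky) in KEY_CENTERS.items():
        dist = math.hypot(contact_xy[0] - kx, contact_xy[1] - ky)
        if prev is None:
            lang = 1.0 / len(KEY_CENTERS)  # uniform prior with no history
        else:
            lang = BIGRAM_PROB.get((prev, key), 0.001)
        score = math.log(lang) - dist
        if score > best_score:
            best_key, best_score = key, score
    return best_key

# After typing "h", a contact midway between "w" and "e" resolves to "e",
# because "he" is far more likely than "hw" in English text.
resolved = most_likely_key((1.5, 0.0), ["h"])
```

A production system would use a richer language model and calibrated sensing noise, but the principle of fusing linguistic and geometric evidence is the same.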
  7. The system of claim 1, wherein said virtual input device is dynamically user-selectable between a keyboard and a digitizer tablet.
  8. The system of claim 1, further including means for calculating velocity of said user-controlled object at least when proximate said virtual input device;
    wherein a contact interaction by said user-controlled object with said virtual input device is adjudicated to occur only if a minimum threshold velocity is exceeded;
    wherein instances of false interactions are reduced.
  9. The system of claim 8, wherein said minimum threshold velocity is user-controlled such that reliability of user interaction with said virtual input device is customizable to said user.
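The velocity gating of claims 8 and 9 can be sketched as follows: successive frame heights of the fingertip yield an approach-speed estimate, and a contact is adjudicated only when that speed exceeds a user-tunable minimum, filtering out fingers merely resting near the virtual surface. Function and parameter names here are illustrative assumptions.

```python
def contact_detected(z_prev: float, z_curr: float,
                     frame_dt: float, min_speed: float) -> bool:
    """Adjudicate a contact with the virtual input plane.

    z_prev, z_curr: fingertip height (m) above the plane in two
    consecutive frames; frame_dt: frame interval (s); min_speed:
    user-controlled minimum approach speed (m/s).
    """
    approach_speed = (z_prev - z_curr) / frame_dt  # positive when descending
    touching = z_curr <= 0.0
    # Register a keystroke only for a deliberate, fast-enough strike.
    return touching and approach_speed >= min_speed

# Deliberate strike: drops 3 cm in one 33 ms frame (~0.9 m/s) -> accepted.
hit = contact_detected(0.030, 0.0, 0.033, min_speed=0.2)
# Drifting finger: 2 mm in a frame (~0.06 m/s) -> rejected as false contact.
drift = contact_detected(0.002, 0.0, 0.033, min_speed=0.2)
```

Raising `min_speed` makes recognition stricter; lowering it favors light typists, which is the customization claim 9 contemplates.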
  10. The system of claim 1, further including means for training said user to more efficiently interact with said virtual input device.
  11. The system of claim 10, wherein said means for training includes at least one of (i) means for providing said user with visual feedback, and (ii) means for providing said user with acoustic feedback.
  12. The system of claim 1, further including a tool to enable said user to generate a user-customized template of a virtual input device.
  13. The system of claim 12, wherein said tool enables said user to assign a virtual input device function to a given location defined on said virtual input device.
  14. The system of claim 1, wherein said processor system can discern user gestures as a form of user interaction with said virtual input device.
  15. The system of claim 1, further including means for providing a user-viewable image of said virtual input device.
  16. The system of claim 1, further including an optical system that generates a user-viewable image of said virtual input device.
  17. The system of claim 1, further including an optical system that includes at least one diffractive optical element, said optical system generating a user-viewable image of said virtual input device.
  18. The system of claim 1, further including means for operating said system in at least a low power consumption mode and a higher power consumption mode, wherein selection of power consumption mode is made dynamically as a function of time interval between consecutive user interactions with said virtual input device;
    wherein power consumed by said system is reduced.
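The dynamic power management of claim 18 amounts to a simple policy: drop the sensor to a low-power mode (e.g., reduced frame rate or illumination duty cycle) when the interval since the last user interaction grows, and restore full power on activity. The sketch below is an illustrative assumption; the threshold value and names are not from the patent.

```python
# Assumed idle interval before the system powers down its sensor.
IDLE_THRESHOLD_S = 5.0

def select_power_mode(seconds_since_last_interaction: float) -> str:
    """Choose a power mode from the time since the last interaction."""
    if seconds_since_last_interaction >= IDLE_THRESHOLD_S:
        return "low"   # slow frame rate / reduced illumination while idle
    return "high"      # full acquisition rate while the user is active

# An active typist stays in "high"; a lapse in typing drops to "low".
modes = [select_power_mode(t) for t in (0.5, 2.0, 10.0)]
```

A real device would typically add hysteresis or intermediate modes so brief pauses do not cause rapid mode flapping, but the interval-driven selection is the core of the claim.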
  19. The system of claim 1, wherein:
    said single sensor system captures data in frames representing a single image at a given time from which data said three-dimensional coordinate information of a relevant position of at least a portion of said user-controlled object may be determined with respect to said virtual input device from at least one of (i) a single data frame, and (ii) multiple data frames captured at substantially the same time such that a location defined on said virtual input device contacted by said user-controlled object is identifiable.
  20. The system of claim 2, wherein processing tasks associated with operation of said system may be carried out at least in part by a processor associated with said companion system.
  21. The system of claim 1, wherein:
    said virtual input device includes a virtual keyboard; and
    said user-controlled object includes at least a portion of a hand of said user.
  22. A method for a user to interact with a virtual input device using a user-controlled object, the method comprising the following steps:
    (a) acquiring data representing a single image at a given time from a single sensor system, from which data three-dimensional coordinate information of a relevant position of at least a portion of said user-controlled object may be determined such that a location defined on said virtual input device contacted by said user-controlled object is identifiable; and
    (b) processing data acquired at step (a) to determine whether a portion of said user-controlled object contacted a location defined on said virtual input device, and if contacted to determine what function of said virtual input device is associated with said location;
    wherein said method determines if, when in time, and where interaction between said user-controlled object and said virtual input device occurs.
  23. The method of claim 22, further including:
    (c) making available to a companion system information commensurate with contact location determined at step (b), said companion system including at least one device selected from a group consisting of (i) a PDA, (ii) a wireless telephone, (iii) a cellular telephone, (iv) a set-top box, (v) a mobile electronic device, (vi) an electronic device, (vii) a computer, (viii) an appliance adapted to accept input information, and (ix) an electronic system;
    wherein by controlling said user-controlled object a user interacts with said virtual input device to provide information to said companion system.
  24. The method of claim 22, wherein at step (a), said data is acquired using time-of-flight from said single sensor system to a portion of said user-controlled object.
  25. The method of claim 22, further including providing feedback to guide said user in positioning said user-controlled object with respect to said virtual input device, said feedback including at least one type of feedback selected from a group consisting of (i) audible feedback, (ii) audible feedback representing information input by said user-controlled object, (iii) audible feedback representing proximity of said user-controlled object to said virtual input device, (iv) audible feedback representing contact location of said user-controlled object on said virtual input device, (v) visual feedback, (vi) visual feedback representing information input by said user-controlled object, (vii) visual feedback including a display representing proximity of said user-controlled object to said virtual input device, and (viii) visual feedback including a display representing contact location of said user-controlled object with said virtual input device.
  26. The method of claim 22, wherein said virtual input device is a keyboard, and further including providing feedback to guide said user in positioning said user-controlled object with respect to said keyboard, said feedback including at least one type of feedback selected from a group consisting of (i) audible feedback, (ii) audible enunciation of each virtual key's name when said virtual key is contacted by said user-controlled object, (iii) an audible key click sound when a virtual key is contacted by said user-controlled object, (iv) an audible key click sound whose sound varies with mode of operation of a virtual key contacted by said user-controlled object, (v) a display of visual feedback, (vi) a display of visual feedback representing at least one key on said keyboard, (vii) a display of visual feedback representing at least one key on said keyboard and at least a portion of said user-controlled object, (viii) a display of visual feedback representing at least two keys on said keyboard wherein a key on said keyboard contacted by said user-controlled object is visually distinguishable from adjacent keys on said keyboard, (ix) a display of visual feedback representing information input by said user-controlled object, and (x) a display of visual feedback representing an image whose position signifies position of said user-object relative to a virtual key when said virtual input device is a virtual keyboard, and wherein size of said image signifies distance from a lower surface of said user-object to said virtual keyboard.
  27. The method of claim 22, wherein said virtual input device is a keyboard, and further including providing a language routine that selects most likely user-intended keystrokes as said user interacts with said keyboard based upon knowledge of language used by said user, based upon recent history of key characters on said keyboard already contacted by said user-controlled object, and based upon knowledge of approximate current proximity of said user-controlled object to said keyboard.
  28. The method of claim 22, wherein said virtual input device is dynamically user-selectable between a keyboard and a digitizer tablet.
  29. The method of claim 22, further including providing means for calculating velocity of said user-controlled object at least when proximate said virtual input device;
    wherein a contact interaction by said user-controlled object with said virtual input device is adjudicated to occur only if a minimum threshold velocity is exceeded;
    wherein instances of false interactions are reduced.
  30. The method of claim 29, wherein said minimum threshold velocity is user-controlled such that reliability of user interaction with said virtual input device is customizable to said user.
  31. The method of claim 22, further including providing means for training said user to more efficiently interact with said virtual input device.
  32. The method of claim 31, wherein said means for training includes at least one of (i) means for providing said user with visual feedback, and (ii) means for providing said user with acoustic feedback.
  33. The method of claim 22, further including providing a tool to enable said user to generate a user-customized template of a virtual input device.
  34. The method of claim 33, wherein said tool enables said user to assign a virtual input device function to a given location defined on said virtual input device.
  35. The method of claim 22, wherein step (b) includes discerning user gestures as a form of user interaction with said virtual input device.
  36. The method of claim 22, further including providing a user-viewable image of said virtual input device.
  37. The method of claim 22, further including providing an optical system that generates a user-viewable image of said virtual input device.
  38. The method of claim 22, further including providing an optical system that includes at least one diffractive optical element, said optical system generating a user-viewable image of said virtual input device.
  39. The method of claim 22, further including operating said system in at least a low power consumption mode and a higher power consumption mode, wherein selection of power consumption mode is made dynamically as a function of time interval between consecutive user interactions with said virtual input device;
    wherein power consumed by said system is reduced.
  40. The method of claim 22, wherein step (a) includes capturing data in frames representing a single image at a given time from which data said three-dimensional coordinate information of a relevant position of at least a portion of said user-controlled object may be determined with respect to said virtual input device from at least one of (i) a single data frame, and (ii) multiple data frames captured at substantially the same time such that a location defined on said virtual input device contacted by said user-controlled object is identifiable.
  41. The method of claim 23, wherein at step (b), processing tasks associated with operation of said system may be carried out at least in part by a processor associated with said companion system.
  42. The method of claim 22, wherein:
    said virtual input device includes a virtual keyboard; and
    said user-controlled object includes at least a portion of a hand of said user.
US10651919 1999-11-04 2003-08-29 Method and apparatus for entering data using a virtual input device Abandoned US20040046744A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16344599 true 1999-11-04 1999-11-04
US09502499 US6614422B1 (en) 1999-11-04 2000-02-11 Method and apparatus for entering data using a virtual input device
US10651919 US20040046744A1 (en) 1999-11-04 2003-08-29 Method and apparatus for entering data using a virtual input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10651919 US20040046744A1 (en) 1999-11-04 2003-08-29 Method and apparatus for entering data using a virtual input device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09502499 Continuation US6614422B1 (en) 1999-11-04 2000-02-11 Method and apparatus for entering data using a virtual input device

Publications (1)

Publication Number Publication Date
US20040046744A1 true true US20040046744A1 (en) 2004-03-11

Family

ID=23998118

Family Applications (2)

Application Number Title Priority Date Filing Date
US09502499 Active US6614422B1 (en) 1999-11-04 2000-02-11 Method and apparatus for entering data using a virtual input device
US10651919 Abandoned US20040046744A1 (en) 1999-11-04 2003-08-29 Method and apparatus for entering data using a virtual input device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09502499 Active US6614422B1 (en) 1999-11-04 2000-02-11 Method and apparatus for entering data using a virtual input device

Country Status (5)

Country Link
US (2) US6614422B1 (en)
EP (1) EP1332488B1 (en)
CN (1) CN1232943C (en)
DE (1) DE60143094D1 (en)
WO (1) WO2001059975A3 (en)

Cited By (176)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020186221A1 (en) * 2001-06-05 2002-12-12 Reactrix Systems, Inc. Interactive video display system
US20040051709A1 (en) * 2002-05-31 2004-03-18 Eit Co., Ltd. Apparatus for controlling the shift of virtual space and method and program for controlling same
US20040085286A1 (en) * 2002-10-31 2004-05-06 Microsoft Corporation Universal computing device
US20040136083A1 (en) * 2002-10-31 2004-07-15 Microsoft Corporation Optical system design for a universal computing device
US20040183775A1 (en) * 2002-12-13 2004-09-23 Reactrix Systems Interactive directed light/sound system
US20040207732A1 (en) * 2002-06-26 2004-10-21 Klony Lieberman Multifunctional integrated image sensor and application to virtual interface technology
US20050111700A1 (en) * 2003-10-03 2005-05-26 O'boyle Michael E. Occupant detection system
US20050122308A1 (en) * 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
US20050162381A1 (en) * 2002-05-28 2005-07-28 Matthew Bell Self-contained interactive video display system
US20050193292A1 (en) * 2004-01-06 2005-09-01 Microsoft Corporation Enhanced approach of m-array decoding and error correction
WO2005091651A2 (en) * 2004-03-18 2005-09-29 Reactrix Systems, Inc. Interactive video display system
US6968073B1 (en) 2001-04-24 2005-11-22 Automotive Systems Laboratory, Inc. Occupant detection system
US20050275638A1 (en) * 2003-03-28 2005-12-15 Microsoft Corporation Dynamic feedback for gestures
US20050285966A1 (en) * 2004-01-28 2005-12-29 Canesta, Inc. Single chip red, green, blue, distance (RGB-Z) sensor
US20060031786A1 (en) * 2004-08-06 2006-02-09 Hillis W D Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
US20060101503A1 (en) * 2004-11-09 2006-05-11 Veveo.Tv, Inc. Method and system for performing searches for television content using reduced text input
US20060101499A1 (en) * 2004-11-09 2006-05-11 Veveo, Inc. Method and system for secure sharing, gifting, and purchasing of content on television and mobile devices
US20060101504A1 (en) * 2004-11-09 2006-05-11 Veveo.Tv, Inc. Method and system for performing searches for television content and channels using a non-intrusive television interface and with reduced text input
US20060125799A1 (en) * 2004-08-06 2006-06-15 Hillis W D Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US20060132432A1 (en) * 2002-05-28 2006-06-22 Matthew Bell Interactive video display system
US20060139314A1 (en) * 2002-05-28 2006-06-29 Matthew Bell Interactive video display system
US20060182309A1 (en) * 2002-10-31 2006-08-17 Microsoft Corporation Passive embedded interaction coding
US20060190836A1 (en) * 2005-02-23 2006-08-24 Wei Ling Su Method and apparatus for data entry input
US20060187199A1 (en) * 2005-02-24 2006-08-24 Vkb Inc. System and method for projection
US20060215913A1 (en) * 2005-03-24 2006-09-28 Microsoft Corporation Maze pattern analysis with image matching
US20060242562A1 (en) * 2005-04-22 2006-10-26 Microsoft Corporation Embedded method for embedded interaction code array
US20060274948A1 (en) * 2005-06-02 2006-12-07 Microsoft Corporation Stroke localization and binding to electronic document
US20060288313A1 (en) * 2004-08-06 2006-12-21 Hillis W D Bounding box gesture recognition on a touch detecting interactive display
US20070005563A1 (en) * 2005-06-30 2007-01-04 Veveo, Inc. Method and system for incremental search with reduced text entry where the relevance of results is a dynamically computed function of user input search string character count
US20070008293A1 (en) * 2005-07-06 2007-01-11 International Business Machines Corporation Touch sensitive device and display
US20070019103A1 (en) * 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US20070019099A1 (en) * 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US20070041654A1 (en) * 2005-08-17 2007-02-22 Microsoft Corporation Embedded interaction code enabled surface type identification
WO2007025119A2 (en) * 2005-08-26 2007-03-01 Veveo, Inc. User interface for visual cooperation between text input and display device
US20070050337A1 (en) * 2005-08-26 2007-03-01 Veveo, Inc. Method and system for dynamically processing ambiguous, reduced text search queries and highlighting results thereof
US20070046643A1 (en) * 2004-08-06 2007-03-01 Hillis W Daniel State-Based Approach to Gesture Identification
US20070061321A1 (en) * 2005-08-26 2007-03-15 Veveo.Tv, Inc. Method and system for processing ambiguous, multi-term search queries
US20070130128A1 (en) * 2005-11-23 2007-06-07 Veveo, Inc. System and method for finding desired results by incremental search using an ambiguous keypad with the input containing orthographic and typographic errors
US20070135984A1 (en) * 1992-05-05 2007-06-14 Automotive Technologies International, Inc. Arrangement and Method for Obtaining Information Using Phase Difference of Modulated Illumination
US7233321B1 (en) 1998-12-15 2007-06-19 Intel Corporation Pointing device with integrated audio input
US20070216658A1 (en) * 2006-03-17 2007-09-20 Nokia Corporation Mobile communication terminal
US20070219985A1 (en) * 2006-03-06 2007-09-20 Murali Aravamudan Methods and systems for selecting and presenting content based on context sensitive user preferences
US20070222760A1 (en) * 2001-01-08 2007-09-27 Vkb Inc. Data input device
US20070252818A1 (en) * 2006-04-28 2007-11-01 Joseph Zlotnicki Method and apparatus for efficient data input
US20070260703A1 (en) * 2006-01-27 2007-11-08 Sankar Ardhanari Methods and systems for transmission of subsequences of incremental query actions and selection of content items based on later received subsequences
US20070266406A1 (en) * 2004-11-09 2007-11-15 Murali Aravamudan Method and system for performing actions using a non-intrusive television with reduced text input
US20070288456A1 (en) * 2006-04-20 2007-12-13 Murali Aravamudan User interface methods and systems for selecting and presenting content based on user navigation and selection actions associated with the content
US20080007526A1 (en) * 2006-07-10 2008-01-10 Yansun Xu Optical navigation sensor with variable tracking resolution
US20080012824A1 (en) * 2006-07-17 2008-01-17 Anders Grunnet-Jepsen Free-Space Multi-Dimensional Absolute Pointer Using a Projection Marker System
US20080025612A1 (en) * 2004-01-16 2008-01-31 Microsoft Corporation Strokes Localization by m-Array Decoding and Fast Image Matching
US20080062123A1 (en) * 2001-06-05 2008-03-13 Reactrix Systems, Inc. Interactive video display system using strobed light
US20080086704A1 (en) * 2006-10-06 2008-04-10 Veveo, Inc. Methods and systems for a Linear Character Selection Display Interface for Ambiguous Text Input
US20080114743A1 (en) * 2006-03-30 2008-05-15 Veveo, Inc. Method and system for incrementally selecting and providing relevant search engines in response to a user query
US20080150890A1 (en) * 2002-05-28 2008-06-26 Matthew Bell Interactive Video Window
US20080209229A1 (en) * 2006-11-13 2008-08-28 Veveo, Inc. Method of and system for selecting and presenting content based on user identification
US20080252596A1 (en) * 2007-04-10 2008-10-16 Matthew Bell Display Using a Three-Dimensional vision System
US20080297614A1 (en) * 2003-10-31 2008-12-04 Klony Lieberman Optical Apparatus for Virtual Interface Projection and Sensing
US20080313174A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. Method and system for unified searching across and within multiple documents
US20080313574A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. System and method for search with reduced physical interaction requirements
US20090027241A1 (en) * 2005-05-31 2009-01-29 Microsoft Corporation Fast error-correcting of embedded interaction codes
US7492445B1 (en) * 2006-06-05 2009-02-17 Cypress Semiconductor Corporation Method and apparatus for robust velocity prediction
US20090077504A1 (en) * 2007-09-14 2009-03-19 Matthew Bell Processing of Gesture-Based User Interactions
US20090083035A1 (en) * 2007-09-25 2009-03-26 Ritchie Winson Huang Text pre-processing for text-to-speech generation
US7536384B2 (en) 2006-09-14 2009-05-19 Veveo, Inc. Methods and systems for dynamically rearranging search results into hierarchically organized concept clusters
US20090146957A1 (en) * 2007-12-10 2009-06-11 Samsung Electronics Co., Ltd. Apparatus and method for providing adaptive on-screen keyboard
US20090158191A1 (en) * 2004-06-15 2009-06-18 Research In Motion Limited Virtual keypad for touchscreen display
US20090167508A1 (en) * 2007-12-31 2009-07-02 Apple Inc. Tactile feedback in an electronic device
US7581182B1 (en) * 2003-07-18 2009-08-25 Nvidia Corporation Apparatus, method, and 3D graphical user interface for media centers
US20090235295A1 (en) * 2003-10-24 2009-09-17 Matthew Bell Method and system for managing an interactive video display system
US20090237361A1 (en) * 2008-03-18 2009-09-24 Microsoft Corporation Virtual keyboard based activation and dismissal
US20090241437A1 (en) * 2008-03-19 2009-10-01 Wolfgang Steinle Embedding unit for display devices
US20090251685A1 (en) * 2007-11-12 2009-10-08 Matthew Bell Lens System
US20090289188A1 (en) * 2008-05-20 2009-11-26 Everspring Industry Co., Ltd. Method for controlling an electronic device through infrared detection
US20100007511A1 (en) * 2008-07-14 2010-01-14 Sony Ericsson Mobile Communications Ab Touchless control of a control device
US20100057465A1 (en) * 2008-09-03 2010-03-04 David Michael Kirsch Variable text-to-speech for automotive application
US20100057464A1 (en) * 2008-08-29 2010-03-04 David Michael Kirsch System and method for variable text-to-speech with minimized distraction to operator of an automotive vehicle
US20100081476A1 (en) * 2008-09-29 2010-04-01 Microsoft Corporation Glow touch feedback for virtual input devices
US20100121866A1 (en) * 2008-06-12 2010-05-13 Matthew Bell Interactive display management systems and methods
US20100125787A1 (en) * 2008-11-20 2010-05-20 Canon Kabushiki Kaisha Information processing apparatus, processing method thereof, and computer-readable storage medium
US20100124949A1 (en) * 2008-11-14 2010-05-20 Sony Ericsson Mobile Communications Ab Portable communication device and remote motion input device
US20100134409A1 (en) * 2008-11-30 2010-06-03 Lenovo (Singapore) Pte. Ltd. Three-dimensional user interface
US20100190548A1 (en) * 2007-06-22 2010-07-29 Wms Gaming Inc. Wagering game machine with virtual input device
US20100214226A1 (en) * 2009-02-23 2010-08-26 International Business Machines Corporation System and method for semi-transparent display of hands over a keyboard in real-time
US20100214267A1 (en) * 2006-06-15 2010-08-26 Nokia Corporation Mobile device with virtual keypad
US20100235786A1 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
US20100231522A1 (en) * 2005-02-23 2010-09-16 Zienon, Llc Method and apparatus for data entry input
US20100251161A1 (en) * 2009-03-24 2010-09-30 Microsoft Corporation Virtual keyboard with staggered keys
US20100245349A1 (en) * 2009-03-24 2010-09-30 Disney Enterprises, Inc. System and method for determining placement of a virtual object according to a real-time performance
US7809167B2 (en) 2003-10-24 2010-10-05 Matthew Bell Method and system for processing captured image information in an interactive video display system
US20100253636A1 (en) * 2007-10-24 2010-10-07 Stephen Chen Method for correcting typing errors according to character layout positions on a keyboard
US7826074B1 (en) 2005-02-25 2010-11-02 Microsoft Corporation Fast embedded interaction code printing with custom postscript commands
US20100302165A1 (en) * 2009-05-26 2010-12-02 Zienon, Llc Enabling data entry based on differentiated input objects
US20110090151A1 (en) * 2008-04-18 2011-04-21 Shanghai Hanxiang (Cootek) Information Technology Co., Ltd. System capable of accomplishing flexible keyboard layout
US20110160933A1 (en) * 2009-12-25 2011-06-30 Honda Access Corp. Operation apparatus for on-board devices in automobile
US20110164032A1 (en) * 2010-01-07 2011-07-07 Prime Sense Ltd. Three-Dimensional User Interface
US20110181552A1 (en) * 2002-11-04 2011-07-28 Neonode, Inc. Pressure-sensitive touch screen
US20110191332A1 (en) * 2010-02-04 2011-08-04 Veveo, Inc. Method of and System for Updating Locally Cached Content Descriptor Information
US20110191516A1 (en) * 2010-02-04 2011-08-04 True Xiong Universal touch-screen remote controller
US20110242054A1 (en) * 2010-04-01 2011-10-06 Compal Communication, Inc. Projection system with touch-sensitive projection image
WO2011133986A1 (en) * 2010-04-23 2011-10-27 Luo Tong Method for user input from the back panel of a handheld computerized device
WO2011149515A1 (en) * 2010-05-24 2011-12-01 Will John Temple Multidirectional button, key, and keyboard
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
US20120038542A1 (en) * 2010-08-16 2012-02-16 Ken Miyashita Information Processing Apparatus, Information Processing Method and Program
US20120059647A1 (en) * 2010-09-08 2012-03-08 International Business Machines Corporation Touchless Texting Exercise
US8156153B2 (en) 2005-04-22 2012-04-10 Microsoft Corporation Global metadata embedding and decoding
US20120120038A1 (en) * 2009-07-23 2012-05-17 Mccarthy John P Display With An Optical Sensor
US8199108B2 (en) 2002-12-13 2012-06-12 Intellectual Ventures Holding 67 Llc Interactive directed light/sound system
US20120182215A1 (en) * 2011-01-18 2012-07-19 Samsung Electronics Co., Ltd. Sensing module, and graphical user interface (gui) control apparatus and method
US8248385B1 (en) 2011-09-13 2012-08-21 Google Inc. User inputs of a touch sensitive device
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US20120229447A1 (en) * 2011-03-08 2012-09-13 Nokia Corporation Apparatus and associated methods
US20120242659A1 (en) * 2011-03-25 2012-09-27 Hon Hai Precision Industry Co., Ltd. Method of controlling electronic device via a virtual keyboard
US20120274658A1 (en) * 2010-10-14 2012-11-01 Chung Hee Sung Method and system for providing background contents of virtual key input device
US20130019191A1 (en) * 2011-07-11 2013-01-17 International Business Machines Corporation Dynamically customizable touch screen keyboard for adapting to user physiology
US20130076697A1 (en) * 2004-04-29 2013-03-28 Neonode Inc. Light-based touch screen
US20130181904A1 (en) * 2012-01-12 2013-07-18 Fujitsu Limited Device and method for detecting finger position
US20130249821A1 (en) * 2011-09-27 2013-09-26 The Board of Trustees of the Leland Stanford, Junior, University Method and System for Virtual Keyboard
US8549424B2 (en) 2007-05-25 2013-10-01 Veveo, Inc. System and method for text disambiguation and context designation in incremental search
WO2013144807A1 (en) * 2012-03-26 2013-10-03 Primesense Ltd. Enhanced virtual touchpad and touchscreen
US20130275907A1 (en) * 2010-10-14 2013-10-17 University of Technology ,Sydney Virtual keyboard
US8577915B2 (en) 2010-09-10 2013-11-05 Veveo, Inc. Method of and system for conducting personalized federated search and presentation of results therefrom
US20130293398A1 (en) * 2012-05-04 2013-11-07 Yong Yan Separation walls on keypads
CN103558948A (en) * 2013-10-31 2014-02-05 中山大学 Man-machine interaction method applied to virtual optical keyboard
US8655021B2 (en) * 2012-06-25 2014-02-18 Imimtek, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US20140098025A1 (en) * 2012-10-09 2014-04-10 Cho-Yi Lin Portable electrical input device capable of docking an electrical communication device and system thereof
US8830312B2 (en) 2012-06-25 2014-09-09 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching within bounded regions
EP2775379A1 (en) * 2013-03-05 2014-09-10 Funai Electric Co., Ltd. Projector
US8850349B2 (en) 2012-04-06 2014-09-30 Google Inc. Smart user-customized graphical keyboard
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US20140354602A1 (en) * 2013-04-12 2014-12-04 Impression.Pi, Inc. Interactive input system and method
US8907889B2 (en) 2005-01-12 2014-12-09 Thinkoptics, Inc. Handheld vision based absolute pointing system
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US8963883B2 (en) 2011-03-17 2015-02-24 Symbol Technologies, Inc. Touchless interactive display system
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
EP2725458A4 (en) * 2011-06-23 2015-05-13 Fujitsu Ltd Information processing device, input control method, and input control program
US20150153950A1 (en) * 2013-12-02 2015-06-04 Industrial Technology Research Institute System and method for receiving user input and program storage medium thereof
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
CN104951073A (en) * 2015-06-19 2015-09-30 济南大学 Gesture interaction method based on virtual interface
US9152258B2 (en) * 2008-06-19 2015-10-06 Neonode Inc. User interface for a touch screen
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US9166714B2 (en) 2009-09-11 2015-10-20 Veveo, Inc. Method of and system for presenting enriched video viewing analytics
US9176598B2 (en) 2007-05-08 2015-11-03 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer with improved performance
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
EP2860611A4 (en) * 2012-06-08 2016-03-02 Kmt Global Inc User interface method and apparatus based on spatial location recognition
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9310905B2 (en) 2010-04-23 2016-04-12 Handscape Inc. Detachable back mounted touchpad for a handheld computerized device
US9310891B2 (en) 2012-09-04 2016-04-12 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US9311724B2 (en) 2010-04-23 2016-04-12 Handscape Inc. Method for user input from alternative touchpads of a handheld computerized device
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9430147B2 (en) 2010-04-23 2016-08-30 Handscape Inc. Method for user input from alternative touchpads of a computerized system
US20160259402A1 (en) * 2015-03-02 2016-09-08 Koji Masuda Contact detection apparatus, projector apparatus, electronic board apparatus, digital signage apparatus, projector system, and contact detection method
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9504920B2 (en) 2011-04-25 2016-11-29 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US9529523B2 (en) 2010-04-23 2016-12-27 Handscape Inc. Method using a finger above a touchpad for controlling a computerized system
WO2016209520A1 (en) * 2015-06-26 2016-12-29 Intel Corporation Audio augmentation of touch detection for surfaces
EP1580652A3 (en) * 2004-03-26 2017-01-04 Canon Kabushiki Kaisha Information processing apparatus and method
US9542032B2 (en) 2010-04-23 2017-01-10 Handscape Inc. Method using a predicted finger location above a touchpad for controlling a computerized system
US9600078B2 (en) 2012-02-03 2017-03-21 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US9639195B2 (en) 2010-04-23 2017-05-02 Handscape Inc. Method using finger force upon a touchpad for controlling a computerized system
US9678662B2 (en) 2010-04-23 2017-06-13 Handscape Inc. Method for detecting user gestures from alternative touchpads of a handheld computerized device
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US9891820B2 (en) 2010-04-23 2018-02-13 Handscape Inc. Method for controlling a virtual keyboard from a touchpad of a computerized device
US9891821B2 (en) 2010-04-23 2018-02-13 Handscape Inc. Method for controlling a control region of a computerized device from a touchpad
US9933854B2 (en) 2015-01-16 2018-04-03 Samsung Electronics Co., Ltd. Virtual input device and method for receiving user input using the same

Families Citing this family (336)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080122799A1 (en) * 2001-02-22 2008-05-29 Pryor Timothy R Human interfaces for vehicles, homes, and other applications
US7268774B2 (en) * 1998-08-18 2007-09-11 Candledragon, Inc. Tracking motion of a writing instrument
US20100008551A9 (en) * 1998-08-18 2010-01-14 Ilya Schiller Using handwritten information
US7456820B1 (en) * 1999-05-25 2008-11-25 Silverbrook Research Pty Ltd Hand drawing capture via interface surface
US6614422B1 (en) * 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6611252B1 (en) * 2000-05-17 2003-08-26 Dufaux Douglas P. Virtual data input device
KR100865598B1 (en) * 2000-05-29 2008-10-27 브이케이비 인코포레이티드 Virtual data entry device and method for input of alphanumeric and other data
JP5036949B2 (en) * 2000-06-09 2012-09-26 アイデックス・エーエスエー Pointer device
US6803906B1 (en) * 2000-07-05 2004-10-12 Smart Technologies, Inc. Passive touch system and method of detecting user input
US7058204B2 (en) 2000-10-03 2006-06-06 Gesturetek, Inc. Multiple camera control system
US20110212774A1 (en) * 2008-11-14 2011-09-01 Karl Wudtke Terminal including a button and button having projected images and method
US6906793B2 (en) * 2000-12-11 2005-06-14 Canesta, Inc. Methods and devices for charge management for three-dimensional sensing
US20020061217A1 (en) * 2000-11-17 2002-05-23 Robert Hillman Electronic input device
US6690354B2 (en) * 2000-11-19 2004-02-10 Canesta, Inc. Method for enhancing performance in a system utilizing an array of sensors that sense at least two-dimensions
FI113094B (en) * 2000-12-15 2004-02-27 Nokia Corp An improved method and an arrangement for accomplishing a function in an electronic device and the electronic device
US6943774B2 (en) * 2001-04-02 2005-09-13 Matsushita Electric Industrial Co., Ltd. Portable communication terminal, information display device, control input device and control input method
US6904570B2 (en) * 2001-06-07 2005-06-07 Synaptics, Inc. Method and apparatus for controlling a display of data on a display screen
US6727891B2 (en) * 2001-07-03 2004-04-27 Netmor, Ltd. Input device for personal digital assistants
JP2003152851A (en) * 2001-11-14 2003-05-23 Nec Corp Portable terminal
US7257255B2 (en) * 2001-11-21 2007-08-14 Candledragon, Inc. Capturing hand motion
US20030132950A1 (en) * 2001-11-27 2003-07-17 Fahri Surucu Detecting, classifying, and interpreting input events based on stimuli in multiple sensory domains
JP2003233805A (en) * 2001-12-04 2003-08-22 Canon Inc Image input device
WO2003054683A3 (en) * 2001-12-07 2003-12-31 Canesta Inc User interface for electronic devices
KR20030050741A (en) * 2001-12-19 2003-06-25 삼성전자주식회사 Method for inputting character fast and easily in portable device having limited display size and number of key, and Portable device using the same
US6977643B2 (en) * 2002-01-10 2005-12-20 International Business Machines Corporation System and method implementing non-physical pointers for computer devices
US7071924B2 (en) * 2002-01-10 2006-07-04 International Business Machines Corporation User input method and apparatus for handheld computers
US7340077B2 (en) * 2002-02-15 2008-03-04 Canesta, Inc. Gesture recognition system using depth perceptive sensors
GB2386346B (en) * 2002-03-12 2005-06-15 Eleksen Ltd Flexible foldable keyboard
US20030197685A1 (en) * 2002-04-23 2003-10-23 Leland Yi Wireless keyboard with a built-in web camera
US20030226968A1 (en) * 2002-06-10 2003-12-11 Steve Montellese Apparatus and method for inputting data
JP3630153B2 (en) * 2002-07-19 2005-03-16 ソニー株式会社 Information display input device, information display input method, and information processing apparatus
US6922187B2 (en) * 2002-07-23 2005-07-26 International Business Machines Corporation Method and apparatus for implementing a compact portable computer system
US7102615B2 (en) * 2002-07-27 2006-09-05 Sony Computer Entertainment Inc. Man-machine interface using a deformable device
US7151530B2 (en) 2002-08-20 2006-12-19 Canesta, Inc. System and method for determining an input selected by a user through a virtual interface
US20040041716A1 (en) * 2002-08-29 2004-03-04 Compx International Inc. Virtual keyboard and keyboard support arm assembly
US20040041828A1 (en) * 2002-08-30 2004-03-04 Zellhoefer Jon William Adaptive non-contact computer user-interface system and method
US7526120B2 (en) * 2002-09-11 2009-04-28 Canesta, Inc. System and method for providing intelligent airbag deployment
US20040066500A1 (en) * 2002-10-02 2004-04-08 Gokturk Salih Burak Occupancy detection and measurement system and method
US20040075735A1 (en) * 2002-10-17 2004-04-22 Koninklijke Philips Electronics N.V. Method and system for producing a pseudo three-dimensional display utilizing a two-dimensional display device
US6999008B2 (en) * 2002-10-21 2006-02-14 Actisys, Corporation Universal mobile keyboard
US7337410B2 (en) * 2002-11-06 2008-02-26 Julius Lin Virtual workstation
US7774075B2 (en) * 2002-11-06 2010-08-10 Lin Julius J Y Audio-visual three-dimensional input/output
US6954197B2 (en) * 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
US20040113956A1 (en) * 2002-12-12 2004-06-17 International Business Machines Corporation Apparatus and method for providing feedback regarding finger placement relative to an input device
US20040119690A1 (en) * 2002-12-24 2004-06-24 Watters Scott W. System and method to interact remotely with an application displayed on a display device
US7102617B2 (en) * 2002-12-30 2006-09-05 Motorola, Inc. Compact optical pointing apparatus and method
US7215327B2 (en) * 2002-12-31 2007-05-08 Industrial Technology Research Institute Device and method for generating a virtual keyboard/display
US7194699B2 (en) * 2003-01-14 2007-03-20 Microsoft Corporation Animating images to reflect user selection
JP2006513509A (en) * 2003-02-03 2006-04-20 Siemens Aktiengesellschaft Projection of synthetic information
US8508508B2 (en) 2003-02-14 2013-08-13 Next Holdings Limited Touch screen signal processing with single-point calibration
US7629967B2 (en) 2003-02-14 2009-12-08 Next Holdings Limited Touch screen signal processing
US8456447B2 (en) 2003-02-14 2013-06-04 Next Holdings Limited Touch screen signal processing
US7532206B2 (en) 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
JP2006523074A (en) * 2003-04-11 2006-10-05 カネスタ インコーポレイテッド Method and system for differential expansion of the dynamic range of the sensor
US7176438B2 (en) * 2003-04-11 2007-02-13 Canesta, Inc. Method and system to differentially enhance sensor dynamic range using enhanced common mode reset
US7382360B2 (en) * 2003-04-15 2008-06-03 Synaptics Incorporated Methods and systems for changing the appearance of a position sensor with a light effect
CA2530987C (en) * 2003-07-03 2012-04-17 Holotouch, Inc. Holographic human-machine interfaces
US7173605B2 (en) * 2003-07-18 2007-02-06 International Business Machines Corporation Method and apparatus for providing projected user interface for computing device
US8217896B2 (en) * 2003-07-31 2012-07-10 Kye Systems Corporation Computer input device for automatically scrolling
US7808479B1 (en) * 2003-09-02 2010-10-05 Apple Inc. Ambidextrous mouse
US8487915B1 (en) 2003-09-11 2013-07-16 Luidia Inc. Mobile device incorporating projector and pen-location transcription system
US7382356B2 (en) * 2003-09-15 2008-06-03 Sharper Image Corp. Input unit for games and musical keyboards
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US20070030455A1 (en) * 2003-09-18 2007-02-08 Mikio Inoue Information communication terminal
US7439074B2 (en) * 2003-09-30 2008-10-21 Hoa Duc Nguyen Method of analysis of alcohol by mass spectrometry
US7274356B2 (en) 2003-10-09 2007-09-25 Smart Technologies Inc. Apparatus for determining the location of a pointer within a region of interest
US20050148432A1 (en) * 2003-11-03 2005-07-07 Carmein David E.E. Combined omni-directional treadmill and electronic perception technology
US20050096985A1 (en) * 2003-11-04 2005-05-05 Werden Todd C. Business system and method for a virtual point of sale system in a retail store
JP4611667B2 (en) * 2003-11-25 2011-01-12 健爾 西 Information input and storage device, information input device, and information processing apparatus
US7543940B2 (en) * 2003-12-16 2009-06-09 Industrial Technology Research Institute Virtual input element image projection apparatus
US20050141752A1 (en) * 2003-12-31 2005-06-30 France Telecom, S.A. Dynamically modifiable keyboard-style interface
WO2005066744A1 (en) 2003-12-31 2005-07-21 Abb Research Ltd A virtual control panel
US7355593B2 (en) 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
WO2005065034A3 (en) * 2004-01-05 2005-11-10 Dikla Hasson System and method for improving typing skills
EP1710665A4 (en) * 2004-01-15 2012-12-26 Vodafone Plc Mobile communication terminal
US7755608B2 (en) * 2004-01-23 2010-07-13 Hewlett-Packard Development Company, L.P. Systems and methods of interfacing with a machine
US7212109B2 (en) * 2004-02-13 2007-05-01 Ge Medical Systems Global Technology Company, Llc Hygienic input device for medical information systems
JP4522113B2 (en) 2004-03-11 2010-08-11 キヤノン株式会社 Coordinate input device
JP4429047B2 (en) 2004-03-11 2010-03-10 キヤノン株式会社 Coordinate input apparatus, control method therefor, and program
JP2005267424A (en) * 2004-03-19 2005-09-29 Fujitsu Ltd Data input device, information processor, data input method and data input program
US7379562B2 (en) * 2004-03-31 2008-05-27 Microsoft Corporation Determining connectedness and offset of 3D objects relative to an interactive surface
US20050227217A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Template matching on interactive surface
JP2005293473A (en) * 2004-04-05 2005-10-20 Yokogawa Electric Corp Electronic equipment
US20050225473A1 (en) * 2004-04-08 2005-10-13 Alex Hill Infrared emission sensor
US7706638B1 (en) 2004-04-26 2010-04-27 National Semiconductor Corporation System, apparatus and method for color machine vision with black and white photoelectric sensor
US7460110B2 (en) 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
US7394459B2 (en) * 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US7538759B2 (en) 2004-05-07 2009-05-26 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
US8120596B2 (en) 2004-05-21 2012-02-21 Smart Technologies Ulc Tiled touch system
US20050273201A1 (en) * 2004-06-06 2005-12-08 Zukowski Deborra J Method and system for deployment of sensors
US7787706B2 (en) 2004-06-14 2010-08-31 Microsoft Corporation Method for controlling an intensity of an infrared source used to detect objects adjacent to an interactive display surface
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
KR100636483B1 (en) 2004-06-25 2006-10-18 삼성에스디아이 주식회사 Transistor and fabrication method thereof and light emitting display
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US20060036947A1 (en) * 2004-08-10 2006-02-16 Jelley Kevin W User interface controller method and apparatus for a handheld electronic device
US20060050062A1 (en) * 2004-08-19 2006-03-09 Masanori Ozawa Input device
US7576725B2 (en) * 2004-10-19 2009-08-18 Microsoft Corporation Using clear-coded, see-through objects to manipulate virtual objects
KR100663515B1 (en) * 2004-11-08 2007-01-02 삼성전자주식회사 A portable terminal apparatus and method for inputting data for the portable terminal apparatus
DE102005061211A1 (en) * 2004-12-22 2006-09-07 Abb Research Ltd. Man-machine-user interface e.g. mobile telephone, generating method for e.g. controlling industrial robot, involves providing virtual monitoring and/or controlling multi-media object-unit to user for monitoring and/or controlling device
US7467075B2 (en) * 2004-12-23 2008-12-16 Covidien Ag Three-dimensional finite-element code for electrosurgery and thermal ablation simulations
US20130128118A1 (en) * 2004-12-23 2013-05-23 Kuo-Ching Chiang Smart TV with Multiple Sub-Display Windows and the Method of the Same
US20060152482A1 (en) 2005-01-07 2006-07-13 Chauncy Godwin Virtual interface and control device
CN100410857C (en) 2005-01-27 2008-08-13 时代光电科技股份有限公司 Data inputting device
US7539513B2 (en) * 2005-02-02 2009-05-26 National Telephone Products, Inc. Portable phone with ergonomic image projection system
US8009871B2 (en) 2005-02-08 2011-08-30 Microsoft Corporation Method and system to segment depth images and to detect shapes in three-dimensionally acquired data
US20060192763A1 (en) * 2005-02-25 2006-08-31 Ziemkowski Theodore B Sound-based virtual keyboard, device and method
US7656393B2 (en) 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
JP4612853B2 (en) * 2005-03-29 2011-01-12 キヤノン株式会社 Indication position recognition device and an information input apparatus having the same
US7499027B2 (en) * 2005-04-29 2009-03-03 Microsoft Corporation Using a light pointer for input on an interactive display surface
US20060244720A1 (en) * 2005-04-29 2006-11-02 Tracy James L Collapsible projection assembly
US20080246738A1 (en) * 2005-05-04 2008-10-09 Koninklijke Philips Electronics, N.V. System and Method for Projecting Control Graphics
US20060279532A1 (en) * 2005-06-14 2006-12-14 Olszewski Piotr S Data input device controlled by motions of hands and fingers
US7525538B2 (en) * 2005-06-28 2009-04-28 Microsoft Corporation Using same optics to image, illuminate, and project
US7720317B2 (en) * 2005-08-03 2010-05-18 Elan Microelectronics Corporation Handheld image-tracking device within rectangle-shaped two-dimensional sensing array
US7911444B2 (en) * 2005-08-31 2011-03-22 Microsoft Corporation Input method for surface of interactive display
US20070063979A1 (en) * 2005-09-19 2007-03-22 Available For Licensing Systems and methods to provide input/output for a portable data processing device
US20070063982A1 (en) * 2005-09-19 2007-03-22 Tran Bao Q Integrated rendering of sound and image on a display
KR100631779B1 (en) * 2005-10-07 2006-09-27 삼성전자주식회사 Data input apparatus and method for data input detection using the same
US20070114277A1 (en) * 2005-11-21 2007-05-24 International Business Machines Corporation Apparatus and method for commercial transactions
US20070130547A1 (en) * 2005-12-01 2007-06-07 Navisense, Llc Method and system for touchless user interface control
US8279168B2 (en) 2005-12-09 2012-10-02 Edge 3 Technologies Llc Three-dimensional virtual-touch human-machine interface system and method therefor
US8060840B2 (en) 2005-12-29 2011-11-15 Microsoft Corporation Orientation free user interface
WO2007093984A3 (en) * 2006-02-16 2009-04-23 Giora Bar-Sakai A system and method of inputting data into a computing system
US7515143B2 (en) * 2006-02-28 2009-04-07 Microsoft Corporation Uniform illumination of interactive display panel
FR2898315B1 (en) * 2006-03-08 2009-02-20 Peugeot Citroen Automobiles Sa Control interface for fixed or mobile vehicle equipment, using a virtual keyboard
US8334841B2 (en) * 2006-03-13 2012-12-18 Navisense Virtual user interface method and system thereof
US8614669B2 (en) * 2006-03-13 2013-12-24 Navisense Touchless tablet method and system thereof
US8578282B2 (en) * 2006-03-15 2013-11-05 Navisense Visual toolkit for a virtual user interface
KR100756521B1 (en) 2006-05-03 2007-09-10 정성택 Projection keyboard system for child education and method thereof
US7755026B2 (en) * 2006-05-04 2010-07-13 CandleDragon Inc. Generating signals representative of sensed light that is associated with writing being done by a user
WO2007137093A3 (en) * 2006-05-16 2008-07-24 Madentec Systems and methods for a hands free mouse
US7830368B2 (en) 2006-06-06 2010-11-09 3M Innovative Properties Company Keypad with virtual image
US20150121287A1 (en) * 2006-07-03 2015-04-30 Yoram Ben-Meir System for generating and controlling a variably displayable mobile device keypad/virtual keyboard
US8316324B2 (en) * 2006-09-05 2012-11-20 Navisense Method and apparatus for touchless control of a device
US8037414B2 (en) 2006-09-14 2011-10-11 Avaya Inc. Audible computer user interface method and apparatus
US20080176201A1 (en) * 2006-09-20 2008-07-24 Technologies Humanware Canada Inc. User interface for an audio book player
US20080074386A1 (en) * 2006-09-27 2008-03-27 Chia-Hoang Lee Virtual input device and the input method thereof
US20080114615A1 (en) * 2006-11-15 2008-05-15 General Electric Company Methods and systems for gesture-based healthcare application interaction in thin-air display
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
EP1937032A1 (en) * 2006-12-20 2008-06-25 Electrolux Home Products Corporation N.V. Household appliance
KR100796779B1 (en) 2007-01-02 2008-01-22 주식회사 셀런 Method for transreceiving scan data of wireless input device using multi-dimensional space mapping and wireless input device for the same
US20080166175A1 (en) * 2007-01-05 2008-07-10 Candledragon, Inc. Holding and Using an Electronic Pen and Paper
CN100456219C (en) 2007-01-19 2009-01-28 崔永浩 Keyboard with multiplex keystrokes by physical sensing mode and implementation method thereof
US8212857B2 (en) 2007-01-26 2012-07-03 Microsoft Corporation Alternating light sources to reduce specular reflection
EP2135155B1 (en) 2007-04-11 2013-09-18 Next Holdings, Inc. Touch screen system with hover and click input methods
US7895518B2 (en) * 2007-04-27 2011-02-22 Shapewriter Inc. System and method for preview and selection of words
KR100913962B1 (en) * 2007-05-14 2009-08-26 삼성전자주식회사 Method and apparatus of inputting character in Mobile communication terminal
CN101311881A (en) * 2007-05-25 2008-11-26 佛山市顺德区顺达电脑厂有限公司;神基科技股份有限公司 Electronic device auxiliary input method
US7894197B2 (en) * 2007-06-21 2011-02-22 Coretronic Corporation Optical sensing module and display device using the same
US8955984B2 (en) 2008-06-17 2015-02-17 The Invention Science Fund I, Llc Projection associated methods and systems
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
US20090048711A1 (en) * 2007-08-15 2009-02-19 Deline Jonathan E Fuel dispenser
WO2009029767A1 (en) 2007-08-30 2009-03-05 Next Holdings, Inc. Optical touchscreen with improved illumination
WO2009029764A1 (en) 2007-08-30 2009-03-05 Next Holdings, Inc. Low profile touch panel systems
US9001016B2 (en) * 2007-09-19 2015-04-07 Nvidia Corporation Hardware driven display restore mechanism
US9110624B2 (en) 2007-09-21 2015-08-18 Nvidia Corporation Output restoration with input selection
WO2009059479A1 (en) * 2007-11-07 2009-05-14 Pohsien Chiu Input devices with virtual input interfaces
CN101452354B (en) 2007-12-04 2012-08-22 纬创资通股份有限公司 Input method of electronic device, content display method and use thereof
KR101079598B1 (en) * 2007-12-18 2011-11-03 삼성전자주식회사 Display apparatus and control method thereof
US8107250B2 (en) * 2007-12-21 2012-01-31 Coretronic Corporation Display screen and sensor module thereof
US8405636B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly
WO2009099280A3 (en) * 2008-02-05 2009-10-29 Lg Electronics Inc. Input unit and control method thereof
WO2009099296A3 (en) * 2008-02-05 2009-11-05 Lg Electronics Inc. Virtual optical input device for providing various types of interfaces and method of controlling the same
JP4626658B2 (en) * 2008-02-14 2011-02-09 ソニー株式会社 Display device, imaging device, and position detecting device
US8698753B2 (en) * 2008-02-28 2014-04-15 Lg Electronics Inc. Virtual optical input device with feedback and method of controlling the same
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
WO2009148210A1 (en) * 2008-06-02 2009-12-10 Lg Electronics Inc. Virtual optical input unit and control method thereof
US8608321B2 (en) 2008-06-17 2013-12-17 The Invention Science Fund I, Llc Systems and methods for projecting in response to conformation
US8308304B2 (en) 2008-06-17 2012-11-13 The Invention Science Fund I, Llc Systems associated with receiving and transmitting information related to projection
US8936367B2 (en) 2008-06-17 2015-01-20 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8430515B2 (en) 2008-06-17 2013-04-30 The Invention Science Fund I, Llc Systems and methods for projecting
US20090310038A1 (en) 2008-06-17 2009-12-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Projection in response to position
US8944608B2 (en) 2008-06-17 2015-02-03 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8733952B2 (en) 2008-06-17 2014-05-27 The Invention Science Fund I, Llc Methods and systems for coordinated use of two or more user responsive projectors
US8384005B2 (en) 2008-06-17 2013-02-26 The Invention Science Fund I, Llc Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface
US8540381B2 (en) 2008-06-17 2013-09-24 The Invention Science Fund I, Llc Systems and methods for receiving information associated with projecting
US20090309826A1 (en) 2008-06-17 2009-12-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Systems and devices
US8641203B2 (en) 2008-06-17 2014-02-04 The Invention Science Fund I, Llc Methods and systems for receiving and transmitting signals between server and projector apparatuses
US8267526B2 (en) 2008-06-17 2012-09-18 The Invention Science Fund I, Llc Methods associated with receiving and transmitting information related to projection
US8723787B2 (en) 2008-06-17 2014-05-13 The Invention Science Fund I, Llc Methods and systems related to an image capture projection surface
JP5015072B2 (en) * 2008-06-18 2012-08-29 株式会社リコー Input device and an image forming apparatus
US8068641B1 (en) 2008-06-19 2011-11-29 Qualcomm Incorporated Interaction interface for controlling an application
US7777899B1 (en) * 2008-06-19 2010-08-17 Gesturetek, Inc. Interaction interface for controlling an application
US8228345B2 (en) * 2008-09-24 2012-07-24 International Business Machines Corporation Hand image feedback method and system
CN101685342B (en) 2008-09-26 2012-01-25 联想(北京)有限公司 Method and device for realizing dynamic virtual keyboard
US8133119B2 (en) 2008-10-01 2012-03-13 Microsoft Corporation Adaptation for alternate gaming input devices
WO2010042880A3 (en) * 2008-10-10 2010-07-29 Neoflect, Inc. Mobile computing device with a virtual keyboard
US8339378B2 (en) 2008-11-05 2012-12-25 Smart Technologies Ulc Interactive input system with multi-angle reflector
GB0822336D0 (en) * 2008-12-08 2009-01-14 Light Blue Optics Ltd Holographic image projection systems
US20100164869A1 (en) * 2008-12-26 2010-07-01 Frank Huang Virtual keyboard structure of electric device and data inputting method thereof
US8294767B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Body scan
US9652030B2 (en) * 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US8866821B2 (en) 2009-01-30 2014-10-21 Microsoft Corporation Depth map movement tracking via optical flow and velocity prediction
US8295546B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
JP5201015B2 (en) * 2009-03-09 2013-06-05 ブラザー工業株式会社 Head-mounted display
US8773355B2 (en) 2009-03-16 2014-07-08 Microsoft Corporation Adaptive cursor sizing
US9256282B2 (en) 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US20100251176A1 (en) * 2009-03-24 2010-09-30 Microsoft Corporation Virtual keyboard with slider buttons
US20110256927A1 (en) 2009-03-25 2011-10-20 MEP Games Inc. Projection of interactive game environment
US9317109B2 (en) * 2012-07-12 2016-04-19 Mep Tech, Inc. Interactive image projection accessory
CN201444297U (en) 2009-03-27 2010-04-28 宸鸿光电科技股份有限公司 Touch device, laser source group thereof and laser source structure thereof
DE202009005254U1 (en) 2009-03-27 2009-11-12 Tpk Touch Solutions Inc. Touch device, laser source module and laser source structure
US8340432B2 (en) 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US8942428B2 (en) 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US9498718B2 (en) 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US8253746B2 (en) 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
US8638985B2 (en) 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US9015638B2 (en) 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US8181123B2 (en) 2009-05-01 2012-05-15 Microsoft Corporation Managing virtual port associations to users in a gesture-based computing environment
US9377857B2 (en) 2009-05-01 2016-06-28 Microsoft Technology Licensing, Llc Show body position
US8503720B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US8416193B2 (en) * 2009-05-21 2013-04-09 Microsoft Corporation Method of visualizing an input location
US9417700B2 (en) 2009-05-21 2016-08-16 Edge3 Technologies Gesture recognition systems and related methods
US8803889B2 (en) 2009-05-29 2014-08-12 Microsoft Corporation Systems and methods for applying animations or motions to a character
US8320619B2 (en) 2009-05-29 2012-11-27 Microsoft Corporation Systems and methods for tracking a model
US8379101B2 (en) 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US9182814B2 (en) 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
US8145594B2 (en) 2009-05-29 2012-03-27 Microsoft Corporation Localized gesture aggregation
US8176442B2 (en) 2009-05-29 2012-05-08 Microsoft Corporation Living cursor control mechanics
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8625837B2 (en) 2009-05-29 2014-01-07 Microsoft Corporation Protocol and format for communicating an image from a camera to a computing environment
US8418085B2 (en) 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US8856691B2 (en) 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US8509479B2 (en) 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US8542252B2 (en) 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US7914344B2 (en) 2009-06-03 2011-03-29 Microsoft Corporation Dual-barrel, connector jack and plug assemblies
US8390680B2 (en) 2009-07-09 2013-03-05 Microsoft Corporation Visual representation expression based on player expression
US9159151B2 (en) 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
WO2011011024A1 (en) * 2009-07-23 2011-01-27 Hewlett-Packard Development Company, L.P. Display with an optical sensor
JP5127792B2 (en) * 2009-08-18 2013-01-23 キヤノン株式会社 Information processing apparatus, control method, program, and recording medium
US9141193B2 (en) 2009-08-31 2015-09-22 Microsoft Technology Licensing, Llc Techniques for using human gestures to control gesture unaware programs
DE202009005253U1 (en) 2009-09-09 2010-01-07 Tpk Touch Solutions Inc. Touch device, laser source module and laser source structure
US9122393B2 (en) * 2009-09-30 2015-09-01 At&T Mobility Ii Llc Predictive sensitized keypad
US8810516B2 (en) * 2009-09-30 2014-08-19 At&T Mobility Ii Llc Angular sensitized keypad
US9128610B2 (en) * 2009-09-30 2015-09-08 At&T Mobility Ii Llc Virtual predictive keypad
US8816965B2 (en) * 2009-09-30 2014-08-26 At&T Mobility Ii Llc Predictive force sensitive keypad
US8812972B2 (en) * 2009-09-30 2014-08-19 At&T Intellectual Property I, L.P. Dynamic generation of soft keyboards for mobile devices
US20110074692A1 (en) * 2009-09-30 2011-03-31 At&T Mobility Ii Llc Devices and Methods for Conforming a Virtual Keyboard
JP5506375B2 (en) * 2009-12-25 2014-05-28 キヤノン株式会社 Information processing apparatus and control method thereof
US20110165923A1 (en) 2010-01-04 2011-07-07 Davis Mark L Electronic circle game system
JP5499762B2 (en) * 2010-02-24 2014-05-21 ソニー株式会社 Image processing apparatus, image processing method, program and image processing system
US9665278B2 (en) * 2010-02-26 2017-05-30 Microsoft Technology Licensing, Llc Assisting input from a keyboard
EP2363055A1 (en) 2010-03-01 2011-09-07 Electrolux Home Products Corporation N.V. Projector and household appliance comprising such a projector
GB201004857D0 (en) * 2010-03-23 2010-05-05 Secretpc Ltd Projection device
US8818027B2 (en) 2010-04-01 2014-08-26 Qualcomm Incorporated Computing device interface
US8396252B2 (en) 2010-05-20 2013-03-12 Edge 3 Technologies Systems and related methods for three dimensional gesture recognition in vehicles
US8918734B2 (en) 2010-07-28 2014-12-23 Nuance Communications, Inc. Reduced keyboard with prediction solutions when input is a partial sliding trajectory
US8449118B2 (en) 2010-08-13 2013-05-28 T-Mobile Usa, Inc. Device-adjacent ambiently displayed image
US8451192B2 (en) 2010-08-13 2013-05-28 T-Mobile Usa, Inc. Utilization of interactive device-adjacent ambiently displayed images
US8467599B2 (en) 2010-09-02 2013-06-18 Edge 3 Technologies, Inc. Method and apparatus for confusion learning
US8666144B2 (en) 2010-09-02 2014-03-04 Edge 3 Technologies, Inc. Method and apparatus for determining disparity of texture
US8655093B2 (en) 2010-09-02 2014-02-18 Edge 3 Technologies, Inc. Method and apparatus for performing segmentation of an image
US20120062518A1 (en) * 2010-09-09 2012-03-15 Light Blue Optics Ltd Touch Sensing Systems
DE112010005854T5 (en) * 2010-10-05 2013-08-14 Hewlett-Packard Development Company, L.P. Inputting a command
US20130201102A1 (en) * 2010-10-22 2013-08-08 Sony Ericsson Mobile Communications Ab Mobile communication device with three-dimensional sensing and a method therefor
CN102467298A (en) * 2010-11-18 2012-05-23 西安龙飞软件有限公司 Implementation mode of virtual mobile phone keyboard
US9019239B2 (en) 2010-11-29 2015-04-28 Northrop Grumman Systems Corporation Creative design systems and methods
US8839134B2 (en) * 2010-12-24 2014-09-16 Intel Corporation Projection interface techniques
WO2012089577A1 (en) * 2010-12-30 2012-07-05 Danmarks Tekniske Universitet Input device with three-dimensional image display
US8970589B2 (en) 2011-02-10 2015-03-03 Edge 3 Technologies, Inc. Near-touch interaction with a stereo camera grid structured tessellations
US8582866B2 (en) 2011-02-10 2013-11-12 Edge 3 Technologies, Inc. Method and apparatus for disparity computation in stereo images
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US8928589B2 (en) * 2011-04-20 2015-01-06 Qualcomm Incorporated Virtual keyboards and methods of providing the same
US20120306817A1 (en) * 2011-05-30 2012-12-06 Era Optoelectronics Inc. Floating virtual image touch sensing apparatus
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
JP5914992B2 (en) * 2011-06-02 2016-05-11 ソニー株式会社 Display control device, display control method, and program
JP5880916B2 (en) * 2011-06-03 2016-03-09 ソニー株式会社 Information processing apparatus, information processing method, and program
GB201110156D0 (en) 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch-sensitive display devices
GB201110159D0 (en) 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch sensitive display devices
GB201110157D0 (en) 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch sensitive display devices
US20140189569A1 (en) * 2011-07-18 2014-07-03 Syntellia, Inc. User interface for text input on three dimensional interface
US8971572B1 (en) 2011-08-12 2015-03-03 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
GB201117542D0 (en) 2011-10-11 2011-11-23 Light Blue Optics Ltd Touch-sensitive display devices
US9672609B1 (en) 2011-11-11 2017-06-06 Edge 3 Technologies, Inc. Method and apparatus for improved depth-map estimation
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
CN103176592B (en) * 2011-12-22 2015-09-30 光宝科技股份有限公司 Virtual projection input system and input detection method
GB201413670D0 (en) 2012-01-20 2014-09-17 Light Blue Optics Ltd Touch sensitive image display devices
WO2013108032A1 (en) 2012-01-20 2013-07-25 Light Blue Optics Limited Touch sensitive image display devices
US9880629B2 (en) 2012-02-24 2018-01-30 Thomas J. Moscarillo Gesture recognition devices and methods with user authentication
GB201205303D0 (en) 2012-03-26 2012-05-09 Light Blue Optics Ltd Touch sensing systems
US8928590B1 (en) * 2012-04-03 2015-01-06 Edge 3 Technologies, Inc. Gesture keyboard method and apparatus
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
KR20130115750A (en) * 2012-04-13 2013-10-22 포항공과대학교 산학협력단 Method for recognizing key input on a virtual keyboard and apparatus for the same
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
US9122395B2 (en) * 2012-05-29 2015-09-01 Garett Engle Method of capturing system input by relative finger positioning
US20150235447A1 (en) * 2013-07-12 2015-08-20 Magic Leap, Inc. Method and system for generating map data from an image
CN102778951B (en) * 2012-06-15 2016-02-10 惠州华阳通用电子有限公司 Virtual key input device and input method
JP5962249B2 (en) * 2012-06-21 2016-08-03 富士通株式会社 Character input program, information processing apparatus, and character input method
DE102012013503B4 (en) * 2012-07-06 2014-10-09 Audi Ag Method and control system for operating a motor vehicle
US9305229B2 (en) 2012-07-30 2016-04-05 Bruno Delean Method and system for vision based interfacing with a computer
CN103729132B (en) * 2012-10-15 2017-09-29 联想(北京)有限公司 Virtual keyboard character input method, apparatus, and electronic device
KR20140055173A (en) 2012-10-30 2014-05-09 삼성전자주식회사 Input apparatus and input controlling method thereof
JP2014109876A (en) * 2012-11-30 2014-06-12 Toshiba Corp Information processor, information processing method and program
DE102013000072A1 (en) * 2013-01-08 2014-07-10 Audi Ag Operator interface for a handwritten character input into a device
JP6037900B2 (en) * 2013-03-11 2016-12-07 日立マクセル株式会社 Operation detection device and operation detection method
JP6171502B2 (en) * 2013-04-04 2017-08-02 船井電機株式会社 Projector and electronic device having projector function
CN104423578A (en) * 2013-08-25 2015-03-18 何安莉 Interactive input system and method
US9778546B2 (en) 2013-08-15 2017-10-03 Mep Tech, Inc. Projector for projecting visible and non-visible images
WO2015030795A1 (en) * 2013-08-30 2015-03-05 Hewlett Packard Development Company, L.P. Touch input association
KR20150057080A (en) * 2013-11-18 2015-05-28 삼성전자주식회사 Apparatas and method for changing a input mode according to input method in an electronic device
US9529465B2 (en) * 2013-12-02 2016-12-27 At&T Intellectual Property I, L.P. Secure interaction with input devices
CN103631382A (en) * 2013-12-20 2014-03-12 大连大学 Laser projection virtual keyboard
CN103809756A (en) * 2014-02-24 2014-05-21 联想(北京)有限公司 Information processing method and electronic device
CN103793061B (en) * 2014-03-03 2017-01-11 联想(北京)有限公司 Control method and electronic device
FR3020480A1 (en) * 2014-04-24 2015-10-30 Vincent Donnet Device and method for controlling an interface of a communication terminal
CN104049772B (en) * 2014-05-30 2017-11-07 北京搜狗科技发展有限公司 Input method, apparatus, and system
US9766806B2 (en) 2014-07-15 2017-09-19 Microsoft Technology Licensing, Llc Holographic keyboard display
CN105451052A (en) * 2014-08-28 2016-03-30 鸿富锦精密工业(深圳)有限公司 Virtual keyboard building method and system
CN105468209A (en) * 2014-09-25 2016-04-06 硕擎科技股份有限公司 Virtual two-dimensional positioning module of input device and virtual input device
US9733048B2 (en) 2015-01-06 2017-08-15 Egismos Technology Corporation Shooting training and game system with virtual target
US9720550B2 (en) 2015-04-21 2017-08-01 Dell Products L.P. Adaptable input active zones at an information handling system projected user interface
US9690400B2 (en) 2015-04-21 2017-06-27 Dell Products L.P. Information handling system interactive totems
US9804718B2 (en) 2015-04-21 2017-10-31 Dell Products L.P. Context based peripheral management for interacting with an information handling system
US9720446B2 (en) 2015-04-21 2017-08-01 Dell Products L.P. Information handling system projected work space calibration
US9921644B2 (en) 2015-04-21 2018-03-20 Dell Products L.P. Information handling system non-linear user interface
US9753591B2 (en) 2015-04-21 2017-09-05 Dell Products L.P. Capacitive mat information handling system display and totem interactions
US9791979B2 (en) * 2015-04-21 2017-10-17 Dell Products L.P. Managing inputs at an information handling system by adaptive infrared illumination and detection
US9983717B2 (en) * 2015-04-21 2018-05-29 Dell Products L.P. Disambiguation of false touch inputs at an information handling system projected user interface
US9804733B2 (en) 2015-04-21 2017-10-31 Dell Products L.P. Dynamic cursor focus in a multi-display information handling system environment
CN104881135B (en) * 2015-05-28 2018-07-03 联想(北京)有限公司 Information processing method and electronic device
CN106325488A (en) * 2015-07-09 2017-01-11 北京搜狗科技发展有限公司 Input method, input device, server and input system
US20170025041A1 (en) * 2015-07-26 2017-01-26 Hagar Shema Apparatus and method for teaching persons
US9898809B2 (en) * 2015-11-10 2018-02-20 Nanjing University Systems, methods and techniques for inputting text into mobile devices using a camera-based keyboard
US20170161903A1 (en) * 2015-12-03 2017-06-08 Calay Venture S.á r.l. Method and apparatus for gesture recognition
CN205540572U (en) * 2016-03-08 2016-08-31 硕擎科技股份有限公司 Virtual input device for use with a mobile phone
USD795349S1 (en) * 2016-05-24 2017-08-22 Tangible Play, Inc. Programming tile
USD795348S1 (en) * 2016-05-24 2017-08-22 Tangible Play, Inc. Programming tile
USD812143S1 (en) * 2016-05-24 2018-03-06 Tangible Play, Inc. Programming tile
USD811485S1 (en) * 2016-05-24 2018-02-27 Tangible Play, Inc. Programming tile
USD811486S1 (en) * 2016-05-24 2018-02-27 Tangible Play, Inc. Programming tile

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5174759A (en) * 1988-08-04 1992-12-29 Preston Frank S TV animation interactively controlled by the viewer through input above a book page
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
DE19539955A1 (en) * 1995-10-26 1997-04-30 Sick Ag Optical detection means
US6064354A (en) * 1998-07-01 2000-05-16 Deluca; Michael Joseph Stereoscopic user interface method and apparatus

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6424334B1 (en) * 1987-03-17 2002-07-23 Sun Microsystems, Inc. Computer data entry and manipulation apparatus and method
US5198877A (en) * 1990-10-15 1993-03-30 Pixsys, Inc. Method and apparatus for three-dimensional non-contact shape sensing
US5168531A (en) * 1991-06-27 1992-12-01 Digital Equipment Corporation Real-time recognition of pointing information from video
US5767842A (en) * 1992-02-07 1998-06-16 International Business Machines Corporation Method and device for optical input of commands or data
US5617312A (en) * 1993-11-19 1997-04-01 Hitachi, Ltd. Computer system that enters control information by means of video camera
US5969698A (en) * 1993-11-29 1999-10-19 Motorola, Inc. Manually controllable cursor and control panel in a virtual image
US6281878B1 (en) * 1994-11-01 2001-08-28 Stephen V. R. Montellese Apparatus and method for inputting data
US6229913B1 (en) * 1995-06-07 2001-05-08 The Trustees Of Columbia University In The City Of New York Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two images due to defocus
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5798519A (en) * 1996-02-12 1998-08-25 Golf Age Technologies, Inc. Method of and apparatus for golf driving range distancing using focal plane array
US6121960A (en) * 1996-08-28 2000-09-19 Via, Inc. Touch screen systems and methods
US6252598B1 (en) * 1997-07-03 2001-06-26 Lucent Technologies Inc. Video hand image computer interface
US6115128A (en) * 1997-09-17 2000-09-05 The Regents Of The University Of California Multi-dimensional position sensor using range detectors
US6037882A (en) * 1997-09-30 2000-03-14 Levy; David H. Method and apparatus for inputting data to an electronic system
US6043805A (en) * 1998-03-24 2000-03-28 Hsieh; Kuan-Hong Controlling method for inputting messages to a computer
US6266048B1 (en) * 1998-08-27 2001-07-24 Hewlett-Packard Company Method and apparatus for a virtual display/keyboard for a PDA
US6512838B1 (en) * 1999-09-22 2003-01-28 Canesta, Inc. Methods for enhancing performance and data acquired from three-dimensional image systems
US6674895B2 (en) * 1999-09-22 2004-01-06 Canesta, Inc. Methods for enhancing performance and data acquired from three-dimensional image systems
US6614422B1 (en) * 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6678039B2 (en) * 2001-05-23 2004-01-13 Canesta, Inc. Method and system to enhance dynamic range conversion useable with CMOS three-dimensional imaging

Cited By (359)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070135984A1 (en) * 1992-05-05 2007-06-14 Automotive Technologies International, Inc. Arrangement and Method for Obtaining Information Using Phase Difference of Modulated Illumination
US7831358B2 (en) 1992-05-05 2010-11-09 Automotive Technologies International, Inc. Arrangement and method for obtaining information using phase difference of modulated illumination
US7804494B2 (en) 1998-12-15 2010-09-28 Intel Corporation Pointing device with integrated audio input and associated methods
US7656397B2 (en) 1998-12-15 2010-02-02 Intel Corporation Pointing device with integrated audio input and associated methods
US20080170049A1 (en) * 1998-12-15 2008-07-17 Larson Jim A Pointing device with integrated audio input and associated methods
US7233321B1 (en) 1998-12-15 2007-06-19 Intel Corporation Pointing device with integrated audio input
US20070222760A1 (en) * 2001-01-08 2007-09-27 Vkb Inc. Data input device
US7893924B2 (en) 2001-01-08 2011-02-22 Vkb Inc. Data input device
US6968073B1 (en) 2001-04-24 2005-11-22 Automotive Systems Laboratory, Inc. Occupant detection system
US7834846B1 (en) 2001-06-05 2010-11-16 Matthew Bell Interactive video display system
US20020186221A1 (en) * 2001-06-05 2002-12-12 Reactrix Systems, Inc. Interactive video display system
US20080062123A1 (en) * 2001-06-05 2008-03-13 Reactrix Systems, Inc. Interactive video display system using strobed light
US7259747B2 (en) 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US8300042B2 (en) 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
US20080150890A1 (en) * 2002-05-28 2008-06-26 Matthew Bell Interactive Video Window
US8035624B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Computer vision based touch screen
US20080150913A1 (en) * 2002-05-28 2008-06-26 Matthew Bell Computer vision based touch screen
US8035614B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Interactive video window
US20050162381A1 (en) * 2002-05-28 2005-07-28 Matthew Bell Self-contained interactive video display system
US20060139314A1 (en) * 2002-05-28 2006-06-29 Matthew Bell Interactive video display system
US20050122308A1 (en) * 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
US7170492B2 (en) 2002-05-28 2007-01-30 Reactrix Systems, Inc. Interactive video display system
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
US20060132432A1 (en) * 2002-05-28 2006-06-22 Matthew Bell Interactive video display system
US8035612B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US7348963B2 (en) 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
US7477243B2 (en) * 2002-05-31 2009-01-13 Eit Co., Ltd. Apparatus for controlling the shift of virtual space and method and program for controlling same
US20040051709A1 (en) * 2002-05-31 2004-03-18 Eit Co., Ltd. Apparatus for controlling the shift of virtual space and method and program for controlling same
US20040207732A1 (en) * 2002-06-26 2004-10-21 Klony Lieberman Multifunctional integrated image sensor and application to virtual interface technology
US7417681B2 (en) * 2002-06-26 2008-08-26 Vkb Inc. Multifunctional integrated image sensor and application to virtual interface technology
US20040136083A1 (en) * 2002-10-31 2004-07-15 Microsoft Corporation Optical system design for a universal computing device
US20060182309A1 (en) * 2002-10-31 2006-08-17 Microsoft Corporation Passive embedded interaction coding
US7133031B2 (en) * 2002-10-31 2006-11-07 Microsoft Corporation Optical system design for a universal computing device
US20060109263A1 (en) * 2002-10-31 2006-05-25 Microsoft Corporation Universal computing device
US7684618B2 (en) 2002-10-31 2010-03-23 Microsoft Corporation Passive embedded interaction coding
US20040085286A1 (en) * 2002-10-31 2004-05-06 Microsoft Corporation Universal computing device
US7009594B2 (en) 2002-10-31 2006-03-07 Microsoft Corporation Universal computing device
US20110181552A1 (en) * 2002-11-04 2011-07-28 Neonode, Inc. Pressure-sensitive touch screen
US8896575B2 (en) * 2002-11-04 2014-11-25 Neonode Inc. Pressure-sensitive touch screen
US20040183775A1 (en) * 2002-12-13 2004-09-23 Reactrix Systems Interactive directed light/sound system
US8199108B2 (en) 2002-12-13 2012-06-12 Intellectual Ventures Holding 67 Llc Interactive directed light/sound system
US7576727B2 (en) 2002-12-13 2009-08-18 Matthew Bell Interactive directed light/sound system
US20050275638A1 (en) * 2003-03-28 2005-12-15 Microsoft Corporation Dynamic feedback for gestures
US7886236B2 (en) * 2003-03-28 2011-02-08 Microsoft Corporation Dynamic feedback for gestures
US7581182B1 (en) * 2003-07-18 2009-08-25 Nvidia Corporation Apparatus, method, and 3D graphical user interface for media centers
US7406181B2 (en) 2003-10-03 2008-07-29 Automotive Systems Laboratory, Inc. Occupant detection system
US20050111700A1 (en) * 2003-10-03 2005-05-26 O'boyle Michael E. Occupant detection system
US20090235295A1 (en) * 2003-10-24 2009-09-17 Matthew Bell Method and system for managing an interactive video display system
US7809167B2 (en) 2003-10-24 2010-10-05 Matthew Bell Method and system for processing captured image information in an interactive video display system
US8487866B2 (en) 2003-10-24 2013-07-16 Intellectual Ventures Holding 67 Llc Method and system for managing an interactive video display system
US20080297614A1 (en) * 2003-10-31 2008-12-04 Klony Lieberman Optical Apparatus for Virtual Interface Projection and Sensing
US20050193292A1 (en) * 2004-01-06 2005-09-01 Microsoft Corporation Enhanced approach of m-array decoding and error correction
US20080025612A1 (en) * 2004-01-16 2008-01-31 Microsoft Corporation Strokes Localization by m-Array Decoding and Fast Image Matching
US8139141B2 (en) 2004-01-28 2012-03-20 Microsoft Corporation Single chip red, green, blue, distance (RGB-Z) sensor
US20050285966A1 (en) * 2004-01-28 2005-12-29 Canesta, Inc. Single chip red, green, blue, distance (RGB-Z) sensor
WO2005091651A3 (en) * 2004-03-18 2006-01-12 Reactrix Systems Inc Interactive video display system
WO2005091651A2 (en) * 2004-03-18 2005-09-29 Reactrix Systems, Inc. Interactive video display system
EP1580652A3 (en) * 2004-03-26 2017-01-04 Canon Kabushiki Kaisha Information processing apparatus and method
US20130076697A1 (en) * 2004-04-29 2013-03-28 Neonode Inc. Light-based touch screen
US20090158191A1 (en) * 2004-06-15 2009-06-18 Research In Motion Limited Virtual keypad for touchscreen display
US7907124B2 (en) 2004-08-06 2011-03-15 Touchtable, Inc. Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
US8269739B2 (en) 2004-08-06 2012-09-18 Touchtable, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US20070046643A1 (en) * 2004-08-06 2007-03-01 Hillis W Daniel State-Based Approach to Gesture Identification
US8072439B2 (en) 2004-08-06 2011-12-06 Touchtable, Inc. Touch detecting interactive display
US7724242B2 (en) 2004-08-06 2010-05-25 Touchtable, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US8624863B2 (en) 2004-08-06 2014-01-07 Qualcomm Incorporated Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US8188985B2 (en) 2004-08-06 2012-05-29 Touchtable, Inc. Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
US7719523B2 (en) 2004-08-06 2010-05-18 Touchtable, Inc. Bounding box gesture recognition on a touch detecting interactive display
WO2006017695A3 (en) * 2004-08-06 2006-07-06 Applied Minds Inc Touch detecting interactive display background
US8669958B2 (en) 2004-08-06 2014-03-11 Qualcomm Incorporated Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
US20060125799A1 (en) * 2004-08-06 2006-06-15 Hillis W D Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US20100318904A1 (en) * 2004-08-06 2010-12-16 Touchtable, Inc. Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
US20100039446A1 (en) * 2004-08-06 2010-02-18 Applied Minds, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US8692792B2 (en) 2004-08-06 2014-04-08 Qualcomm Incorporated Bounding box gesture recognition on a touch detecting interactive display
WO2006017695A2 (en) * 2004-08-06 2006-02-16 Applied Minds, Inc. Touch detecting interactive display background
US8139043B2 (en) 2004-08-06 2012-03-20 Touchtable, Inc. Bounding box gesture recognition on a touch detecting interactive display
US8665239B2 (en) 2004-08-06 2014-03-04 Qualcomm Incorporated Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
US20060288313A1 (en) * 2004-08-06 2006-12-21 Hillis W D Bounding box gesture recognition on a touch detecting interactive display
US20060031786A1 (en) * 2004-08-06 2006-02-09 Hillis W D Method and apparatus continuing action of user gestures performed upon a touch sensitive interactive display in simulation of inertia
US20100117979A1 (en) * 2004-08-06 2010-05-13 Touchtable, Inc. Bounding box gesture recognition on a touch detecting interactive display
US7728821B2 (en) 2004-08-06 2010-06-01 Touchtable, Inc. Touch detecting interactive display
US20070266406A1 (en) * 2004-11-09 2007-11-15 Murali Aravamudan Method and system for performing actions using a non-intrusive television with reduced text input
US20060101504A1 (en) * 2004-11-09 2006-05-11 Veveo.Tv, Inc. Method and system for performing searches for television content and channels using a non-intrusive television interface and with reduced text input
US20060101499A1 (en) * 2004-11-09 2006-05-11 Veveo, Inc. Method and system for secure sharing, gifting, and purchasing of content on television and mobile devices
US7895218B2 (en) 2004-11-09 2011-02-22 Veveo, Inc. Method and system for performing searches for television content using reduced text input
US9135337B2 (en) 2004-11-09 2015-09-15 Veveo, Inc. Method and system for performing searches for television content using reduced text input
US20060101503A1 (en) * 2004-11-09 2006-05-11 Veveo.Tv, Inc. Method and system for performing searches for television content using reduced text input
US8907889B2 (en) 2005-01-12 2014-12-09 Thinkoptics, Inc. Handheld vision based absolute pointing system
US20100231522A1 (en) * 2005-02-23 2010-09-16 Zienon, Llc Method and apparatus for data entry input
US9274551B2 (en) 2005-02-23 2016-03-01 Zienon, Llc Method and apparatus for data entry input
US9122316B2 (en) 2005-02-23 2015-09-01 Zienon, Llc Enabling data entry based on differentiated input objects
US20060190836A1 (en) * 2005-02-23 2006-08-24 Wei Ling Su Method and apparatus for data entry input
US9760214B2 (en) 2005-02-23 2017-09-12 Zienon, Llc Method and apparatus for data entry input
US7670006B2 (en) 2005-02-24 2010-03-02 Vkb Inc. System and method for projection
US8243015B2 (en) 2005-02-24 2012-08-14 Vkb Inc. Virtual data entry device
US20060187199A1 (en) * 2005-02-24 2006-08-24 Vkb Inc. System and method for projection
US20060187198A1 (en) * 2005-02-24 2006-08-24 Vkb Inc. Input device
US7826074B1 (en) 2005-02-25 2010-11-02 Microsoft Corporation Fast embedded interaction code printing with custom postscript commands
US20060215913A1 (en) * 2005-03-24 2006-09-28 Microsoft Corporation Maze pattern analysis with image matching
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US8156153B2 (en) 2005-04-22 2012-04-10 Microsoft Corporation Global metadata embedding and decoding
US20060242562A1 (en) * 2005-04-22 2006-10-26 Microsoft Corporation Embedded method for embedded interaction code array
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
US7729539B2 (en) 2005-05-31 2010-06-01 Microsoft Corporation Fast error-correcting of embedded interaction codes
US20090027241A1 (en) * 2005-05-31 2009-01-29 Microsoft Corporation Fast error-correcting of embedded interaction codes
US20060274948A1 (en) * 2005-06-02 2006-12-07 Microsoft Corporation Stroke localization and binding to electronic document
US9031962B2 (en) 2005-06-30 2015-05-12 Veveo, Inc. Method and system for incremental search with reduced text entry where the relevance of results is a dynamically computed function of user input search string character count
US8122034B2 (en) 2005-06-30 2012-02-21 Veveo, Inc. Method and system for incremental search with reduced text entry where the relevance of results is a dynamically computed function of user input search string character count
US20070005563A1 (en) * 2005-06-30 2007-01-04 Veveo, Inc. Method and system for incremental search with reduced text entry where the relevance of results is a dynamically computed function of user input search string character count
US20070008293A1 (en) * 2005-07-06 2007-01-11 International Business Machines Corporation Touch sensitive device and display
US20070019103A1 (en) * 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US20070019099A1 (en) * 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
WO2007019443A1 (en) * 2005-08-05 2007-02-15 Reactrix Systems, Inc. Interactive video display system
US7817816B2 (en) 2005-08-17 2010-10-19 Microsoft Corporation Embedded interaction code enabled surface type identification
US20070041654A1 (en) * 2005-08-17 2007-02-22 Microsoft Corporation Embedded interaction code enabled surface type identification
US20110173205A1 (en) * 2005-08-26 2011-07-14 Veveo, Inc. Method and system for dynamically processing ambiguous, reduced text search queries and highlighting results thereof
US7779011B2 (en) 2005-08-26 2010-08-17 Veveo, Inc. Method and system for dynamically processing ambiguous, reduced text search queries and highlighting results thereof
WO2007025119A3 (en) * 2005-08-26 2007-11-22 Murali Aravamudan User interface for visual cooperation between text input and display device
US9177081B2 (en) 2005-08-26 2015-11-03 Veveo, Inc. Method and system for processing ambiguous, multi-term search queries
US7937394B2 (en) 2005-08-26 2011-05-03 Veveo, Inc. Method and system for dynamically processing ambiguous, reduced text search queries and highlighting results thereof
WO2007025119A2 (en) * 2005-08-26 2007-03-01 Veveo, Inc. User interface for visual cooperation between text input and display device
US8433696B2 (en) 2005-08-26 2013-04-30 Veveo, Inc. Method and system for processing ambiguous, multiterm search queries
US20070050337A1 (en) * 2005-08-26 2007-03-01 Veveo, Inc. Method and system for dynamically processing ambiguous, reduced text search queries and highlighting results thereof
US7788266B2 (en) 2005-08-26 2010-08-31 Veveo, Inc. Method and system for processing ambiguous, multi-term search queries
US7737999B2 (en) 2005-08-26 2010-06-15 Veveo, Inc. User interface for visual cooperation between text input and display device
US20070061321A1 (en) * 2005-08-26 2007-03-15 Veveo.Tv, Inc. Method and system for processing ambiguous, multi-term search queries
US20100306691A1 (en) * 2005-08-26 2010-12-02 Veveo, Inc. User Interface for Visual Cooperation Between Text Input and Display Device
US20070061754A1 (en) * 2005-08-26 2007-03-15 Veveo, Inc. User interface for visual cooperation between text input and display device
US20070130128A1 (en) * 2005-11-23 2007-06-07 Veveo, Inc. System and method for finding desired results by incremental search using an ambiguous keypad with the input containing orthographic and typographic errors
US7644054B2 (en) 2005-11-23 2010-01-05 Veveo, Inc. System and method for finding desired results by incremental search using an ambiguous keypad with the input containing orthographic and typographic errors
US8370284B2 (en) 2005-11-23 2013-02-05 Veveo, Inc. System and method for finding desired results by incremental search using an ambiguous keypad with the input containing orthographic and/or typographic errors
US20100153380A1 (en) * 2005-11-23 2010-06-17 Veveo, Inc. System And Method For Finding Desired Results By Incremental Search Using An Ambiguous Keypad With The Input Containing Orthographic And/Or Typographic Errors
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
US20070260703A1 (en) * 2006-01-27 2007-11-08 Sankar Ardhanari Methods and systems for transmission of subsequences of incremental query actions and selection of content items based on later received subsequences
US7739280B2 (en) 2006-03-06 2010-06-15 Veveo, Inc. Methods and systems for selecting and presenting content based on user preference information extracted from an aggregate preference signature
US20070271205A1 (en) * 2006-03-06 2007-11-22 Murali Aravamudan Methods and systems for selecting and presenting content based on learned periodicity of user content selection
US7774294B2 (en) 2006-03-06 2010-08-10 Veveo, Inc. Methods and systems for selecting and presenting content based on learned periodicity of user content selection
US8073848B2 (en) 2006-03-06 2011-12-06 Veveo, Inc. Methods and systems for selecting and presenting content based on user preference information extracted from an aggregate preference signature
US8156113B2 (en) 2006-03-06 2012-04-10 Veveo, Inc. Methods and systems for selecting and presenting content based on dynamically identifying microgenres associated with the content
US8825576B2 (en) 2006-03-06 2014-09-02 Veveo, Inc. Methods and systems for selecting and presenting content on a first system based on user preferences learned on a second system
US7792815B2 (en) 2006-03-06 2010-09-07 Veveo, Inc. Methods and systems for selecting and presenting content based on context sensitive user preferences
US20100241625A1 (en) * 2006-03-06 2010-09-23 Veveo, Inc. Methods and Systems for Selecting and Presenting Content Based on User Preference Information Extracted from an Aggregate Preference Signature
US8543516B2 (en) 2006-03-06 2013-09-24 Veveo, Inc. Methods and systems for selecting and presenting content on a first system based on user preferences learned on a second system
US8943083B2 (en) 2006-03-06 2015-01-27 Veveo, Inc. Methods and systems for segmenting relative user preferences into fine-grain and coarse-grain collections
US7774341B2 (en) 2006-03-06 2010-08-10 Veveo, Inc. Methods and systems for selecting and presenting content based on dynamically identifying microgenres associated with the content
US8380726B2 (en) 2006-03-06 2013-02-19 Veveo, Inc. Methods and systems for selecting and presenting content based on a comparison of preference signatures from multiple users
US20070266021A1 (en) * 2006-03-06 2007-11-15 Murali Aravamudan Methods and systems for selecting and presenting content based on dynamically identifying microgenres associated with the content
US7835998B2 (en) 2006-03-06 2010-11-16 Veveo, Inc. Methods and systems for selecting and presenting content on a first system based on user preferences learned on a second system
US20100293160A1 (en) * 2006-03-06 2010-11-18 Murali Aravamudan Methods and Systems for Selecting and Presenting Content Based on Learned Periodicity of User Content Selection
US9075861B2 (en) 2006-03-06 2015-07-07 Veveo, Inc. Methods and systems for segmenting relative user preferences into fine-grain and coarse-grain collections
US9092503B2 (en) 2006-03-06 2015-07-28 Veveo, Inc. Methods and systems for selecting and presenting content based on dynamically identifying microgenres associated with the content
US20070219985A1 (en) * 2006-03-06 2007-09-20 Murali Aravamudan Methods and systems for selecting and presenting content based on context sensitive user preferences
US20100325111A1 (en) * 2006-03-06 2010-12-23 Veveo, Inc. Methods and Systems for Selecting and Presenting Content Based on Context Sensitive User Preferences
US9128987B2 (en) 2006-03-06 2015-09-08 Veveo, Inc. Methods and systems for selecting and presenting content based on a comparison of preference signatures from multiple users
US7885904B2 (en) 2006-03-06 2011-02-08 Veveo, Inc. Methods and systems for selecting and presenting content on a first system based on user preferences learned on a second system
US8429188B2 (en) 2006-03-06 2013-04-23 Veveo, Inc. Methods and systems for selecting and presenting content based on context sensitive user preferences
US8112454B2 (en) 2006-03-06 2012-02-07 Veveo, Inc. Methods and systems for ordering content items according to learned user preferences
US9213755B2 (en) 2006-03-06 2015-12-15 Veveo, Inc. Methods and systems for selecting and presenting content based on context sensitive user preferences
US20070219984A1 (en) * 2006-03-06 2007-09-20 Murali Aravamudan Methods and systems for selecting and presenting content based on a comparison of preference signatures from multiple users
US8478794B2 (en) 2006-03-06 2013-07-02 Veveo, Inc. Methods and systems for segmenting relative user preferences into fine-grain and coarse-grain collections
US8438160B2 (en) 2006-03-06 2013-05-07 Veveo, Inc. Methods and systems for selecting and presenting content based on dynamically identifying Microgenres Associated with the content
US8949231B2 (en) 2006-03-06 2015-02-03 Veveo, Inc. Methods and systems for selecting and presenting content based on activity level spikes associated with the content
US7949627B2 (en) 2006-03-06 2011-05-24 Veveo, Inc. Methods and systems for selecting and presenting content based on learned periodicity of user content selection
US20110131161A1 (en) * 2006-03-06 2011-06-02 Veveo, Inc. Methods and Systems for Selecting and Presenting Content on a First System Based on User Preferences Learned on a Second System
US8429155B2 (en) 2006-03-06 2013-04-23 Veveo, Inc. Methods and systems for selecting and presenting content based on activity level spikes associated with the content
US8583566B2 (en) 2006-03-06 2013-11-12 Veveo, Inc. Methods and systems for selecting and presenting content based on learned periodicity of user content selection
US20070266026A1 (en) * 2006-03-06 2007-11-15 Murali Aravamudan Methods and systems for selecting and presenting content based on user preference information extracted from an aggregate preference signature
US7777728B2 (en) * 2006-03-17 2010-08-17 Nokia Corporation Mobile communication terminal
US20070216658A1 (en) * 2006-03-17 2007-09-20 Nokia Corporation Mobile communication terminal
US8417717B2 (en) 2006-03-30 2013-04-09 Veveo Inc. Method and system for incrementally selecting and providing relevant search engines in response to a user query
US8073860B2 (en) 2006-03-30 2011-12-06 Veveo, Inc. Method and system for incrementally selecting and providing relevant search engines in response to a user query
US9223873B2 (en) 2006-03-30 2015-12-29 Veveo, Inc. Method and system for incrementally selecting and providing relevant search engines in response to a user query
US20080114743A1 (en) * 2006-03-30 2008-05-15 Veveo, Inc. Method and system for incrementally selecting and providing relevant search engines in response to a user query
US8688746B2 (en) 2006-04-20 2014-04-01 Veveo, Inc. User interface methods and systems for selecting and presenting content based on user relationships
US8423583B2 (en) 2006-04-20 2013-04-16 Veveo Inc. User interface methods and systems for selecting and presenting content based on user relationships
US7461061B2 (en) 2006-04-20 2008-12-02 Veveo, Inc. User interface methods and systems for selecting and presenting content based on user navigation and selection actions associated with the content
US9087109B2 (en) 2006-04-20 2015-07-21 Veveo, Inc. User interface methods and systems for selecting and presenting content based on user relationships
US20070288456A1 (en) * 2006-04-20 2007-12-13 Murali Aravamudan User interface methods and systems for selecting and presenting content based on user navigation and selection actions associated with the content
US7539676B2 (en) 2006-04-20 2009-05-26 Veveo, Inc. User interface methods and systems for selecting and presenting content based on relationships between the user and other members of an organization
US20070288457A1 (en) * 2006-04-20 2007-12-13 Murali Aravamudan User interface methods and systems for selecting and presenting content based on relationships between the user and other members of an organization
US7899806B2 (en) 2006-04-20 2011-03-01 Veveo, Inc. User interface methods and systems for selecting and presenting content based on user navigation and selection actions associated with the content
US20090077496A1 (en) * 2006-04-20 2009-03-19 Veveo, Inc. User interface methods and systems for selecting and presenting content based on user navigation and selection actions associated with the content
US8086602B2 (en) 2006-04-20 2011-12-27 Veveo Inc. User interface methods and systems for selecting and presenting content based on user navigation and selection actions associated with the content
US20070252818A1 (en) * 2006-04-28 2007-11-01 Joseph Zlotnicki Method and apparatus for efficient data input
US9152241B2 (en) 2006-04-28 2015-10-06 Zienon, Llc Method and apparatus for efficient data input
US7492445B1 (en) * 2006-06-05 2009-02-17 Cypress Semiconductor Corporation Method and apparatus for robust velocity prediction
US20100214267A1 (en) * 2006-06-15 2010-08-26 Nokia Corporation Mobile device with virtual keypad
US20080007526A1 (en) * 2006-07-10 2008-01-10 Yansun Xu Optical navigation sensor with variable tracking resolution
US7728816B2 (en) 2006-07-10 2010-06-01 Cypress Semiconductor Corporation Optical navigation sensor with variable tracking resolution
US8913003B2 (en) 2006-07-17 2014-12-16 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer using a projection marker system
US20080012824A1 (en) * 2006-07-17 2008-01-17 Anders Grunnet-Jepsen Free-Space Multi-Dimensional Absolute Pointer Using a Projection Marker System
US7536384B2 (en) 2006-09-14 2009-05-19 Veveo, Inc. Methods and systems for dynamically rearranging search results into hierarchically organized concept clusters
US20090198688A1 (en) * 2006-09-14 2009-08-06 Veveo, Inc. Methods and systems for dynamically rearranging search results into hierarchically organized concept clusters
US8037071B2 (en) 2006-09-14 2011-10-11 Veveo, Inc. Methods and systems for dynamically rearranging search results into hierarchically organized concept clusters
US10025869B2 (en) 2006-09-14 2018-07-17 Veveo, Inc. Methods and systems for dynamically rearranging search results into hierarchically organized concept clusters
US8799804B2 (en) * 2006-10-06 2014-08-05 Veveo, Inc. Methods and systems for a linear character selection display interface for ambiguous text input
US20110185306A1 (en) * 2006-10-06 2011-07-28 Veveo, Inc. Methods and Systems for a Linear Character Selection Display Interface for Ambiguous Text Input
US9250805B2 (en) * 2006-10-06 2016-02-02 Veveo, Inc. Methods and systems for a linear character selection display interface for ambiguous text input
US20150095832A1 (en) * 2006-10-06 2015-04-02 Veveo, Inc. Methods and Systems for a Linear Character Selection Display Interface for Ambiguous Text Input
US20080086704A1 (en) * 2006-10-06 2008-04-10 Veveo, Inc. Methods and systems for a Linear Character Selection Display Interface for Ambiguous Text Input
US7925986B2 (en) * 2006-10-06 2011-04-12 Veveo, Inc. Methods and systems for a linear character selection display interface for ambiguous text input
WO2008045690A2 (en) * 2006-10-06 2008-04-17 Veveo, Inc. Linear character selection display interface for ambiguous text input
WO2008045690A3 (en) * 2006-10-06 2008-06-12 Veveo Inc Linear character selection display interface for ambiguous text input
US8078884B2 (en) 2006-11-13 2011-12-13 Veveo, Inc. Method of and system for selecting and presenting content based on user identification
US20080209229A1 (en) * 2006-11-13 2008-08-28 Veveo, Inc. Method of and system for selecting and presenting content based on user identification
US20080252596A1 (en) * 2007-04-10 2008-10-16 Matthew Bell Display Using a Three-Dimensional vision System
US9176598B2 (en) 2007-05-08 2015-11-03 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer with improved performance
US8886642B2 (en) 2007-05-25 2014-11-11 Veveo, Inc. Method and system for unified searching and incremental searching across and within multiple documents
US20080313174A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. Method and system for unified searching across and within multiple documents
US20080313574A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. System and method for search with reduced physical interaction requirements
US8826179B2 (en) 2007-05-25 2014-09-02 Veveo, Inc. System and method for text disambiguation and context designation in incremental search
US8296294B2 (en) 2007-05-25 2012-10-23 Veveo, Inc. Method and system for unified searching across and within multiple documents
US8549424B2 (en) 2007-05-25 2013-10-01 Veveo, Inc. System and method for text disambiguation and context designation in incremental search
US8429158B2 (en) 2007-05-25 2013-04-23 Veveo, Inc. Method and system for unified searching and incremental searching across and within multiple documents
US20100190548A1 (en) * 2007-06-22 2010-07-29 Wms Gaming Inc. Wagering game machine with virtual input device
US8246456B2 (en) * 2007-06-22 2012-08-21 Wms Gaming Inc. Wagering game machine with virtual input device
US8230367B2 (en) 2007-09-14 2012-07-24 Intellectual Ventures Holding 67 Llc Gesture-based user interactions with status indicators for acceptable inputs in volumetric zones
US20090077504A1 (en) * 2007-09-14 2009-03-19 Matthew Bell Processing of Gesture-Based User Interactions
US9811166B2 (en) 2007-09-14 2017-11-07 Intellectual Ventures Holding 81 Llc Processing of gesture-based user interactions using volumetric zones
US9058058B2 (en) 2007-09-14 2015-06-16 Intellectual Ventures Holding 67 Llc Processing of gesture-based user interactions activation levels
US20090083035A1 (en) * 2007-09-25 2009-03-26 Ritchie Winson Huang Text pre-processing for text-to-speech generation
US8010895B2 (en) * 2007-10-24 2011-08-30 E-Lead Electronic Co., Ltd. Method for correcting typing errors according to character layout positions on a keyboard
US20100253636A1 (en) * 2007-10-24 2010-10-07 Stephen Chen Method for correcting typing errors according to character layout positions on a keyboard
US8810803B2 (en) 2007-11-12 2014-08-19 Intellectual Ventures Holding 67 Llc Lens system
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
US20090251685A1 (en) * 2007-11-12 2009-10-08 Matthew Bell Lens System
US9229107B2 (en) 2007-11-12 2016-01-05 Intellectual Ventures Holding 81 Llc Lens system
US8345008B2 (en) * 2007-12-10 2013-01-01 Samsung Electronics Co., Ltd. Apparatus and method for providing adaptive on-screen keyboard
US20090146957A1 (en) * 2007-12-10 2009-06-11 Samsung Electronics Co., Ltd. Apparatus and method for providing adaptive on-screen keyboard
US9070262B2 (en) 2007-12-31 2015-06-30 Apple Inc. Tactile feedback in an electronic device
US9520037B2 (en) 2007-12-31 2016-12-13 Apple Inc. Tactile feedback in an electronic device
US8754759B2 (en) 2007-12-31 2014-06-17 Apple Inc. Tactile feedback in an electronic device
US20090167508A1 (en) * 2007-12-31 2009-07-02 Apple Inc. Tactile feedback in an electronic device
US8373549B2 (en) * 2007-12-31 2013-02-12 Apple Inc. Tactile feedback in an electronic device
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US9247236B2 (en) 2008-03-07 2016-01-26 Intellectual Ventures Holdings 81 Llc Display with built in 3D sensing capability and gesture control of TV
US20090237361A1 (en) * 2008-03-18 2009-09-24 Microsoft Corporation Virtual keyboard based activation and dismissal
US8619036B2 (en) * 2008-03-18 2013-12-31 Microsoft Corporation Virtual keyboard based activation and dismissal
US8358277B2 (en) * 2008-03-18 2013-01-22 Microsoft Corporation Virtual keyboard based activation and dismissal
US20090241437A1 (en) * 2008-03-19 2009-10-01 Wolfgang Steinle Embedding unit for display devices
US8307591B2 (en) * 2008-03-19 2012-11-13 Brainlab Ag Embedding unit for display devices
US20110090151A1 (en) * 2008-04-18 2011-04-21 Shanghai Hanxiang (Cootek) Information Technology Co., Ltd. System capable of accomplishing flexible keyboard layout
US9323345B2 (en) * 2008-04-18 2016-04-26 Shanghai Chule (Cootek) Information Technology Co., Ltd. System capable of accomplishing flexible keyboard layout
US20090289188A1 (en) * 2008-05-20 2009-11-26 Everspring Industry Co., Ltd. Method for controlling an electronic device through infrared detection
US8595218B2 (en) 2008-06-12 2013-11-26 Intellectual Ventures Holding 67 Llc Interactive display management systems and methods
US20100121866A1 (en) * 2008-06-12 2010-05-13 Matthew Bell Interactive display management systems and methods
US9152258B2 (en) * 2008-06-19 2015-10-06 Neonode Inc. User interface for a touch screen
US8106749B2 (en) 2008-07-14 2012-01-31 Sony Ericsson Mobile Communications Ab Touchless control of a control device
US20100007511A1 (en) * 2008-07-14 2010-01-14 Sony Ericsson Mobile Communications Ab Touchless control of a control device
US8165881B2 (en) 2008-08-29 2012-04-24 Honda Motor Co., Ltd. System and method for variable text-to-speech with minimized distraction to operator of an automotive vehicle
US20100057464A1 (en) * 2008-08-29 2010-03-04 David Michael Kirsch System and method for variable text-to-speech with minimized distraction to operator of an automotive vehicle
US20100057465A1 (en) * 2008-09-03 2010-03-04 David Michael Kirsch Variable text-to-speech for automotive application
US9588681B2 (en) 2008-09-29 2017-03-07 Microsoft Technology Licensing, Llc Glow touch feedback for virtual input devices
US20100081476A1 (en) * 2008-09-29 2010-04-01 Microsoft Corporation Glow touch feedback for virtual input devices
US8750938B2 (en) 2008-09-29 2014-06-10 Microsoft Corporation Glow touch feedback for virtual input devices
US20100124949A1 (en) * 2008-11-14 2010-05-20 Sony Ericsson Mobile Communications Ab Portable communication device and remote motion input device
US8503932B2 (en) * 2008-11-14 2013-08-06 Sony Mobile Communications AB Portable communication device and remote motion input device
US8423916B2 (en) * 2008-11-20 2013-04-16 Canon Kabushiki Kaisha Information processing apparatus, processing method thereof, and computer-readable storage medium
US20100125787A1 (en) * 2008-11-20 2010-05-20 Canon Kabushiki Kaisha Information processing apparatus, processing method thereof, and computer-readable storage medium
US20100134409A1 (en) * 2008-11-30 2010-06-03 Lenovo (Singapore) Pte. Ltd. Three-dimensional user interface
US20100214226A1 (en) * 2009-02-23 2010-08-26 International Business Machines Corporation System and method for semi-transparent display of hands over a keyboard in real-time
US8140970B2 (en) * 2009-02-23 2012-03-20 International Business Machines Corporation System and method for semi-transparent display of hands over a keyboard in real-time
US20100235786A1 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
US8375311B2 (en) * 2009-03-24 2013-02-12 Disney Enterprises, Inc. System and method for determining placement of a virtual object according to a real-time performance
US20100245349A1 (en) * 2009-03-24 2010-09-30 Disney Enterprises, Inc. System and method for determining placement of a virtual object according to a real-time performance
US20100251161A1 (en) * 2009-03-24 2010-09-30 Microsoft Corporation Virtual keyboard with staggered keys
US20100302165A1 (en) * 2009-05-26 2010-12-02 Zienon, Llc Enabling data entry based on differentiated input objects
US9274547B2 (en) * 2009-07-23 2016-03-01 Hewlett-Packard Development Company, L.P. Display with an optical sensor
US20120120038A1 (en) * 2009-07-23 2012-05-17 Mccarthy John P Display With An Optical Sensor
US9166714B2 (en) 2009-09-11 2015-10-20 Veveo, Inc. Method of and system for presenting enriched video viewing analytics
US8639414B2 (en) * 2009-12-25 2014-01-28 Honda Access Corp. Operation apparatus for on-board devices in automobile
US20110160933A1 (en) * 2009-12-25 2011-06-30 Honda Access Corp. Operation apparatus for on-board devices in automobile
US20110164032A1 (en) * 2010-01-07 2011-07-07 Prime Sense Ltd. Three-Dimensional User Interface
US20110191332A1 (en) * 2010-02-04 2011-08-04 Veveo, Inc. Method of and System for Updating Locally Cached Content Descriptor Information
US9703779B2 (en) 2010-02-04 2017-07-11 Veveo, Inc. Method of and system for enhanced local-device content discovery
US20110191516A1 (en) * 2010-02-04 2011-08-04 True Xiong Universal touch-screen remote controller
US20110191331A1 (en) * 2010-02-04 2011-08-04 Veveo, Inc. Method of and System for Enhanced Local-Device Content Discovery
US20110242054A1 (en) * 2010-04-01 2011-10-06 Compal Communication, Inc. Projection system with touch-sensitive projection image
US9678662B2 (en) 2010-04-23 2017-06-13 Handscape Inc. Method for detecting user gestures from alternative touchpads of a handheld computerized device
US9891821B2 (en) 2010-04-23 2018-02-13 Handscape Inc. Method for controlling a control region of a computerized device from a touchpad
US9639195B2 (en) 2010-04-23 2017-05-02 Handscape Inc. Method using finger force upon a touchpad for controlling a computerized system
US9891820B2 (en) 2010-04-23 2018-02-13 Handscape Inc. Method for controlling a virtual keyboard from a touchpad of a computerized device
US9310905B2 (en) 2010-04-23 2016-04-12 Handscape Inc. Detachable back mounted touchpad for a handheld computerized device
WO2011133986A1 (en) * 2010-04-23 2011-10-27 Luo Tong Method for user input from the back panel of a handheld computerized device
US9529523B2 (en) 2010-04-23 2016-12-27 Handscape Inc. Method using a finger above a touchpad for controlling a computerized system
US9542032B2 (en) 2010-04-23 2017-01-10 Handscape Inc. Method using a predicted finger location above a touchpad for controlling a computerized system
US9311724B2 (en) 2010-04-23 2016-04-12 Handscape Inc. Method for user input from alternative touchpads of a handheld computerized device
US9430147B2 (en) 2010-04-23 2016-08-30 Handscape Inc. Method for user input from alternative touchpads of a computerized system
WO2011149515A1 (en) * 2010-05-24 2011-12-01 Will John Temple Multidirectional button, key, and keyboard
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US20120038542A1 (en) * 2010-08-16 2012-02-16 Ken Miyashita Information Processing Apparatus, Information Processing Method and Program
US9218027B2 (en) * 2010-08-16 2015-12-22 Sony Corporation Information processing apparatus, information processing method and program
US20120059647A1 (en) * 2010-09-08 2012-03-08 International Business Machines Corporation Touchless Texting Exercise
US8577915B2 (en) 2010-09-10 2013-11-05 Veveo, Inc. Method of and system for conducting personalized federated search and presentation of results therefrom
US9058390B2 (en) 2010-09-10 2015-06-16 Veveo, Inc. Method of and system for conducting personalized federated search and presentation of results therefrom
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US20130275907A1 (en) * 2010-10-14 2013-10-17 University of Technology, Sydney Virtual keyboard
US9329777B2 (en) * 2010-10-14 2016-05-03 Neopad, Inc. Method and system for providing background contents of virtual key input device
US20120274658A1 (en) * 2010-10-14 2012-11-01 Chung Hee Sung Method and system for providing background contents of virtual key input device
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US20120182215A1 (en) * 2011-01-18 2012-07-19 Samsung Electronics Co., Ltd. Sensing module, and graphical user interface (gui) control apparatus and method
US9733711B2 (en) * 2011-01-18 2017-08-15 Samsung Electronics Co., Ltd. Sensing module, and graphical user interface (GUI) control apparatus and method
US9454225B2 (en) 2011-02-09 2016-09-27 Apple Inc. Gaze-based display control
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US9342146B2 (en) 2011-02-09 2016-05-17 Apple Inc. Pointing-based display interaction
US20120229447A1 (en) * 2011-03-08 2012-09-13 Nokia Corporation Apparatus and associated methods
US9035940B2 (en) * 2011-03-08 2015-05-19 Nokia Corporation Apparatus and associated methods
US8963883B2 (en) 2011-03-17 2015-02-24 Symbol Technologies, Inc. Touchless interactive display system
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US20120242659A1 (en) * 2011-03-25 2012-09-27 Hon Hai Precision Industry Co., Ltd. Method of controlling electronic device via a virtual keyboard
US9504920B2 (en) 2011-04-25 2016-11-29 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
EP2725458A4 (en) * 2011-06-23 2015-05-13 Fujitsu Ltd Information processing device, input control method, and input control program
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd. Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9448724B2 (en) * 2011-07-11 2016-09-20 International Business Machines Corporation Dynamically customizable touch screen keyboard for adapting to user physiology
US20130019191A1 (en) * 2011-07-11 2013-01-17 International Business Machines Corporation Dynamically customizable touch screen keyboard for adapting to user physiology
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US8248385B1 (en) 2011-09-13 2012-08-21 Google Inc. User inputs of a touch sensitive device
US20130249821A1 (en) * 2011-09-27 2013-09-26 The Board of Trustees of the Leland Stanford Junior University Method and System for Virtual Keyboard
US8902161B2 (en) * 2012-01-12 2014-12-02 Fujitsu Limited Device and method for detecting finger position
EP2615532A3 (en) * 2012-01-12 2015-04-01 Fujitsu Limited Device and method for detecting finger position
US20130181904A1 (en) * 2012-01-12 2013-07-18 Fujitsu Limited Device and method for detecting finger position
US9600078B2 (en) 2012-02-03 2017-03-21 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
WO2013144807A1 (en) * 2012-03-26 2013-10-03 Primesense Ltd. Enhanced virtual touchpad and touchscreen
US20130283208A1 (en) * 2012-03-26 2013-10-24 Primesense Ltd. Gaze-enhanced virtual touchscreen
US9377863B2 (en) * 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US8850349B2 (en) 2012-04-06 2014-09-30 Google Inc. Smart user-customized graphical keyboard
US9069168B2 (en) * 2012-05-04 2015-06-30 Yong Yan Means for setting separation walls with predetermined heights and shapes on keypads to prevent unintended key hits
US20130293398A1 (en) * 2012-05-04 2013-11-07 Yong Yan Separation walls on keypads
EP2860611A4 (en) * 2012-06-08 2016-03-02 Kmt Global Inc User interface method and apparatus based on spatial location recognition
US8655021B2 (en) * 2012-06-25 2014-02-18 Imimtek, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US8830312B2 (en) 2012-06-25 2014-09-09 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching within bounded regions
US9098739B2 (en) 2012-06-25 2015-08-04 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US9310891B2 (en) 2012-09-04 2016-04-12 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US20140098025A1 (en) * 2012-10-09 2014-04-10 Cho-Yi Lin Portable electrical input device capable of docking an electrical communication device and system thereof
US9250748B2 (en) * 2012-10-09 2016-02-02 Cho-Yi Lin Portable electrical input device capable of docking an electrical communication device and system thereof
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands
EP2775379A1 (en) * 2013-03-05 2014-09-10 Funai Electric Co., Ltd. Projector
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US20140354602A1 (en) * 2013-04-12 2014-12-04 Impression.Pi, Inc. Interactive input system and method
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
CN103558948A (en) * 2013-10-31 2014-02-05 中山大学 Man-machine interaction method applied to virtual optical keyboard
US20150153950A1 (en) * 2013-12-02 2015-06-04 Industrial Technology Research Institute System and method for receiving user input and program storage medium thereof
US9857971B2 (en) * 2013-12-02 2018-01-02 Industrial Technology Research Institute System and method for receiving user input and program storage medium thereof
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US9933854B2 (en) 2015-01-16 2018-04-03 Samsung Electronics Co., Ltd. Virtual input device and method for receiving user input using the same
US20160259402A1 (en) * 2015-03-02 2016-09-08 Koji Masuda Contact detection apparatus, projector apparatus, electronic board apparatus, digital signage apparatus, projector system, and contact detection method
CN104951073A (en) * 2015-06-19 2015-09-30 济南大学 Gesture interaction method based on virtual interface
US9971457B2 (en) 2015-06-26 2018-05-15 Intel Corporation Audio augmentation of touch detection for surfaces
WO2016209520A1 (en) * 2015-06-26 2016-12-29 Intel Corporation Audio augmentation of touch detection for surfaces

Also Published As

Publication number Publication date Type
EP1332488B1 (en) 2010-09-15 grant
EP1332488A4 (en) 2006-06-14 application
DE60143094D1 (en) 2010-10-28 grant
WO2001059975A3 (en) 2002-01-31 application
CN1439151A (en) 2003-08-27 application
EP1332488A2 (en) 2003-08-06 application
JP2004500657A (en) 2004-01-08 application
US6614422B1 (en) 2003-09-02 grant
WO2001059975A2 (en) 2001-08-16 application
CN1232943C (en) 2005-12-21 grant

Similar Documents

Publication Publication Date Title
US6603867B1 (en) Three-dimensional object identifying system
US6791531B1 (en) Device and method for cursor motion control calibration and object selection
US7313255B2 (en) System and method for optically detecting a click event
US6008798A (en) Method of determining an object's position and associated apparatus
US4906843A (en) Combination mouse, optical scanner and digitizer puck
US20080259053A1 (en) Touch Screen System with Hover and Click Input Methods
US20090278915A1 (en) Gesture-Based Control System For Vehicle Interfaces
US7305368B2 (en) Virtual data entry device and method for input of alphanumeric and other data
US7729515B2 (en) Optical navigation apparatus using fixed beacons and a centroid sensing device
US20050168437A1 (en) Processing pose data derived from the pose of an elongate object
US20030048280A1 (en) Interactive environment using computer vision and touchscreens
US20120274550A1 (en) Gesture mapping for display device
US6798401B2 (en) Optical system for inputting pointer and character data into electronic equipment
US20100177035A1 (en) Mobile Computing Device With A Virtual Keyboard
US20120078614A1 (en) Virtual keyboard for a non-tactile three dimensional user interface
US7038659B2 (en) Symbol encoding apparatus and method
US5945981A (en) Wireless input device, for use with a computer, employing a movable light-emitting element and a stationary light-receiving element
US6281878B1 (en) Apparatus and method for inputting data
US20100001963A1 (en) Multi-touch touchscreen incorporating pen tracking
US20100001962A1 (en) Multi-touch touchscreen incorporating pen tracking
US20050091297A1 (en) Coordinate input apparatus and control method thereof, coordinate input pointing tool, and program
US20040021645A1 (en) Coordinate input apparatus, control method thereof, and program
US20040125147A1 (en) Device and method for generating a virtual keyboard/display
US20080259052A1 (en) Optical touch control apparatus and method thereof
US20090322673A1 (en) Free fingers typing technology

Legal Events

Date Code Title Description
AS Assignment
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0509
Effective date: 20141014