EP1250698A2 - Human gestural input device with motion and pressure - Google Patents

Human gestural input device with motion and pressure

Info

Publication number
EP1250698A2
Authority
EP
European Patent Office
Prior art keywords
pointing device
motion
computer
device
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00926157A
Other languages
German (de)
French (fr)
Other versions
EP1250698A4 (en)
Inventor
John Warren Stringer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Stringer John Warren
Original Assignee
John Warren Stringer
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US13019199P
Application filed by John Warren Stringer
Priority to PCT/US2000/010579 (WO2000063874A1)
Publication of EP1250698A2
Publication of EP1250698A4
Application status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • HELECTRICITY
    • H01BASIC ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES; ELECTRIC SOLID STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H01L41/00Piezo-electric devices in general; Electrostrictive devices in general; Magnetostrictive devices in general; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof
    • H01L41/08Piezo-electric or electrostrictive devices
    • H01L41/113Piezo-electric or electrostrictive devices with mechanical input and electrical output, e.g. generators, sensors
    • H01L41/1132Sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0382Plural input, i.e. interface arrangements in which a plurality of input device of the same type are in communication with a PC

Abstract

A hand-held computer input pointing device (100) has at least one motion detector (314), the at least one motion detector being capable of detecting motion in at least three dimensions. At least one pressure sensor (316) is capable of sensing pressure quantitatively. The input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad. The pointing device may, for example, be used to control a computer (701) executing a program to display a movable virtual puppet (710).

Description

HUMAN GESTURAL INPUT DEVICE WITH MOTION AND PRESSURE

FIELD OF THE INVENTION

The present invention relates to the field of input devices for computers.

DESCRIPTION OF THE RELATED ART

Computer input pointing devices are well known. A typical computer input pointing device contains at least two sensors for sensing motion in at least two directions, such as x and y (forward-backward and left-right). The pointing device also has at least one actuator for causing the pointing device to transmit a command signal (typically referred to as a "click") to a computer.

The most common pointing device for a desktop computer or workstation is a mouse. The mouse may have a ball on its underside. Motion of the ball causes rotation of one or more of a plurality of rotation sensors adjacent to the ball. The number of rotations determines the magnitude of the motion in the x or y direction. The most common mouse type includes two buttons for sending commands known as "left-click" and "right-click." Other types of mice include optical sensors that count markings on a specially marked mouse pad to determine an amount of movement.

Another common pointing device used primarily for computer games is a joystick. The joystick has a lever that is tilted in a forward-backward or left-right direction, each of which is sensed independently. A button is typically provided on the end of the lever for transmitting a command signal. Some joysticks also allow rotation of the lever, which is also sensed.

A variety of alternative pointing devices have been developed. Alternative pointing devices are typically used on laptop computers. An early pointing device used on laptop computers is the track ball. The track ball functions like an upside-down mouse. Instead of moving the device, the ball is rotated directly.

Many laptop computers include a miniature joystick that is positioned on the home row of the keyboard. Another common pointing device in laptop computers is a touch pad. The touch pad is a rectangular device that is sensitive to touch. Left and right sliding motion of a finger on the touch pad is detected. The touch pad also senses when it is struck, and produces a "click" signal. Although conventional pointing devices are suitable for locating a point on an x-y grid, and transmitting a simple single-valued command signal, conventional pointing devices leave much to be desired for controlling application programs that require more complex inputs.

SUMMARY OF THE INVENTION

One aspect of the present invention is a hand-held computer input pointing device, comprising at least one motion detector. The at least one motion detector is capable of detecting motion in at least three dimensions. At least one pressure sensor is capable of sensing pressure quantitatively. The input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.

Another aspect of the invention is a method for operating a computer having an output device, comprising the steps of: receiving a plurality of signals from a computer input pointing device, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to the output device, the output signal having at least two characteristics that are capable of being varied separately from each other.

Still another aspect of the invention is a method for operating a computer input pointing device, comprising the steps of: moving the pointing device; transmitting at least three motion signals from the pointing device to a computer, the motion signals including signals representing translations in one or more dimension, or rotations in one or more dimensions, or a combination of translations and rotations; squeezing at least one portion of the pointing device; and transmitting quantifiable pressure signals from the pointing device to the computer.

BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is an isometric view of an exemplary squeezable pointing device according to the present invention.

FIG. 2 is an exploded view of the pointing device of FIG. 1.

FIG. 3A is an enlarged view of the origami sensor of FIG. 2.

FIG. 3B is a plan view of the sensor of FIG. 3 A, prior to folding.

FIG. 4A is a plan view of a tetrahedral origami sensor in a flattened state.

FIG. 4B is an isometric view of the sensor of FIG. 4A, folded. FIG. 5A is a plan view of an alternative cubic origami sensor in a flattened state.

FIG. 5B is an isometric view of the sensor of FIG. 5A, folded.

FIG. 6 is an isometric view of a tetrahedral edge sensor.

FIG. 7 is an elevation view of a computer system including the pointing device of FIG. 1.

FIG. 8 is a diagram of a range of motion tree for an application program executed on the computer shown in FIG. 7.

FIG. 9 shows the software layers for the application executed on the computer shown in FIG. 7.

FIG. 10 is a flow chart of a method for operating the virtual puppet of FIG. 7.

FIGS. 11-13 are block diagrams of the software layer structure shown in FIG. 9.

FIG. 14 is a flow chart of a method for operating the pointing device of FIG. 2.

DETAILED DESCRIPTION

FIG. 1 shows an exemplary computer input pointing device 100 according to the present invention. Pointing device 100 is capable of detecting motion and providing a quantifiable measure of squeeze pressure. The input device 100 is operable in a hand of a user, without contacting a base, work surface, or pad. Thus, the input device 100 is "free floating," in that it may be operated in any position without regard to the presence of a base, work surface, or pad. Pointing device 100 has a squeezable, deformable elastomeric cover 102. Inside the cover 102, the pointing device 100 has an internal motion detector that may measure acceleration along the x, y and z axes, or measure pitch, yaw and roll. Squeezing the package increases pressure that translates into a pressure signal that is sent by the pointing device 100 to the computer (shown in FIG. 7) along with the motion data.

FIG. 2 is an exploded view of the computer input pointing device 100, with the squeezable elastomeric covering 102 partially removed. The squeezable covering 102 may be formed from natural or synthetic rubber or a variety of polymeric materials, which may be cut or injection molded. Cover 102 has an internal cavity 104 for containing sensors. An optional sensor enclosure 106 may be provided inside of cover 102. The sensor enclosure 106 may be sealed to protect the sensors from dust and moisture, allowing use of an unsealed cover 102. The deformable cover 102, with or without the sensor enclosure 106, allows the sensor to move freely, and compresses the internal sensor. In one embodiment, the squeezable cover is made from a "SQEESH™" ball by the Toysmith Company of Kent, WA USA. "SQEESH™" balls are available in a variety of shapes and surface types, including smooth spherical, spherical with craters, and star shaped. A variety of shapes may be used to provide users with the grip of their choice, or hints as to what object the device controls.

Pointing device 100 has at least one motion detector 108, which is capable of detecting motion in at least three dimensions. Pointing device 100 also includes at least one pressure sensor capable of sensing pressure quantitatively. In the example, the pressure sensing capability is also provided by the sensor 108, which is described in greater detail below. The at least one motion detector may, for example, include an accelerometer or a piezoelectric device.

As best seen in FIG. 3 A, the exemplary piezoelectric device 108 includes at least three piezoelectric strips 301-303 or flaps oriented in three orthogonal directions. More specifically, the exemplary device 108 is a single film sensor having three strips 301-303 folded origami style, as shown in FIG. 3A, to form a piezo thin film with six sensors.

FIG. 3B is a view of the device 108 with six sensors 314, 316. The device is formed on a substrate 300 and is shown laid flat before the origami folds along dashed lines 312. Optional weights W1-W3 may be attached to the end or center of each flap 301-303, respectively. Preferably, each of the three piezoelectric strips 301-303 has at least two sections, including a weighted section 314 and an unweighted section 316, so as to discriminate motion load from pressure load.

The piezo film substrate 300 is plastic and allows two strips of piezo material to measure bending of the substrate through the relative tension between the two strips. One piezo sensor 316 is provided for pressure discrimination. Pressure on the deformable covering 102 bends the material of the piezo pressure sensor 316, producing a measurable voltage. Another piezo sensor 314, for motion discrimination, has a weight W1-W3 placed at the middle or end of the sensor. If each of the three piezoelectric strips 314 has a respective weight, moving the pointing device 100 induces an inertial force. The weight W1-W3 moves against the material of the motion sensor 314, causing it to bend at a different rate than pressure sensor 316.
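The weighted/unweighted pairing described above can be sketched in code. This is a minimal illustration, not taken from the patent: the calibration gain and the sample voltages are assumptions, and a real driver would need per-unit calibration of the strip pair.

```python
def discriminate(weighted_v, free_v, gain=1.0):
    """Separate motion from squeeze pressure for one strip pair.

    The unweighted ("free") strip bends mainly under squeeze
    pressure, while the weighted strip bends under both pressure and
    inertial load.  Subtracting the free reading (scaled by a
    hypothetical calibration gain) leaves a motion estimate.
    """
    pressure = free_v
    motion = weighted_v - gain * free_v
    return motion, pressure

# One illustrative sample per flap A-C: (weighted, free) strip voltages.
samples = {"A": (0.9, 0.4), "B": (0.1, 0.1), "C": (-0.5, 0.0)}
estimates = {axis: discriminate(w, f) for axis, (w, f) in samples.items()}
```

Note how flap B, which is being squeezed but not moved, yields a zero motion estimate while still reporting its pressure.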

Cuts 318 on the exemplary piezo film substrate 300 allow each sensor flap 314, 316 to bend separately. The dotted lines 312 show folds where the flat sensor strip 300 is origami folded in 90 degree bends. The folded origami flaps 301-303 are affixed to a support assembly 320 (shown in FIG. 3A). The support assembly 320 holds the origami flaps 301-303 in place. The leads L from thin film sensor are connected to optional internal multiplexing circuitry 110 that connects the sensor circuitry to an external lead 112.

Referring again to FIG. 1, the sensors output data representing four parameters (three motion and one pressure). Preferably, a multiplexer 110 receives output signals from the motion detector and the pressure sensor 108, and outputs a multiplexed signal for transmission to a computer (shown in FIG. 7). The output of the multiplexer is connected to a lead 112. The lead 112 extends through a channel 103 in the elastomeric covering 102. The opposite end of the lead 112 has a suitable connector 114 attached to it, which may be a 0.32-centimeter (1/8 inch) miniplug, a Universal Serial Bus (USB) connector, a serial (RS-232) port, an "ETHERNET™" port, a MIDI port for connection to a musical instrument, a PDA port, a cellular phone port or other suitable connector.
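As a rough sketch of what the multiplexer 110 does, the four parameters can be interleaved into a single frame and recovered on the computer side. The channel names and frame layout here are illustrative assumptions, not the patent's actual wire format.

```python
# Hypothetical channel order: three motion parameters plus pressure.
CHANNELS = ("motion_x", "motion_y", "motion_z", "pressure")

def multiplex(samples):
    """Interleave one sample per channel into a single flat frame,
    standing in for the multiplexed signal sent over lead 112."""
    return [samples[ch] for ch in CHANNELS]

def demultiplex(frame):
    """Recover the per-channel samples on the computer side."""
    return dict(zip(CHANNELS, frame))

frame = multiplex({"motion_x": 0.2, "motion_y": -0.1,
                   "motion_z": 0.0, "pressure": 0.7})
```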

In a system having a plurality of similar pointing devices 100, a hub capable of accepting inputs from multiple input devices may be provided to interface the plurality of pointing devices 100 to a single computer. For example, if there are multiple pointing devices having USB connectors, a USB hub 116 (connected to lead 118 and USB connector 120) may be used to connect the pointing devices to a single computer. Alternatively, the pointing device may omit the multiplexer, and output four analog signals from four respective output terminals. The pointing device 100 may have a wireless transmitter for communicating with the computer.

The lead 112 from the sensing package carries either one data line (with time- or frequency-multiplexed data) or multiple data lines, each carrying a respective sensor's output. The connector (such as a 1/8" miniplug) 114 optionally connects directly to the computer (FIG. 7) through a microphone input, eliminating the need for a hub. The pointing device 100 may use the audio software driver, or communicate through the optional hub 116 for controlling multiple puppets (as described below). The connector 114 may connect to a different computer port such as a USB or serial port, eliminating the need for a hub. Although FIG. 2 shows a two-input version, the system may include any number of inputs.

Any of the sensor types described herein may be used with clock shifting to the next lead. An additional clock cycle may be provided for subsequent processing of an optional unique identification number (described below). The sensors provide an analog signal that may be modulated with a single carrier frequency, or the signal from each lead may be modulated with a respectively different carrier frequency.

An optional analog to digital converter (ADC) may be provided for each of the analog sensor leads. In this case, a clock shift to the next lead sends a digital packet. For example, with eight bits per lead, six sensors yield six bytes per sample cycle.
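The six-bytes-per-cycle arithmetic can be illustrated with a small quantization routine. The [0, 1] input range and the rounding scheme are assumptions made for this sketch.

```python
def pack_sample_cycle(readings):
    """Quantize six analog sensor readings (assumed normalized to
    [0.0, 1.0]) to eight bits each, giving one six-byte packet per
    sample cycle."""
    if len(readings) != 6:
        raise ValueError("expected one reading per sensor lead")
    return bytes(min(255, max(0, round(r * 255))) for r in readings)

packet = pack_sample_cycle([0.0, 0.5, 1.0, 0.25, 0.75, 0.1])
```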

An optional identifier (ID) circuit may be provided. The ID circuit may be a register, a read only memory (ROM) device, an EEPROM, or the like. The unique identifier may be hard encoded at the factory (e.g., using a ROM), or the identifier may be downloaded from a computer into a flash EEPROM in the pointing device 100. As explained below with reference to FIG. 7, by providing a unique identifier for each pointing device unit 100, more than one unit may be used at the same time. The device identifier can function like an encryption key, to protect the user's preferences or to securely identify the user.

The output of the ID circuit and the output from one of the sensors are output to the multiplexing circuitry 110. The analog input may be time-domain multiplexed with the ID waveform. Alternatively, if the analog sensor signals are converted to digital form before transmission to the multiplexer, a digital input packet may be combined with ID information to form a larger packet including both. The pointing device 100 may contain additional identifier circuitry to add a port identifier number separate from the pointing device identifier number.

The analog waveform may have a separate carrier frequency or may be time domain multiplexed or interleaved with each sample, either on a bit, byte, or packet basis. Alternatively, if a digital packet format is used, the sensor data and ID information may be added to a multiplexing packet to form a larger packet including both types of data.
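In digital-packet form, combining the sensor data with the device ID could look like the following sketch. The 32-bit big-endian ID field is an assumption for illustration; the patent does not specify a field width or byte order.

```python
import struct

def frame_with_id(device_id, sensor_packet):
    """Prepend a device identifier to a sensor packet, forming the
    larger combined packet described above."""
    return struct.pack(">I", device_id) + bytes(sensor_packet)

def parse_frame(frame):
    """Split a combined packet back into (device_id, sensor_bytes)."""
    (device_id,) = struct.unpack(">I", frame[:4])
    return device_id, frame[4:]
```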

The pointing device 100 may obtain power from the computer (FIG. 7), from an internal battery, or by drawing power directly from the energy created by the sensor from the piezoelectric effect.

ALTERNATIVE SENSORS

The piezoelectric device may be an origami sensor having cut coils, and may have either a cubic or tetrahedral shape. Preferably, the cut coils on each face of the motion detector are attached to a single weight at a center of the motion detector. Optionally, each cut coil on each face of the motion detector has its own weight attached.

FIGS. 5A and 5B show a sensor system 500 including origami cutouts, which may be used in place of the sensor 108 shown in FIG. 2. A thin sensor film 500 having six faces 501-506 is folded to form a cube (although a tetrahedron, shown in FIGS. 4A and 4B, or other three-dimensional shapes may be used in alternative embodiments). A spiral pattern 511-516 is cut into each face 501-506 of the thin sensor film, respectively. Each coil can be pressed towards the center (e.g., C1, C3 and C6) and attached to a weight that is suspended at the center of the folded shape.

When the pointing device is moved, the weight applies forces to each of the coils eliciting a voltage. For the cube 500, each face 501-506 has a corresponding opposite face measuring force with an inverse effect on voltage.
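One way to exploit the inverse opposite-face effect is to difference each opposite-face pair, which reinforces the motion signal while cancelling anything common to both faces. The face-to-axis pairing below is a guess made for illustration; the patent does not assign particular faces to axes.

```python
# Hypothetical pairing of the six cube faces 501-506 into
# opposite-face pairs, one pair per axis.
FACE_PAIRS = {"x": (501, 502), "y": (503, 504), "z": (505, 506)}

def axis_signals(face_voltages):
    """Difference opposite-face voltages: because opposite faces see
    an inverse effect on voltage, the difference doubles the motion
    component while cancelling common-mode contributions."""
    return {axis: face_voltages[a] - face_voltages[b]
            for axis, (a, b) in FACE_PAIRS.items()}

signals = axis_signals({501: 0.3, 502: -0.3, 503: 0.0,
                        504: 0.0, 505: 0.1, 506: 0.1})
```

With these sample voltages, the x pair reports strong motion while the z pair, reading the same voltage on both faces (a common-mode squeeze), cancels to zero.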

For the tetrahedron shown in FIGS. 4A and 4B, each side faces the three other sides, measuring an inverse effect on those faces. Additional circuitry (not shown) may be included in the exemplary sensor configuration. An analog sample and hold is provided for each of the analog sensor leads. For the exemplary origami sensor of FIGS. 2, 3A and 3B, the leads are:

1. A free - for pressure discrimination

2. A weighted - for motion discrimination

3. B free - pressure

4. B weighted - motion

5. C free - pressure

6. C weighted - motion

For a tetrahedral sensor (FIGS. 4A and 4B), the leads are:

1. A tension

2. B tension

3. C tension

4. D tension

For a cubic sensor (FIGS. 5A and 5B), the leads are:

1. A0 tension

2. A1 tension

3. B0 tension

4. B1 tension

5. C0 tension

6. C1 tension

FIG. 6 shows a tetrahedral edge sensor 600. Sensor 600 has six edges to detect pressure directly from a squeeze of the deformable elastomeric material of cover 102. Optionally, sensor 600 may have weighted edges 602 to detect motion through inertial force whenever the pointing device 100 is moved. The weights 610 increase the momentum of the sensors 602, so that the deformation of the sensors is exaggerated. Alternatively, weights may be placed at each node 608.

Sensor 600 is used in conjunction with an application driver that can discriminate pressure from motion. The driver can discriminate pitch, yaw, and roll. A plurality of signal leads 604 are provided, one for each edge of the tetrahedron 600. The outputs 604 of the sensor either connect to multiplexing/conversion circuitry 110 or lead out as separate analog lines. There is an input lead from each edge of the tetrahedron. Optionally, each connecting node 608 broadcasts a wireless signal of its state without requiring any lead wire. As shown in FIG. 6 by dotted lines 606, non-sensing leads connect each node 608 to another edge 602. The non-sensing leads 606 either follow a sensing edge 602 to another node 608 or may run directly to another node 608. The connecting nodes 608 for leads 606 and sensing edges 602 may simply be vertices for an origami tetrahedron, or connecting nodes for self-supporting edge sensors.

In one embodiment, a sensor similar to that used in the "GYROPOINT™" mouse may be used.

In another embodiment, a sensor such as the "ACH-04-08-05" accelerometer/shock sensor from Measurement Specialties, Inc. of Valley Forge PA may be used. The ACH-04-08-05 sensor has three piezoelectric sensing elements oriented to measure acceleration in x, y and z linear axes, and uses very low power. Alternatively, an "ACH-04-08-01" sensor by Measurement Specialties, Inc. may be used to measure linear acceleration in the x and y directions and rotation about the z axis.

In addition to the motion detection capability, pointing device 100 also includes a pressure sensing capability. An internal pressure sensor is included, which may be, for example, a thin film piezo sensor that detects a bending of a strip, or a sensor such as the "MTC Express™" from Tactex Controls Inc. of Cerritos, CA. This control surface is pressure sensitive, and also senses multiple points of contact, using a fiber optic-based, pressure sensitive, Smart Fabric called "KINOTEX™." This fabric is constructed of cellular urethane or silicone sandwiched between protective membranes. The fabric has an embedded fiber optic array that generates an optical signal when the material is touched, or when force is applied. The coordinates and area of the pressure, and its magnitude, can be determined from the received signals. The material can generate a pressure profile of the shape of a hand touching it.

As noted above, the pointing device 100 may include a circuit that outputs a unique identifier for each pointing device. There are many applications for this feature. For example, the feature allows the pointing device 100 to serve a user identification and/or authentication function. A user could carry around his or her pointing device with a unique identifier output signal, and connect it to any desired computer having an Internet connection. The application program could request a remote server to download previously established software preferences or upload the user's current software preferences. This enables a person to log on at any computer connected to the Internet, access his or her software (e.g., virtual puppet and stage application), or download web bookmarks. Alternatively, the user could use the pointing device 100 as a key to access copy protected software, functioning like a parallel port dongle. Thus, the use of a pointing device 100 with a built-in unique identifier enhances the user's mobility and access to services. Further, the pointing device may be used for authentication at any computer to which it is connected. For example, because the unit uniquely identifies itself, the unit can be used in a way that allows it to bind to each individual user. For example, the unique identifier may be used in combination with a password, IP address or a user signature. For example, the user may sign his or her name using the pointing device to control movement of a virtual writing implement. The user's handwriting may be recognized using a conventional handwriting recognition algorithm, such as any of those described in U.S. Patents 5768423, 5757959, 5710916, 5649023 or 5553284, all of which are expressly incorporated by reference herein in their entireties. The combination of the unique device identifier and the user signature provides reliable authentication.
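As a sketch of the identifier-plus-signature idea, a combined credential could be derived by hashing the device ID together with features extracted from the recognized signature. Everything here (the feature dictionary, the hash construction, the field widths) is a hypothetical illustration, not the patent's scheme.

```python
import hashlib

def auth_token(device_id, signature_features):
    """Bind a unique device identifier to a user's recognized
    signature by hashing the two together.  `signature_features`
    stands in for whatever a handwriting recognizer would extract."""
    h = hashlib.sha256()
    h.update(device_id.to_bytes(8, "big"))
    # Sort the features so the same signature always hashes the same way.
    h.update(repr(sorted(signature_features.items())).encode())
    return h.hexdigest()
```

The same device and signature always reproduce the same token, while a different device yields a different one, giving the two-factor binding described above.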

Further still, the unique identifier may be used to make the user's preferences more portable. For example, the identifier may be associated with the design (i.e., shape, color, and the like) of the cover 102. The identifier for the pointing device may also be associated with different shapes and colors of cursors. The cursor that appears on the screen may have the same shape or color as the cover 102 of the input device 100, which is associated with a known identifier. For example, a blue spherical pointing device may be associated with a round, blue cursor, a puppet shaped pointing device may be associated with a puppet shaped cursor, an alien-shaped pointing device may be associated with an alien-shaped cursor, and so on.

Alternatively, the input device packaging may be completely different from the shape and color of the computer cursor or puppet object. If the puppet application program is stored in a server that is accessible via the Internet, another local area network (LAN) or wide area network (WAN), or a wireless service such as CDMA, the user may select any of a plurality of different puppets from a gallery. This is one of the user preferences that may follow the user from client computer to client computer.

VIRTUAL PUPPET SYSTEM

FIG. 7 shows a computer system 700 according to another aspect of the invention. The computer system 700 includes a computer 701 having a processor 702, a memory 706, a display 708, and an input port 712.

At least one computer input pointing device 100, as described above with reference to FIG. 2, is coupled to the input port 712 of the computer 701. The pointing device 100 has at least one motion detector 108, capable of detecting motion in at least three dimensions, and at least one pressure sensor capable of sensing pressure quantitatively. A multiplexer 110 receives output signals from the motion detector 108 and the pressure sensor, and outputs a signal for transmission to the computer 701. Alternatively, the lead from each sensor may output an analog signal.

The computer 701 further comprises a storage device 704, which may be a hard disk drive, a high capacity removable disk drive (e.g., "ZIP" drive by Iomega Corporation), a read only memory (ROM), CD-ROM drive, a digital versatile disk (DVD) ROM drive, or the like. Either the memory 706 or the storage device 704 has stored therein computer program code for causing the computer to show an object 710 on the display 708. The object has a plurality of portions 710a-710d that are selectively movable in response to motion or squeezing of the pointing device 100. The exemplary object is an animated character 710 controlled by input parameters from the pointing device. Such an animated character is referred to herein as a "virtual puppet". The virtual puppet has a plurality of independently movable body parts 710a-710d. Although the example only shows movable arms and legs, any part of the virtual puppet 710 may be movable.

According to another aspect of the invention, a method is provided for controlling a computer. The method includes the steps of: receiving a plurality of signals from a computer input pointing device 100, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to an output device, the output signal having at least two characteristics that are capable of being varied separately from each other.

FIG. 10 shows an exemplary method for operating the computer 701. The method includes the steps of:

• receiving a plurality of signals from a computer input pointing device 100, the signals representing quantifiable pressure and motion in at least three dimensions;

• displaying a virtual puppet 710 on the display 708, the virtual puppet having at least two portions 710a-710d that are capable of being moved separately from each other; and

• showing movement of the portions 710a-710d of the virtual puppet 710 in response to the pressure and motion signals.

At step 1010, the animated character 710, figure or other object is displayed on the display 708. At step 1020, when the user squeezes pointing device 100, steps 1030-1050 are executed. At step 1030, the driver module 904 updates the idealized pressure value Pi and translates Pi into the normalized pressure value P. At step 1040, the application program processes the normalized pressure P into an animated figure action. At step 1050, the application updates the animated figure according to the action, and displays the result on the display 708.

At step 1060, when the user moves the pointing device 100, steps 1070-1080 are executed. At step 1070, the driver module 904 of the software updates the idealized motion vector Mxi, Myi, Mzi and translates the vector into a normalized motion vector Mx, My, Mz. At step 1080, the application program processes the motion vector Mx, My, Mz into animated figure motion, and outputs the moving figure to the display 708.
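The translation steps 1030 and 1070 (idealized readings into normalized values) might look like the following sketch. The voltage range and motion scale are invented calibration constants, and the function names are assumptions; the patent does not specify the driver's interface.

```python
def normalize_pressure(p_ideal, p_min=0.0, p_max=5.0):
    """Step 1030: map the idealized pressure value Pi onto the
    normalized value P in [0, 1].  The 0-5 input range is a made-up
    calibration."""
    return min(1.0, max(0.0, (p_ideal - p_min) / (p_max - p_min)))

def normalize_motion(m_ideal, scale=10.0):
    """Step 1070: map the idealized vector (Mxi, Myi, Mzi) onto the
    normalized vector (Mx, My, Mz), clamped to [-1, 1] per axis."""
    return tuple(min(1.0, max(-1.0, v / scale)) for v in m_ideal)
```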

An exemplary method for operating the pointing device 100 comprises the steps of:

1. moving the pointing device 100;

2. transmitting at least three motion signals from the pointing device 100 to a computer 701, the motion signals including signals representing translations in one or more dimension, or rotations in one or more dimensions, or a combination of translations and rotations;

3. squeezing at least one portion of the pointing device 100; and

4. transmitting quantifiable pressure signals from the pointing device 100 to the computer 701.

Preferably, all parameters measured by the sensors 108 in the pointing device 100 are mapped to motions of the virtual puppet 710, such as the mouth opening and closing, the motion of limbs, or the expression of emotional intensity. The pressure measurement is particularly suitable for providing quantitative inputs to the computer, to create results having continuously variable intensity. Thus, a light squeeze may result in a small kick, whereas a hard squeeze may result in a high kick.
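The squeeze-to-intensity mapping described above can be sketched in a few lines of Python; the linear scaling, the 0-to-1 normalized range, and the function name are illustrative assumptions rather than anything specified in the text:

```python
def kick_height(pressure, max_height=1.0):
    """Map a normalized squeeze pressure P (assumed 0.0 to 1.0) to a kick height.

    A light squeeze yields a small kick; a hard squeeze a high kick.
    The linear mapping here is an illustrative assumption.
    """
    pressure = max(0.0, min(1.0, pressure))  # clamp to the normalized range
    return pressure * max_height
```

Any monotonic curve (e.g., quadratic) would serve equally well; the essential point is that the output intensity varies continuously with the measured pressure, unlike a binary button.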

The example of FIG. 7 includes a hub 116 and at least a second computer input pointing device 100 capable of sensing motion in at least three dimensions and a pressure sensor. Both the first and second computer input pointing devices 100 are connected to the hub 116, which in turn is connected to the input port of the computer 701. Each of the first and second computer input pointing devices 100 controls a respective virtual puppet 710 and 711.

As shown in FIG. 8, the computer program code includes a range of motion tree 800 for the virtual puppet 710. The range of motion tree 800 restricts a range of possible motion of the virtual puppet 710 about a current position of the virtual puppet. Preferably, the computer program code includes a respective unique range of motion tree for each of a plurality of users.

The exemplary range of motion tree 800 covers head, left and right shoulder, left and right elbow, left and right wrist, and left and right grip. The tree limits the range of motion from each node. For example, the nodes may be delimited as follows:

1. Head Hxyz

2. Shoulder Sxyz

3. Elbow Exyz

4. Wrist Wxyz

The limits may be expressed in rectangular coordinates, such as (x0,y0,z0), (x1,y1,z1). Alternatively, the limits may be expressed as delta ranges. For some objects, it may be more convenient to express the limits in polar coordinates, which have a 1:1 mapping to rectangular coordinates.

The range of motion tree 800 can express a hierarchy of motions. For example, in the case of the head, the range of motion may be expressed as: (Hx0, Hx1), (Hy0, Hy1), (Hz0, Hz1) => H*, where H* denotes the range of motion.

For the shoulder, the range of motion may be expressed as: (Sx0, Sx1), (Sy0, Sy1), (Sz0, Sz1) => S*

One of ordinary skill can readily see that similar ranges of motion can be specified for the elbow, wrist, and grip.

All motion then extends the range of the tree H*.S*.E*.W*.G*.
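A minimal Python sketch of such a range of motion tree, assuming a nested-dictionary representation with per-axis (min, max) limits; every limit value below is illustrative, not taken from the patent:

```python
def clamp(v, lo, hi):
    """Clamp v into the closed interval [lo, hi]."""
    return max(lo, min(hi, v))

# Hypothetical range-of-motion tree for the hierarchy H*.S*.E*.W*.G*;
# each node stores (min, max) limits per axis, and children extend the chain.
motion_tree = {
    "head": {
        "limits": ((-1.0, 1.0), (-0.5, 0.5), (-0.3, 0.3)),
        "children": {},
    },
    "shoulder": {
        "limits": ((-2.0, 2.0), (-1.0, 1.0), (-1.0, 1.0)),
        "children": {
            "elbow": {
                "limits": ((-1.5, 1.5), (-1.0, 1.0), (-0.5, 0.5)),
                "children": {},  # wrist and grip would continue the chain
            },
        },
    },
}

def constrain(node, xyz):
    """Clamp a requested (x, y, z) position to the node's allowed range."""
    return tuple(clamp(v, lo, hi) for v, (lo, hi) in zip(xyz, node["limits"]))
```

Each node in the tree restricts the motion of the corresponding puppet joint; walking the tree from the root applies the hierarchy of limits described above.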

State information restricts the range of possible motion about the last position of the puppet. The state information is mapped to a user model, which further restricts the possible ranges of motion. For example,

Hxyz:H* => (Hx00, Hx11), (Hy00, Hy11), (Hz00, Hz11)

A user model adds probabilities for motion at each node of the tree. For example, the user model may identify whether the user moves her shoulders or shakes her wrists. In an exemplary user model for "User n", the probabilities may be expressed as:

Un: (x0,x1), (y0,y1), (z0,z1)

As an alternative to the tree search, the probabilities of motion may be mapped to a voxel space, precompiled with user probabilities: three-dimensional motion space is broken up into small cubes, analogous to three-dimensional pixels. Each voxel point can be ascribed a probability, and each voxel contains a range of motion probability. Voxels may also include application restrictions on the range of motion probability.
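The voxel precompilation might be sketched as follows in Python; the voxel edge length and the frequency-count estimate of probability are assumptions:

```python
import math

VOXEL_EDGE = 0.1  # hypothetical edge length of one voxel cube

def voxel_index(x, y, z):
    """Quantize a point in 3-D motion space to integer voxel coordinates."""
    return tuple(math.floor(v / VOXEL_EDGE) for v in (x, y, z))

def build_voxel_probabilities(samples):
    """Precompile per-voxel motion probabilities from observed positions.

    Each observed (x, y, z) sample increments the count for its voxel;
    normalizing by the total yields a likelihood coefficient per voxel.
    """
    counts = {}
    for p in samples:
        v = voxel_index(*p)
        counts[v] = counts.get(v, 0) + 1
    total = len(samples)
    return {v: n / total for v, n in counts.items()}
```

Application restrictions could then be imposed by zeroing the probability of any voxel outside the allowed range of motion.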

Another task for the software is identifying the orientation of the pointing device, to allow use of the orientation as a parameter. In one embodiment, conformation of curves is used as a process of matching the signal curve over time from the three-dimensional sensor space to three-dimensional application space. This is analogous to a three-dimensional jigsaw puzzle or Chinese wood block puzzle, in which a piece may be rotated in any of three dimensions. Using voxel space, it is possible to constrain the orientation of the sensor motion curve to the possible ranges from the "range of motion tree" 800, and thus to the ranges allowed by the application space. It is further possible to find a most likely orientation of the sensor motion curve based on the probabilities of each motion, represented as likelihood coefficients for each voxel point.

Orientation can be determined by software. The exemplary software uses conformation mapping of curves of X, Y, Z (and optionally P) to the range of motions of either the left or right hand. Mapping could emanate from an explicit model or from trainable classifiers, such as neural nets. An application may request the user to orient the pointing device 100 by holding the device in his or her left or right hand and selecting items on the display 708.

The user either controls an object on the display 708, or navigates in space, where the user acts as the object, with a point of view that shifts. In either case, the range of object motion may be less than the range of motion of the pointing device 100. Thus, the software maps the actual gesture of the input device to the best fitting object motion.

A simple example is controlling a cartoon character that merely has a range of motion of stage-left and stage-right. Here, the range of motion of the puppet is constrained to two dimensions. Several aspects of control should be considered to gauge its effect in the virtual world. For example: whether the pointing device is operated as a left-handed device or a right-handed device; whether the line 112 is sticking up or down; the current model or user range of motions; the current state of pressure; and the current desired range of controlled object motion. Controlled objects on the display 708 generally have a limited destination range.

As an alternative to the range of motion tree 800, one can require manual orientation of the pointing device 100. For example, the application can show a picture of the pointing device 100 with the lead 112 pointing down to the floor. One hemisphere of the squeezable, deformable cover may have special markings that face away from the display 708. This technique eliminates the need for conformation of curves, since the orientation is 1:1, puppet to application.

FIG. 9 is a block diagram showing the virtual puppet modules and their connections to each other and the underlying software layer. The user module 900 contains the states of:

1. grasp - The user is picking up the device.

2. orient - The user is matching the device orientation to the application and screen feedback.

3. squeeze - The user is pressing in on the squishy covering.

4. move - The user is moving the device.

5. release - The user is putting down the device.

6. state - The module reports the user state to other modules.

A sensor module 902 has processes A, B, C, D corresponding to each sensor output, whether analog or digital.

A driver module 904 translates sensor values to application values. The driver module 904 may use a trainable classifier, such as a neural network, where each sensor output has correlating inputs to the driver's idealized X, Y, Z, P. The driver module 904 correlates user states to ranges of sensor input. The states, "grasp," "orient," "squeeze," "move," and "release" each have a characteristic effect on each of the sensors.

For instance, "grasp" may have a pressure curve, as the user grabs and picks up the puppet. As another example, "squeeze" may have a longer continuous pressure curve when compared to grasp. Each instance of state may have a separate set of coefficients for a trainable classifier.
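A heuristic stand-in for such a classifier, discriminating "grasp" from "squeeze" by the duration of the pressure curve, could look like this (the noise floor, sample interval, and duration threshold are all illustrative assumptions):

```python
def classify_pressure_curve(samples, dt=0.01, grasp_max_s=0.3, noise_floor=0.1):
    """Guess the user state from a pressure-over-time curve.

    A brief transient is read as "grasp" (picking the device up); a longer
    sustained curve as "squeeze"; no pressure at all as "release".
    A trainable classifier would replace these hand-set thresholds.
    """
    active = [p for p in samples if p > noise_floor]  # samples above noise
    if not active:
        return "release"
    duration = len(active) * dt  # seconds of sustained pressure
    return "grasp" if duration <= grasp_max_s else "squeeze"
```

In the trainable-classifier formulation described in the text, each state would instead carry its own coefficient set, learned from examples of that user's grasps and squeezes.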

The software correlates the application state to the domain of sensor output, to reduce the set of possible outcomes. For instance, a two-dimensional stage for cartoon characters may be limited to "stage left" and "stage right". The program updates the user module state from the application state and application position.

The application module 906 accepts X, Y, Z, and P data from the driver module 904, and reports the application X, Y, Z, P state to the driver module as state information. Optionally, the application module 906 reports user state information, which is passed through the driver module 904 to the user module 900.

A mouse module 908 is a special program for emulating a mouse. The mouse module 908 uses a special application module for the report state. The mouse module 908 translates X, Y movement of the pointing device 100 to mouse X, Y motion, and translates the Z and P parameters of the pointing device 100 to mouse left click, right click, and double click.
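The mouse translation might be sketched as follows; the pressure threshold and the Z-sign convention for choosing the left or right button are assumptions:

```python
def emulate_mouse(mx, my, mz, p, press_threshold=0.5):
    """Translate pointing-device X, Y, Z and pressure P into a mouse event.

    X/Y motion becomes cursor motion; Z and P are mapped to button
    activity. The threshold and the Z-sign convention (positive Z for
    left, negative for right) are illustrative assumptions.
    """
    event = {"dx": mx, "dy": my, "button": None}
    if p > press_threshold:  # a firm squeeze acts as a button press
        event["button"] = "left" if mz >= 0 else "right"
    return event
```

A double click would follow from two such press events arriving within a short time window, tracked by the caller.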

FIGS. 11-13 show the transformations that are performed by various exemplary software modules, depending on the specific types of sensors 902 included in the pointing device 100.

FIG. 11 covers the example described above, in which the motion detector measures x, y, and z accelerations, and pressure is also measured. The sensors 902 include motion sensors 1100 and pressure sensor 1110. A program Sm 1120 translates the raw output of sensor 1100 to an idealized motion vector, Mxi, Myi, Mzi, 1140 that is provided to the driver 904. A second translator Tm 1160 translates the idealized motion to an application motion vector Mx, My, Mz 1180, which is provided to the application 906. A program Sp 1130 translates the raw output of sensor 1110 to an idealized pressure Pi, 1150 that is provided to the driver 904. A second translator Tp 1170 translates the idealized pressure Pi to an application pressure P 1190, which is provided to the application 906.
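The two-stage translation of FIG. 11 (a sensor program producing idealized values, then a translator producing application values) can be sketched as function composition. The linear gain and orientation sign vector below are illustrative assumptions; a real driver would use per-user calibration coefficients or a trainable classifier:

```python
SENSOR_GAIN = 0.5        # hypothetical raw-count to idealized-unit factor
ORIENTATION = (1, 1, 1)  # hypothetical per-axis sign from the orientation step

def Sm(raw_xyz):
    """Sensor program 1120: raw accelerometer counts -> idealized Mxi, Myi, Mzi."""
    return tuple(SENSOR_GAIN * v for v in raw_xyz)

def Tm(ideal_xyz):
    """Translator 1160: idealized motion -> application motion Mx, My, Mz."""
    return tuple(s * v for s, v in zip(ORIENTATION, ideal_xyz))

def Sp(raw_p):
    """Sensor program 1130: raw pressure reading -> idealized pressure Pi."""
    return SENSOR_GAIN * raw_p

def Tp(pi, p_max=1.0):
    """Translator 1170: idealized pressure -> normalized application pressure P."""
    return min(pi, p_max)
```

The application then receives `Tm(Sm(raw_motion))` and `Tp(Sp(raw_pressure))`, matching the driver 904 to application 906 data flow in the figure.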

FIG. 12 covers a second example, in which an origami detector measures x, y, and z accelerations, and pressure. The sensors 902' include unweighted sensors 1200 and weighted sensors 1210. A program Sm 1220 discriminates between unweighted and weighted sensors to produce an idealized motion vector, Mxi, Myi, Mzi, 1240 that is provided to the driver 904'. A second translator Tm 1260 translates the idealized motion to an application motion vector Mx, My, Mz 1280, which is provided to the application 906'. A program Sp 1230 discriminates between unweighted and weighted sensors to produce an idealized pressure Pi, 1250 that is provided to the driver 904'. A second translator Tp 1270 translates the idealized pressure Pi to an application pressure P 1290, which is provided to the application 906'.

FIG. 13 covers a third example, in which a tetrahedral detector measures x, y, and z accelerations; x, y and z rotations; and pressure. The sensors 902" include tetrahedral sensor 1300. A program Sm 1310 translates the output of sensor 1300 to an idealized motion vector, Mxi, Myi, Mzi, 1340 that is provided to the driver 904". A second translator Tm 1370 translates the idealized motion to an application motion vector Mx, My, Mz 1371, which is provided to the application 906". A third translator Sr 1320 translates the sensor information to an idealized rotation vector Rxi, Ryi, Rzi. A fourth translator Tr 1380 translates the idealized rotation Rxi, Ryi, Rzi to an application rotation Rx, Ry, Rz 1350, which is provided to the application 906". A fifth translator Sp 1330 translates the output of sensor 1300 to an idealized pressure Pi, 1360 that is provided to the driver 904". A sixth translator Tp 1390 translates the idealized pressure Pi to an application pressure P 1391, which is provided to the application 906".

FIG. 14 is a flow chart diagram of an exemplary method for operating the pointing device 100.

At step 1419, the system waits for the user to take an action, i.e., a motion or a squeeze. The system returns to the wait state 1419 after each action, and can transition from the wait state 1419 to any of the states, "grasp" 1420, "orient" 1432, "squeeze" 1437, "move" 1442 or "release" 1447.

At step 1420, the user grasps the device. At step 1421 the sensor 902 registers the motion. At step 1422, the driver module 904 wakes up in "grasp mode".

At step 1423, if this is the first session for the pointing device 100, then step 1424 is executed, and the device is calibrated. Control is transferred to step 1425. At step 1425, if this is the first time the user is using the pointing device 100, then at step 1426, the user registers himself or herself. An association is thus formed between the user and the particular pointing device, making use of the (optional) unique identifier chip in the pointing device 100. Control is transferred to step 1427. At step 1427, if a login is required, then step 1428 is executed. At step 1428, the user logs in, either by using the keyboard, or by a unique motion with the pointing device 100 (such as writing the user's signature). Control is transferred to step 1429.

At step 1429, the system checks whether there are user-specific coefficients available on the system for translating this specific user's style of motion, rotation and pressure from raw sensor data to idealized measurement data. If data are available, then step 1430 is executed. At step 1430, the default coefficients are replaced by the previously determined user-specific coefficients. Control is transferred to step 1431. At step 1431, the user login is translated to rotation and orientation coefficients. The system returns to the wait state 1419.

At step 1432, the user orients the device. At step 1433, a rotation calibration is initiated by asking the user to orient the device in one or more predetermined positions. At step 1434, a stream of data points is recorded. At step 1435, conforming X, Y and Z rotations are constructed. At step 1436, a determination is made whether sufficient data have been collected to conform the rotation measurements to the actual rotations. If not, then control is returned to step 1433. Otherwise, the system returns to the wait state 1419.

At step 1437, the user squeezes the pointing device 100. At step 1438, the driver module 904 updates the idealized pressure Pi, and translates the same to the normalized pressure P. At step 1439, the pressure is provided to the application, which processes the pressure. At step 1440, if the application is finished with processing the squeeze operation, then control returns to step 1419 (the wait state). If not, then step 1441 is executed, to determine whether driver module 904 detects a lull or pause in the squeezing. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1438 is executed again.

At step 1442, the user moves the pointing device 100. At step 1443, the driver module 904 updates the idealized motion vector Mxi, Myi, Mzi, and translates the same into the normalized motion vector Mx, My, Mz. At step 1444, the application program processes the normalized motion vector Mx, My, Mz. At step 1445, if the application is done with processing the movement of the pointing device 100, then control transfers to step 1419 (wait state). Otherwise, step 1446 is executed, to determine whether the driver detects a lull in the motion. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1443 is executed again.

At step 1447, the user releases the pointing device 100. At step 1448, the driver module 904 enters a sleep state. At step 1449, the application is notified that the pointing device has entered the sleep state.
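The wait-state loop of FIG. 14 can be sketched as a small state machine; the handler bodies are elided and the class name is an assumption:

```python
class DriverStateMachine:
    """Sketch of the FIG. 14 wait-state loop; handler behavior is elided."""

    ACTIONS = {"grasp", "orient", "squeeze", "move", "release"}

    def __init__(self):
        self.state = "wait"  # the system idles in the wait state 1419
        self.history = []

    def dispatch(self, action):
        """Transition from wait to an action handler, then back to wait."""
        if action not in self.ACTIONS:
            raise ValueError(f"unknown action: {action}")
        self.state = action
        self.history.append(action)
        # ... action-specific handling (steps 1420-1449) would run here ...
        self.state = "wait"  # every handler returns control to the wait state
```

This mirrors the figure's structure: every action ("grasp", "orient", "squeeze", "move", "release") is reachable from the wait state, and every handler returns there when it finishes or detects a lull.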

Listed below is a set of pseudo-code that may be used to construct a software system suitable for handling input signals from the exemplary pointing device 100.

Definition of Terms

A, B, C, D, E, F, G sensor strips oriented along a sensing plane

Au, Bu,... unweighted portion of sensor

Aw, Bw,... weighted portion of sensor

Mxi, Myi, Mzi idealized (un-normalized) X, Y, Z value

Mx, My, Mz, Mp normalized X, Y, Z values used by application

Ox, Oy, Oz orientation vector to translate idealized to normalized X, Y, Z values

Pi idealized pressure value

P normalized pressure value

Sm translate sensor outputs to idealized motion

Sr translate sensor outputs to idealized rotational motion

Sp translate sensor outputs to idealized pressure

Tm translate idealized motion to application motion

Tp translate idealized pressure to application pressure

Tr translate idealized rotational motion to application rotational motion

Accelerometer ABC with pressure D

Sm: A maps to Mxi, B maps to Myi, C maps to Mzi

Sp: D maps to Pi

Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)

Tp: Pi maps to P

Origami sensor with ABC flaps, each flap having an un-weighted (u) and weighted (w) section

Sp: (Au, Bu, Cu) maps to (Pi)

Sm: (Aw, Bw, Cw, Pi) maps to (Mxi, Myi, Mzi)

Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)

Tp: Pi maps to P

Tetrahedral sensor ABCDEF with six strips

Sp: (ABCDEF) maps to Pi

Sm: (ABCDEF, Rxi, Ryi, Rzi) maps to Mxi
    (ABCDEF, Rxi, Ryi, Rzi) maps to Myi
    (ABCDEF, Rxi, Ryi, Rzi) maps to Mzi

Sr: (ABCDEF, Mxi, Myi, Mzi) maps to Rxi
    (ABCDEF, Mxi, Myi, Mzi) maps to Ryi
    (ABCDEF, Mxi, Myi, Mzi) maps to Rzi

Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)

Tr: (Rxi, Ryi, Rzi) rotationally maps to (Rx, Ry, Rz)

Tp: Pi maps to P

Calibration
    ask user to center device
    user moves device; wait for activity, then a lull or a press

    ask user to trace a shape on the X-Y axis
    user moves device while software collects ABCD points
    translate ABCD data points to Ox, Oy sensor coefficients

    ask user to move in and out on the Z axis
    user moves device in and out while software collects ABCD points
    translate ABCD points to Oz sensor coefficients

Registration
    after calibration, ask user to create a login signature
    determine cultural orientation (left-right vs. right-left, etc.)
    accept user's input while software collects ABCD points
    translate input to Ox, Oy, Oz over time

Login
    ask user to reproduce login signature
    set orientation based upon cultural left-right vs. right-left context
    replaces manual orientation (drawing a shape on the X-Y axis)

Orientation
    if manual orientation
        ask user to move input device to center of area,
        with cord facing one direction (the floor) and front face facing another
        set Ox, Oy, Oz to (0,0,0)
    else
        while waiting for input device to stop, or for pressure state to signal squeeze state
            save last few Mxi, Myi, Mzi states

Although the exemplary application program described above is a virtual puppet program, there are many uses for a pointing device 100 according to the present invention. For example, the output of the pointing device may be mapped to the parameters of a visual synthesizer, such as the shifting of a color palette, the drawing or motion of an object. Alternatively, the output of device 100 may be mapped to parameters of a sound synthesizer, such as pitch, timbre, amplitude, or placement in three-dimensional sonic space.

Although the exemplary virtual puppet application is configured so that one pointing device 100 controls a single virtual puppet 710, the software can be configured to operate more than one virtual puppet (or other object) using a single input pointing device 100. For example, assuming that the pointing device includes sensors for measuring x, y, and z accelerations and pressure P, the first puppet 710 may be controlled by the x and y accelerations, and the second puppet 711 may be controlled by z and P. The first and second objects need not be identical objects, or even objects of the same type.

For example, the first object may be a virtual puppet, and the second object may be the background scenery or sound. By x and y movements, the puppet 710 is controlled, and by z and P, the user can vary the background from day to night, or from silence to the sound of thunder.
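The splitting of one device's four parameters across two controlled objects might be sketched as a simple demultiplexer; the command names are illustrative:

```python
def demux(mx, my, mz, p):
    """Split one device's four parameters across two controlled objects.

    The puppet receives the x/y accelerations; the background scene
    receives z and P (here read as day-to-night shading and a thunder
    volume). The grouping follows the text; the names are assumptions.
    """
    puppet_cmd = {"dx": mx, "dy": my}
    scene_cmd = {"daylight": mz, "thunder_volume": p}
    return puppet_cmd, scene_cmd
```

The same pattern extends to any pairing of objects: each object's controller consumes a disjoint subset of the device's output parameters.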

To use two input devices 100, software applications can use the Mouse API for selecting or navigating, or use a Driver API. For example, the driver for Digitizer (or Graphics) tablets may allow more than one pen to be used at the same time. Emulating a mouse can be accomplished by installing a driver; going beyond a mouse can be accomplished by modifying the application program. One can also overlay other input devices, such as a graphics tablet, with yet another driver.

Further, the output signals from the exemplary pointing device 100 may be mapped to those of conventional input devices in an emulation mode. For example, the outputs may be mapped to the x and y positions of a conventional mouse and the left or right click signals from depressing the mouse buttons. Mapping the x, y and z (or x, y and P) parameters into x and y parameters only may require the computer to determine the orientation of the pointing device 100, such as moving on a predominantly x-y plane, a y-z plane, or the z-x plane, or combinations of all three. Alternatively, individual user profiles may be used to determine the current orientation.

Although the exemplary use for two pointing devices is to control two different virtual puppets, a second pointing device may be used to expand the number of parameters for an application. For example, in an audiovisual application, one pointing device 100 may control various aspects of the video signal, while another pointing device may be used to control the audio signals.

Although the exemplary pointing device is sensitive to motion and pressure, additional sensors may also be included. For example, a microphone may be incorporated into the pointing device, for operation of voice activated software, or for storage of sounds.

Although the exemplary embodiment described above has the virtual puppet application program running locally in the computer 701, the application program may be executed in other processors located remotely, either as a sole process or in parallel with other instances of the application program on other processors. The local computer can communicate with other computers via a telecommunication network, which may be a LAN, WAN, Internet, or wireless protocols.

The present invention may be embodied in the form of computer-implemented processes and apparatus for practicing those processes. The present invention may also be embodied in the form of computer program code embodied in tangible media, such as floppy diskettes, read only memories (ROMs), CD-ROMs, hard drives, ZIP™ drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. The present invention may also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over the electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits.

Although the invention has been described in terms of exemplary embodiments, it is not limited thereto. Rather, the appended claims should be construed broadly, to include other variants and embodiments of the invention that may be made by those skilled in the art without departing from the scope and range of equivalents of the invention.

Claims

What is claimed is:
1. A hand-held computer input pointing device, comprising: at least one motion detector, the at least one motion detector being capable of detecting motion in at least three dimensions; and at least one pressure sensor capable of sensing pressure quantitatively, wherein the input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.
2. The pointing device of claim 1, wherein the at least one motion detector includes at least one of the group consisting of an accelerometer and a piezoelectric device.
3. The pointing device of claim 2, wherein the at least one motion detector includes at least three piezoelectric strips oriented in three orthogonal directions.
4. The pointing device of claim 3, wherein each of the three piezoelectric strips has a respective weight thereon to translate motion to a voltage.
5. The pointing device of claim 3, wherein the piezoelectric device is an origami sensor having cut coils, the origami sensor having either a cubic or tetrahedral shape.
6. The pointing device of claim 1, wherein the motion sensor includes a tetrahedral shaped device having six edge sensors.
7. The pointing device of claim 1, wherein the motion detector and pressure sensor are contained within a squeezable housing.
8. The pointing device of claim 7, wherein the housing is an elastomeric ball.
9. The pointing device of claim 1, wherein the motion sensor is capable of sensing pitch, yaw and roll.
10. The pointing device of claim 1, wherein the motion sensor is capable of sensing x- acceleration, y-acceleration and z-acceleration.
11. The pointing device of claim 1, further comprising a memory device that stores and outputs an identifier that is unique to the computer input pointing device.
12. A computer system comprising: a computer having a processor, a memory, an output device, and an input port; and a hand-held computer input pointing device coupled to the input port of the computer, the pointing device comprising: at least one motion detector, the at least one motion detector being capable of detecting motion in at least three dimensions; and at least one pressure sensor capable of sensing pressure quantitatively, wherein the input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.
13. The computer system of claim 12, wherein: the output device is a display; the computer further comprises a storage device, and one of the group consisting of the memory and the storage device has stored therein computer program code for causing the computer to show an object on the display, the object having a plurality of portions that are selectively movable in response to motion or squeezing of the pointing device.
14. The computer system of claim 13, wherein the object is an animated character controlled by input parameters.
15. The computer system of claim 14, further comprising a hub and at least a second computer input pointing device capable of sensing motion in at least three dimensions and a pressure sensor, the first and second computer input pointing devices being connected to the USB hub, wherein each of the first and second computer input pointing devices controls a respective animated character.
16. The computer system of claim 14, wherein the computer program code includes a range of motion tree for the animated character, the range of motion tree restricts a range of possible motion of the animated character about a current position of the animated character.
17. The computer system of claim 16, wherein the computer program code includes a respective unique range of motion tree for each of a plurality of users.
18. The computer system of claim 16, wherein the computer program code includes a module for using output signals from the computer input pointing device to emulate output signals from a mouse.
19. The pointing device of claim 4, wherein each of the three piezoelectric strips has at least two sections, including a weighted section and an unweighted section.
20. The pointing device of claim 5, wherein the cut coils on each face of the motion detector are attached to a single weight at a center of the motion detector.
21. The pointing device of claim 1, further comprising a multiplexer that receives output signals from the motion detector and the pressure sensor, for outputting a multiplexed signal for transmission to a computer, wherein the motion detector, pressure sensor and multiplexer are contained within a squeezable housing.
22. The pointing device of claim 7, wherein the squeezable housing has at least one tactile feature to provide a tactile cue for orienting the pointing device.
23. The pointing device of claim 7, wherein the squeezable housing has at least one visible feature to provide a visual cue for orienting the pointing device.
24. A method for operating a computer having an output device, comprising the steps of: receiving a plurality of signals from a computer input pointing device, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to the output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
25. The method of claim 24, wherein: the output device is a display; the step of outputting includes displaying an animated character on the display, and moving at least two portions of the animated character separately from each other.
26. A method for operating a computer input pointing device, comprising the steps of: moving the pointing device; transmitting at least three motion signals from the pointing device to a computer, the motion signals including signals representing translations in one or more dimensions, or rotations in one or more dimensions, or a combination of translations and rotations; squeezing at least one portion of the pointing device; and transmitting quantifiable pressure signals from the pointing device to the computer.
27. The method of claim 26, further comprising the steps of: speaking into a microphone contained within the pointing device; and transmitting an audio signal from the microphone to the computer.
EP00926157A 1999-04-20 2000-04-20 Human gestural input device with motion and pressure Withdrawn EP1250698A2 (en)


Publications (2)

Publication Number Publication Date
EP1250698A2 true EP1250698A2 (en) 2002-10-23
EP1250698A4 EP1250698A4 (en) 2002-10-23


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8257439B2 (en) 2004-12-22 2012-09-04 Ldr Medical Intervertebral disc prosthesis
WO2019073490A1 (en) * 2017-10-12 2019-04-18 Glenn Fernandes 3d mouse and ultrafast keyboard

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7749089B1 (en) 1999-02-26 2010-07-06 Creative Kingdoms, Llc Multi-media interactive play system
GB2358108A (en) 1999-11-29 2001-07-11 Nokia Mobile Phones Ltd Controlling a hand-held communication device
US9446319B2 (en) 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
US6761637B2 (en) 2000-02-22 2004-07-13 Creative Kingdoms, Llc Method of game play using RFID tracking device
US7445550B2 (en) 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US7066781B2 (en) 2000-10-20 2006-06-27 Denise Chapman Weston Children's toy with wireless tag/transponder
DE10203209B4 (en) * 2002-01-28 2004-08-12 Siemens Ag Device with an input element for controlling a display unit
US6990639B2 (en) 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
US6967566B2 (en) 2002-04-05 2005-11-22 Creative Kingdoms, Llc Live-action interactive adventure game
US20070066396A1 (en) 2002-04-05 2007-03-22 Denise Chapman Weston Retail methods for providing an interactive product to a consumer
KR100932483B1 * 2002-11-20 2009-12-17 엘지전자 주식회사 Mobile communication terminal with remote avatar control, and method of using the same
FR2858073B1 (en) * 2003-07-24 2007-08-10 Adentis Method and system for gestural control of an apparatus
FI117308B 2004-02-06 2006-08-31 Nokia Corp Gesture control
JP4805633B2 (en) 2005-08-22 2011-11-02 任天堂株式会社 Game operation device
US8870655B2 (en) 2005-08-24 2014-10-28 Nintendo Co., Ltd. Wireless game controllers
JP4262726B2 (en) 2005-08-24 2009-05-13 任天堂株式会社 Game controller and game system
US7927216B2 (en) 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
WO2007030947A1 (en) * 2005-09-16 2007-03-22 Anthony Szturm Mapping motion sensors to standard input devices
EP1821180A1 (en) * 2005-12-31 2007-08-22 Ball-IT Oy User operable pointing device such as mouse
US9390229B1 (en) 2006-04-26 2016-07-12 Dp Technologies, Inc. Method and apparatus for a health phone
JP5204381B2 (en) 2006-05-01 2013-06-05 任天堂株式会社 Game program, game device, game system, and game processing method
US8902154B1 (en) 2006-07-11 2014-12-02 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface
GB0700163D0 (en) * 2007-01-05 2007-02-14 Kord Ali A Display Booth
EP2132650A4 (en) * 2007-03-01 2010-10-27 Sony Comp Entertainment Us System and method for communicating with a virtual world
US8555282B1 (en) 2007-07-27 2013-10-08 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US7800044B1 2007-11-09 2010-09-21 Dp Technologies, Inc. High ambient motion environment detection to eliminate accidental activation of a device
US8996332B2 (en) 2008-06-24 2015-03-31 Dp Technologies, Inc. Program setting adjustments based on activity identification
US9358425B2 (en) * 2008-08-12 2016-06-07 Koninklijke Philips N.V. Motion detection system
US8300020B2 (en) * 2008-08-15 2012-10-30 Apple Inc. Hybrid inertial and touch sensing input device
US8872646B2 (en) 2008-10-08 2014-10-28 Dp Technologies, Inc. Method and system for waking up a device due to motion
WO2012079948A1 (en) * 2010-12-16 2012-06-21 International Business Machines Corporation A human interface device with two three-axis-accelerometers
FR2972819A1 * 2011-03-15 2012-09-21 France Telecom Device for capturing data representative of the reaction of e.g. a user to a state of stress and/or tiredness in a candidate object, having a data processing device to which data, including a given deformation of the material, are transmitted
EP2981876A4 (en) * 2013-04-02 2016-12-07 Nokia Technologies Oy An apparatus
CN104142823B 2014-06-30 2016-04-27 腾讯科技(深圳)有限公司 Input device and control system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997027575A1 (en) * 1996-01-24 1997-07-31 Personal Integrated Network Node Communications Corporation Smart orientation sensing circuit for remote control
WO1997039401A1 (en) * 1996-04-12 1997-10-23 Paul Milgram Finger manipulatable 6 degree-of-freedom input device
US5813406A (en) * 1988-10-14 1998-09-29 The Board Of Trustees Of The Leland Stanford Junior University Strain-sensing goniometers, systems and recognition algorithms

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5757360A (en) * 1995-05-03 1998-05-26 Mitsubishi Electric Information Technology Center America, Inc. Hand held computer control device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO0063874A1 *


Also Published As

Publication number Publication date
EP1250698A4 (en) 2002-10-23
WO2000063874A1 (en) 2000-10-26
AU4473000A (en) 2000-11-02

Similar Documents

Publication Publication Date Title
Kela et al. Accelerometer-based gesture control for a design environment
Wanderley et al. Evaluation of input devices for musical expression: Borrowing tools from HCI
EP0864144B1 (en) Method and apparatus for providing force feedback for a graphical user interface
US7934995B2 (en) Game system and information processing system
US6061004A (en) Providing force feedback using an interface device including an indexing function
US7969418B2 (en) 3-D computer input device and method
US6259382B1 (en) Isotonic-isometric force feedback interface
Mackinlay et al. A semantic analysis of the design space of input devices
US7627139B2 (en) Computer image and audio processing of intensity and input devices for interfacing with a computer program
US7253803B2 (en) Force feedback interface device with sensor
US7683883B2 (en) 3D mouse and game controller based on spherical coordinates system and system for use
US8502825B2 (en) Avatar email and methods for communicating between real and virtual worlds
JP4065618B2 (en) Computer user interface and computer
US7106313B2 (en) Force feedback interface device with force functionality button
US8655622B2 (en) Method and apparatus for interpreting orientation invariant motion
US5805137A (en) Touch sensitive input control device
US20170228023A1 (en) Vehicle computing system to provide a vehicle safety warning
JP5325771B2 (en) System, method, apparatus, and program for detecting timing of swing impact and / or strength of swing based on accelerometer data
EP1967942A1 (en) System and method for interfacing and computer program
CN100590575C (en) Multi-modal navigation in a graphical user interface computing system
US5717610A (en) Coordinate input device
EP1804154A2 (en) Computer input device enabling three degrees of freedom and related input and feedback methods
US8019121B2 (en) Method and system for processing intensity from input devices for interfacing with a computer program
US8125448B2 (en) Wearable computer pointing device
KR101666096B1 (en) System and method for enhanced gesture-based interaction

Legal Events

Date Code Title Description
A4 Despatch of supplementary search report

Effective date: 20020726

AK Designated contracting states:

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

Kind code of ref document: A4

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

17P Request for examination filed

Effective date: 20011116

18D Deemed to be withdrawn

Effective date: 20021011