WO2000063874A1 - Human gestural input device with motion and pressure

Human gestural input device with motion and pressure

Info

Publication number
WO2000063874A1
Authority
WO
WIPO (PCT)
Prior art keywords
pointing device
motion
computer
sensor
pressure
Prior art date
Application number
PCT/US2000/010579
Other languages
French (fr)
Other versions
WO2000063874A9 (en)
Inventor
John Warren Stringer
Original Assignee
John Warren Stringer
Priority date
Filing date
Publication date
Application filed by John Warren Stringer filed Critical John Warren Stringer
Priority to AU44730/00A (AU4473000A)
Priority to EP00926157A (EP1250698A2)
Publication of WO2000063874A1
Publication of WO2000063874A9

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10N ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10N30/00 Piezoelectric or electrostrictive devices
    • H10N30/30 Piezoelectric or electrostrictive devices with mechanical input and electrical output, e.g. functioning as generators or sensors
    • H10N30/302 Sensors
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0382 Plural input, i.e. interface arrangements in which a plurality of input devices of the same type are in communication with a PC

Definitions

  • the present invention relates to the field of input devices for computers.
  • Computer input pointing devices are well known.
  • a typical computer input pointing device contains at least two sensors for sensing motion in at least two directions, such as x and y (forward-backward and left-right).
  • the pointing device also has at least one actuator for causing the pointing device to transmit a command signal (typically referred to as a "click") to a computer.
  • the most common pointing device for a desktop computer or workstation is a mouse.
  • the mouse may have a ball on its underside. Motion of the ball causes rotation of one or more of a plurality of rotation sensors adjacent to the ball. The number of rotations determines the magnitude of the motion in the x or y direction.
  • the most common mouse type includes two buttons for sending commands known as "left-click" and "right-click."
  • Other types of mice include optical sensors that count markings on a specially marked mouse pad to determine an amount of movement.
  • Another common pointing device, used primarily for computer games, is the joystick. The joystick has a lever that is tilted in a forward-backward or left-right direction, each of which is sensed independently.
  • a button is typically provided on the end of the lever for transmitting a command signal.
  • Some joysticks also allow rotation of the lever, which is also sensed.
  • A variety of alternative pointing devices have been developed. Alternative pointing devices are typically used on laptop computers.
  • An early pointing device used on laptop computers is the track ball.
  • the track ball functions like an upside-down mouse. Instead of moving the device, the ball is rotated directly.
  • Many laptop computers include a miniature joystick that is positioned on the home row of the keyboard.
  • Another common pointing device in laptop computers is a touch pad.
  • the touch pad is a rectangular device that is sensitive to touch. Left and right sliding motion of a finger on the touch pad is detected. The touch pad also senses when it is struck, and produces a "click" signal.
  • Although conventional pointing devices are suitable for locating a point on an x-y grid and transmitting a simple single-valued command signal, they leave much to be desired for controlling application programs that require more complex inputs.
  • One aspect of the present invention is a hand-held computer input pointing device, comprising at least one motion detector.
  • the at least one motion detector is capable of detecting motion in at least three dimensions.
  • At least one pressure sensor is capable of sensing pressure quantitatively.
  • the input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.
  • Another aspect of the invention is a method for operating a computer having an output device, comprising the steps of: receiving a plurality of signals from a computer input pointing device, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to the output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
  • Still another aspect of the invention is a method for operating a computer input pointing device, comprising the steps of: moving the pointing device; transmitting at least three motion signals from the pointing device to a computer, the motion signals including signals representing translations in one or more dimensions, or rotations in one or more dimensions, or a combination of translations and rotations; squeezing at least one portion of the pointing device; and transmitting quantifiable pressure signals from the pointing device to the computer.
  • FIG. 1 is an isometric view of an exemplary squeezable pointing device according to the present invention.
  • FIG. 2 is an exploded view of the pointing device of FIG. 1.
  • FIG. 3A is an enlarged view of the origami sensor of FIG. 2.
  • FIG. 3B is a plan view of the sensor of FIG. 3A, prior to folding.
  • FIG. 4A is a plan view of a tetrahedral origami sensor in a flattened state.
  • FIG. 4B is an isometric view of the sensor of FIG. 4A, folded.
  • FIG. 5A is a plan view of an alternative cubic origami sensor in a flattened state.
  • FIG. 5B is an isometric view of the sensor of FIG. 5A, folded.
  • FIG. 6 is an isometric view of a tetrahedral edge sensor.
  • FIG. 7 is an elevation view of a computer system including the pointing device of FIG. 1.
  • FIG. 8 is a diagram of a range of motion tree for an application program executed on the computer shown in FIG. 7.
  • FIG. 9 shows the software layers for the application executed on the computer shown in FIG. 7.
  • FIG. 10 is a flow chart of a method for operating the virtual puppet of FIG. 7.
  • FIGS. 11-13 are block diagrams of the software layer structure shown in FIG. 9.
  • FIG. 14 is a flow chart of a method for operating the pointing device of FIG. 2.
  • FIG. 1 shows an exemplary computer input pointing device 100 according to the present invention.
  • Pointing device 100 is capable of detecting motion and providing a quantifiable measure of squeeze pressure.
  • the input device 100 is operable in a hand of a user, without contacting a base, work surface, or pad. Thus, the input device 100 is "free floating," in that it may be operated in any position without regard to the presence of a base, work surface, or pad.
  • Pointing device 100 has a squeezable, deformable elastomeric cover 102. Inside the cover 102, the pointing device 100 has an internal motion detector that may measure acceleration along the x, y and z axes, or measure pitch, yaw and roll. Squeezing the package increases pressure, which translates into a pressure signal that is sent by the pointing device 100 to the computer (shown in FIG. 7) along with the motion data.
  • FIG. 2 is an exploded view of the computer input pointing device 100, with the squeezable elastomeric covering 102 partially removed.
  • the squeezable covering 102 may be formed from natural or synthetic rubber or a variety of polymeric materials, which may be cut or injection molded.
  • Cover 102 has an internal cavity 104 for containing sensors.
  • An optional sensor enclosure 106 may be provided inside of cover 102.
  • the sensor enclosure 106 may be sealed to protect the sensors from dust and moisture, allowing use of an unsealed cover 102.
  • the deformable cover 102, with or without the sensor enclosure 106, allows the sensor to move freely, and compresses the internal sensor when squeezed.
  • in one embodiment, the squeezable cover is made from a "SQEESH™" ball by the Toysmith Company of Kent, WA USA.
  • "SQEESH™" balls are available in a variety of shapes and surface types, including smooth spherical, spherical with craters, and star shaped. A variety of shapes may be used to provide users with the grip of their choice, or hints as to what object the device controls.
  • Pointing device 100 has at least one motion detector 108, which is capable of detecting motion in at least three dimensions.
  • Pointing device 100 also includes at least one pressure sensor capable of sensing pressure quantitatively.
  • the pressure sensing capability is also provided by the sensor 108, which is described in greater detail below.
  • the at least one motion detector may, for example, include an accelerometer or a piezoelectric device.
  • as best seen in FIG. 3A, the exemplary piezoelectric device 108 includes at least three piezoelectric strips 301-303, or flaps, oriented in three orthogonal directions. More specifically, the exemplary device 108 is a single film sensor having three strips 301-303 folded origami style, as shown in FIG. 3A, to form a piezo thin film with six sensors.
  • FIG. 3B is a view of the device 108 with six sensors 314, 316.
  • the device is formed on a substrate 300 and is shown laid flat before the origami folds along dashed lines 312.
  • Optional weights W1-W3 may be attached to the end or center of each flap 301-303, respectively.
  • preferably, each of the three piezoelectric strips 301-303 has at least two sections, including a weighted section 314 and an unweighted section 316, so as to discriminate motion load from pressure load.
  • the piezo film substrate 300 is plastic and allows two strips of piezo material to measure the bend of the substrate by the relative tension between the two strips.
  • One piezo sensor 316 is provided for pressure discrimination. The pressure on the deformable covering 102 bends the material of the piezo pressure sensor 316, producing a measurable voltage.
  • Another piezo sensor 314, for motion discrimination, has a weight W1-W3 placed at the middle or end of the sensor. If each of the three piezoelectric strips 314 has a respective weight, moving the pointing device 100 induces an inertial force. The weight W1-W3 moves against the material of the motion sensor 314, causing it to bend at a different rate than pressure sensor 316.
  • Cuts 318 on the exemplary piezo film substrate 300 allow each sensor flap 314, 316 to bend separately.
  • the dotted lines 312 show folds where the flat sensor strip 300 is origami folded in 90 degree bends.
  • the folded origami flaps 301-303 are affixed to a support assembly 320 (shown in FIG. 3A).
  • the support assembly 320 holds the origami flaps 301-303 in place.
  • the leads L from the thin film sensor are connected to optional internal multiplexing circuitry 110 that connects the sensor circuitry to an external lead 112.
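  • As a rough illustration (not taken from the patent), the paired weighted and unweighted flap voltages described above might be separated into motion and pressure components as follows; the differencing arithmetic and the averaging are assumptions:

```python
def decompose_flap_pair(v_weighted, v_unweighted):
    """Split one flap pair's voltages into motion and pressure estimates.

    Assumption: a squeeze bends the weighted and unweighted sections
    roughly equally, while inertial force acts mainly on the weighted
    section, so the difference isolates motion and the unweighted
    reading approximates pressure.
    """
    motion = v_weighted - v_unweighted   # inertial contribution
    pressure = v_unweighted              # squeeze contribution
    return motion, pressure


def read_device(flap_pairs):
    """flap_pairs: [(vw, vu)] voltages for the three orthogonal flaps
    301-303; returns a 3-axis motion estimate and one scalar pressure."""
    motions, pressures = [], []
    for vw, vu in flap_pairs:
        m, p = decompose_flap_pair(vw, vu)
        motions.append(m)
        pressures.append(p)
    # Averaging the three pressure readings into one value is an assumption.
    return motions, sum(pressures) / len(pressures)
```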
  • the sensors output data representing four parameters (three motion and one pressure).
  • a multiplexer 110 receives output signals from the motion detector and the pressure sensor 108, and outputs a multiplexed signal for transmission to a computer (shown in FIG. 7).
  • the output of the multiplexer is connected to a lead 112.
  • the lead 112 extends through a channel 103 in the elastomeric covering 102.
  • the opposite end of the lead 112 has a suitable connector 114 attached to it, which may be a 0.32-centimeter (1/8 inch) miniplug, a Universal Serial Bus (USB) connector, a serial (RS-232) port, an "ETHERNETTM” port, a MIDI port for connection to a musical instrument, a PDA port, a cellular phone port or other suitable connector.
  • in a system having a plurality of similar pointing devices 100, a hub capable of accepting inputs from multiple input devices may be provided to interface the plurality of pointing devices 100 to a single computer.
  • for example, if there are multiple pointing devices having USB connectors, a USB hub 116 (connected to lead 118 and USB connector 120) may be used to connect the pointing devices to a single computer.
  • the pointing device may omit the multiplexer, and output four analog signals from four respective output terminals.
  • the pointing device 100 may have a wireless transmitter for communicating with the computer.
  • the lead 112 from the sensing package connects either one data line (carrying time or frequency multiplexed data) or multiple data lines, each carrying a respective sensor's output.
  • the connector (such as a 1/8" miniplug) 114 optionally connects directly to the computer (FIG. 7) through a microphone input, eliminating the need for a hub.
  • the pointing device 100 may use the audio software driver or communicate through the optional hub 116 for controlling multiple puppets (as described below).
  • the connector 114 may connect to a different computer port such as a USB or serial port, eliminating the need for a hub.
  • although FIG. 2 shows a two-input version, the system may include any number of inputs.
  • any of the sensor types described herein may be used with clock shifting to the next lead.
  • An additional clock cycle may be provided for subsequent processing of an optional unique identification number (described below).
  • the sensors provide an analog signal that may be modulated with a single carrier frequency, or the signal from each lead may be modulated with a respectively different carrier frequency.
  • An optional analog-to-digital converter (ADC) may be provided for each of the analog sensor leads.
  • a clock shift to the next lead sends a digital packet. For example, with eight bits per lead, six sensors yield six bytes per sample cycle.
  • the ID circuit may be a register or a read only memory (ROM) device, EEPROM, or the like.
  • the unique identifier may be hard encoded at the factory (e.g., using a ROM), or the identifier may be downloaded from a computer into a flash EEPROM in the pointing device 100. As explained below with reference to FIG. 7, by providing a unique identifier for each pointing device unit 100, more than one unit may be used at the same time.
  • the device identifier can function like an encryption key, to protect the user's preferences or to securely identify the user.
  • the output of the ID circuit and the output from one of the sensors are output to the multiplexing circuitry 110.
  • the analog input may be time-domain multiplexed with the ID waveform.
  • a digital input packet may be combined with ID information to form a larger packet including both.
  • the pointing device 100 may contain additional identifier circuitry to add a port identifier number separate from the pointing device identifier number.
  • the analog waveform may have a separate carrier frequency or may be time domain multiplexed or interleaved with each sample, either on a bit, byte, or packet basis.
  • the sensor data and ID information may be added to a multiplexing packet to form a larger packet including both types of data.
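  • For concreteness, one way such a sample packet might be framed is sketched below; the 4-byte identifier width and field order are assumptions, while the six one-byte lead values follow the eight-bits-per-lead example above:

```python
import struct

def pack_sample(device_id: int, leads: list[int]) -> bytes:
    """Frame one sample cycle: a 4-byte device ID followed by six
    one-byte lead values (eight bits per lead, six sensors)."""
    assert len(leads) == 6 and all(0 <= v <= 255 for v in leads)
    return struct.pack(">I6B", device_id, *leads)

def unpack_sample(packet: bytes) -> tuple[int, list[int]]:
    device_id, *leads = struct.unpack(">I6B", packet)
    return device_id, list(leads)

# Example: six 8-bit readings from one sample cycle.
pkt = pack_sample(0x1234ABCD, [12, 200, 33, 90, 7, 255])
assert unpack_sample(pkt) == (0x1234ABCD, [12, 200, 33, 90, 7, 255])
```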
  • the pointing device 100 may obtain power from the computer (FIG. 7), from an internal battery, or by drawing power directly from the energy created by the sensor from the piezoelectric effect.
  • the piezoelectric device may be an origami sensor having cut coils, and may have either a cubic or tetrahedral shape.
  • the cut coils on each face of the motion detector are attached to a single weight at a center of the motion detector.
  • each cut coil on each face of the motion detector has its own weight attached.
  • FIGS. 5A and 5B show a sensor system 500 including origami cutouts, which may be used in place of the sensor 108 shown in FIG. 2.
  • a thin sensor film 500 having six faces 501-506 is folded to form a cube (although a tetrahedron, shown in FIGS. 4A and 4B, or other three-dimensional shapes may be used in alternative embodiments).
  • a spiral pattern 511-516 is cut into each face of thin sensor film 501-506, respectively.
  • Each coil can be pressed towards the center (e.g., C1, C3 and C6) and attached to a weight that is suspended at the center of the folded shape.
  • each face 501-506 has a corresponding opposite face measuring force with an inverse effect on voltage.
  • each side faces three other sides, measuring an inverse effect on their faces.
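  • A sketch of how the cube's faces might be read, assuming opposite faces report equal and opposite voltages for a given acceleration; the face pairing and scaling below are invented for illustration:

```python
# Face indices 0-5 stand for faces 501-506 of FIG. 5A; pairing them as
# (x, y, z) opposite-face pairs is an assumption for illustration.
FACE_PAIRS = {"x": (0, 5), "y": (1, 4), "z": (2, 3)}

def cube_axes(face_voltages):
    """face_voltages: six readings, one per cut-coil face.

    Differencing opposite faces cancels the common squeeze component
    and keeps the inertial one; averaging all six estimates pressure.
    """
    axes = {axis: (face_voltages[i] - face_voltages[j]) / 2.0
            for axis, (i, j) in FACE_PAIRS.items()}
    pressure = sum(face_voltages) / 6.0
    return axes, pressure
```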
  • Additional circuitry may be included in the exemplary sensor configuration.
  • An analog sample-and-hold circuit is provided for each of the analog sensor leads.
  • FIG. 6 shows a tetrahedral edge sensor 600.
  • Sensor 600 has six edges to detect pressure directly from a squeeze of the deformable elastomeric material of cover 102.
  • sensor 600 may have weighted edges 602 to detect motion through inertial force whenever the pointing device 100 is moved.
  • the weights 610 increase the momentum of the sensors 602, so that the deformation of the sensors is exaggerated.
  • weights may be placed at each node 608.
  • Sensor 600 is used in conjunction with an application driver that can discriminate pressure from motion.
  • the driver can discriminate pitch, yaw, and roll.
  • a plurality of signal leads 604 are provided, one for each edge of the tetrahedron 600.
  • the outputs 604 of the sensor either connect to multiplexing/conversion circuitry 110 or lead out to separate analog lines.
  • each connecting node 608 broadcasts a wireless signal of its state without requiring any lead wire.
  • non-sensing leads connect each node 608 to another edge 602.
  • the non-sensing leads 606 either follow a sensing edge 602 to another node 608 or may run directly to another node 608.
  • the connecting nodes 608 for leads 606 and sensing edges 602 may simply be vertices for an origami tetrahedron or connecting nodes for self supporting edge sensors.
  • a sensor similar to that used in the "GYROPOINT™" mouse may be used.
  • a sensor such as the "ACH-04-08-05" accelerometer/shock sensor from Measurement Specialties, Inc. of Valley Forge, PA may be used.
  • the ACH-04-08-05 sensor has three piezoelectric sensing elements oriented to measure acceleration in x, y and z linear axes, and uses very low power.
  • an "ACH-04-08-01" sensor by Measurement Specialties, Inc. may be used to measure linear acceleration in the x and y directions and rotation about the z axis.
  • pointing device 100 also includes a pressure sensing capability.
  • An internal pressure sensor is included, which may be, for example, a thin film piezo sensor that detects a bending of a strip, or a sensor such as the "MTC Express™" from Tactex Controls Inc. of Cerritos, CA.
  • This control surface is pressure sensitive, and also senses multiple points of contact, using a fiber optic-based, pressure-sensitive smart fabric called "KINOTEX™."
  • This fabric is constructed of cellular urethane or silicone sandwiched between protective membranes. The fabric has an embedded fiber optic array that generates an optical signal when the material is touched, or when force is applied. The coordinates and area of the pressure, and its magnitude can be determined from the received signals.
  • the material can generate a pressure profile of the shape of a hand touching it.
  • the pointing device 100 may include a circuit that outputs a unique identifier for each pointing device.
  • this feature allows the pointing device 100 to serve a user identification and/or authentication function.
  • a user could carry around his or her pointing device with a unique identifier output signal, and connect it to any desired computer having an Internet connection.
  • the application program could request a remote server to download previously established software preferences or upload the user's current software preferences. This enables a person to log on at any computer connected to the Internet, access his or her software (e.g., virtual puppet and stage application), or download web bookmarks.
  • the user could use the pointing device 100 as a key to access copy protected software, functioning like a parallel port dongle.
  • the use of a pointing device 100 with a built-in unique identifier enhances the user's mobility and access to services.
  • the pointing device may be used for authentication at any computer to which it is connected.
  • because the unit uniquely identifies itself, it can be used in a way that allows it to bind to each individual user.
  • the unique identifier may be used in combination with a password, IP address or a user signature.
  • the user may sign his or her name using the pointing device to control movement of a virtual writing implement.
  • the user's handwriting may be recognized using a conventional handwriting recognition algorithm, such as any of those described in U.S. Patents 5768423, 5757959, 5710916, 5649023 or 5553284, all of which are expressly incorporated by reference herein in their entireties.
  • the combination of the unique device identifier and the user signature provides reliable authentication.
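  • A minimal sketch of the two-factor idea, treating the device identifier as one factor and the recognized signature as the other; the hashing scheme is an assumption, not taken from the patent:

```python
import hashlib

def auth_token(device_id: str, recognized_signature: str) -> str:
    """Combine the device's hard-coded identifier with the text that a
    handwriting recognizer produced from the user's signed gesture;
    neither factor suffices on its own."""
    material = f"{device_id}:{recognized_signature}".encode()
    return hashlib.sha256(material).hexdigest()

def authenticate(device_id: str, recognized_signature: str,
                 stored_token: str) -> bool:
    return auth_token(device_id, recognized_signature) == stored_token
```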
  • the unique identifier may be used to make the user's preferences more portable.
  • the identifier may be associated with the design (i.e., shape, color, and the like) of the cover 102.
  • the identifier for the pointing device may also be associated with different shapes and colors of cursors.
  • the cursor that appears on the screen may have the same shape or color as the cover 102 of the input device 100, which is associated with a known identifier.
  • a blue spherical pointing device may be associated with a round, blue cursor
  • a puppet shaped pointing device may be associated with a puppet shaped cursor
  • an alien-shaped pointing device may be associated with an alien-shaped cursor, and so on.
  • the input device packaging may be completely different from the shape and color of the computer cursor, or puppet object.
  • if the puppet application program is stored in a server that is accessible via the Internet, or another local area network (LAN), wide area network (WAN), or wireless service such as CDMA, the user may select any of a plurality of different puppets from a gallery. This is one of the user preferences that may follow the user around from client computer to client computer.
  • FIG. 7 shows a computer system 700 according to another aspect of the invention.
  • the computer system 700 includes a computer 701 having a processor 702, a memory 706, a display 708, and an input port 712.
  • At least one computer input pointing device 100 is coupled to the input port 712 of the computer 701.
  • the pointing device 100 has at least one motion detector 108, capable of detecting motion in at least three dimensions, and at least one pressure sensor capable of sensing pressure quantitatively.
  • a multiplexer 110 receives output signals from the motion detector 108 and the pressure sensor, and outputs a signal for transmission to the computer 701. Alternatively, the lead from each sensor may output an analog signal.
  • the computer 701 further comprises a storage device 704, which may be a hard disk drive, a high capacity removable disk drive (e.g., "ZIP” drive by Iomega Corporation), a read only memory (ROM), CD-ROM drive, a digital versatile disk (DVD) ROM drive, or the like.
  • Either the memory 706 or the storage device 704 has stored therein computer program code for causing the computer to show an object 710 on the display 708.
  • the object has a plurality of portions 710a-710d that are selectively movable in response to motion or squeezing of the pointing device 100.
  • the exemplary object is an animated character 710 controlled by input parameters from the pointing device. Such an animated character is referred to below as a virtual puppet.
  • Another aspect of the invention is a method for controlling a computer. The method includes the steps of: receiving a plurality of signals from a computer input pointing device 100, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to an output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
  • FIG. 10 shows an exemplary method for operating the computer 701. The method includes the steps of:
  • the animated character 710, figure or other object is displayed on the display 708.
  • the driver module 904 updates the idealized pressure value Pi and translates Pi into the normalized pressure value P.
  • the application program processes the normalized pressure P into an animated figure action.
  • the application updates the animated figure according to the action, and displays the result on the display 708.
  • if the user moves the pointing device 100, steps 1070-1080 are executed.
  • the driver module 904 of the software updates the idealized motion vector Mxi, Myi, Mzi and translates the vector into a normalized motion vector Mx, My, Mz.
  • the application program processes the motion vector Mx, My, Mz into animated figure motion, and outputs the moving figure to the display 708.
  • An exemplary method for operating the pointing device 100 comprises the steps of:
  • the motion signals including signals representing translations in one or more dimensions, or rotations in one or more dimensions, or a combination of translations and rotations;
  • all parameters measured by the sensors 108 in the pointing device 100 are mapped to motions of the virtual puppet 710, such as mouth moving open and closed, the motions of limbs, or the expression of emotional intensity.
  • the pressure measurement is particularly suitable for use in providing quantitative inputs to the computer to create results having continuously variable intensity.
  • a light squeeze may result in a small kick, whereas a hard squeeze results in a high kick.
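  • As an illustration, normalized pressure might scale an action parameter continuously rather than trigger a binary event; the linear mapping and constants below are illustrative, not the patent's:

```python
def kick_height(pressure: float, max_height: float = 1.0) -> float:
    """Map normalized pressure P in [0, 1] to a kick height: a light
    squeeze yields a small kick, a hard squeeze a high one."""
    p = min(max(pressure, 0.0), 1.0)  # clamp to the normalized range
    return p * max_height
```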
  • FIG. 7 includes a hub 116 and at least a second computer input pointing device 100 capable of sensing motion in at least three dimensions and having a pressure sensor. Both the first and second computer input pointing devices 100 are connected to the hub 116, which in turn is connected to the input port of the computer 701. Each of the first and second computer input pointing devices 100 controls a respective virtual puppet 710 and 711.
  • the computer program code includes a range of motion tree 800 for the virtual puppet 710.
  • the range of motion tree 800 restricts a range of possible motion of the virtual puppet 710 about a current position of the virtual puppet.
  • the computer program code includes a respective unique range of motion tree for each of a plurality of users.
  • the exemplary range of motion tree 800 covers head, left and right shoulder, left and right elbow, left and right wrist, and left and right grip.
  • the tree limits the range of motion from each node.
  • the nodes may be delimited as follows:
  • the limits may be expressed in rectangular coordinates, such as (x0, y0, z0), (x1, y1, z1). Alternatively, the limits may be expressed as delta ranges. For some objects, it may be more convenient to express the limits in polar coordinates, which have a 1:1 mapping to rectangular coordinates.
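  • A node of such a range of motion tree could carry its limits and clamp requested positions, e.g. as follows; the structure and the limit values are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class MotionNode:
    name: str
    lo: tuple[float, float, float]   # (x0, y0, z0) lower limits
    hi: tuple[float, float, float]   # (x1, y1, z1) upper limits
    children: list["MotionNode"] = field(default_factory=list)

    def clamp(self, x, y, z):
        """Restrict a requested position to this node's range of motion."""
        return tuple(min(max(v, l), h)
                     for v, l, h in zip((x, y, z), self.lo, self.hi))

# Hypothetical fragment of the puppet tree: shoulder holds elbow holds wrist.
wrist = MotionNode("left_wrist", (-0.1, -0.1, -0.1), (0.1, 0.1, 0.1))
elbow = MotionNode("left_elbow", (-0.5, 0.0, -0.2), (0.5, 0.8, 0.2), [wrist])
shoulder = MotionNode("left_shoulder", (-1.0, -1.0, -1.0), (1.0, 1.0, 1.0), [elbow])
```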
  • the range of motion tree 800 can express a hierarchy of motions.
  • State information restricts the range of possible motion about the last position of the puppet.
  • the state information is mapped to a user model, and further restricts possible ranges of motion. For example,
  • a user model adds probabilities for motion at each node of the tree. For example, the user model may identify whether the user moves her shoulders or shakes her wrists. In an exemplary user model for "User n”, the probabilities may be expressed as:
  • the probabilities of motion may be mapped to a voxel space.
  • the probabilities could be precompiled with user probabilities, where three-dimensional motion space is broken up into small cubes, analogous to three-dimensional pixels.
  • Each voxel point can be ascribed a probability, and each voxel can contain a range of motion probability.
  • Voxels may also include application restrictions on the range of motion probability.
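  • Voxelized motion probabilities might be stored and queried along these lines; the grid size, bounds and indexing are assumptions:

```python
import numpy as np

class VoxelSpace:
    """Probability of motion per small cube of 3-D motion space."""

    def __init__(self, size: int = 32, bounds: float = 1.0):
        self.size, self.bounds = size, bounds
        self.prob = np.full((size, size, size), 1.0)  # uniform prior

    def _index(self, x, y, z):
        scale = (self.size - 1) / (2.0 * self.bounds)
        return tuple(int((v + self.bounds) * scale) for v in (x, y, z))

    def probability(self, x, y, z) -> float:
        return float(self.prob[self._index(x, y, z)])

    def restrict(self, mask):
        """Zero out voxels that the range of motion tree or the
        application space forbids (mask is 0/1 per voxel)."""
        self.prob *= mask
```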
  • Another task for the software is identifying the orientation of the pointing device, to allow use of the orientation as a parameter.
  • conformation of curves is used as a process of matching the signal curve over time from the three-dimensional sensor space to three-dimensional application space. This is analogous to a three-dimensional jigsaw puzzle or Chinese wood block puzzle, in which a piece may be rotated in any of three dimensions.
  • in voxel space, it is possible to constrain the orientation of the sensor motion curve to only the possible ranges from the "range of motion tree" 800, and to allow only the possible ranges from the application space. It is further possible to find a most likely orientation of the sensor motion curve based on probabilities of each motion, represented as likelihood coefficients for each voxel point.
  • Orientation can be determined by software.
  • the exemplary software uses conformation mapping of curves of X, Y, Z (and optionally P) to the range of motions of either the left or right hand. Mapping could emanate from an explicit model or from trainable classifiers, such as neural nets.
  • An application may request the user to orient the pointing device 100 by holding the device in his or her left or right hand and selecting items on the display 708.
  • the user either controls an object on the display 708, or navigates in space, where the user acts as the object, with a point of view that shifts.
  • the range of object motion may be less than the range of motion of the pointing device 100.
  • the software maps the actual gesture of the input device to the best fitting object motion.
  • a simple example is controlling a cartoon character that merely has a range of motion of stage-left and stage-right.
  • the range of motion of the puppet is constrained to two dimensions.
  • Controlled objects on the display 708 generally have a limited destination range.
  • the application can show a picture of the pointing device 100 with the lead 112 pointing down to the floor.
  • One hemisphere of the squeezable, deformable cover may have special markings that face away from the display 708. This technique eliminates the need for conformation of curves, since the orientation is 1:1, puppet to application.
  • FIG. 9 is a block diagram showing the virtual puppet modules and their connections to each other and the underlying software layer.
  • the user module 900 contains the states of:
  • a sensor module 902 has processes A, B, C, D corresponding to each sensor output, whether analog or digital.
  • a driver module 904 translates sensor values to application values.
  • the driver module 904 may use a trainable classifier, such as a neural network, where each sensor output has correlating inputs to the driver's idealized X, Y, Z, P.
  • the driver module 904 correlates user states to range of sensor input. The states, “grasp,” “orient,” “squeeze,” “move,” and “release” each have a characteristic effect on each of the sensors.
  • “grasp” may have a pressure curve, as the user grabs and picks up the puppet.
  • “squeeze” may have a longer continuous pressure curve when compared to grasp.
  • Each instance of state may have separate set of coefficients for a trainable classifier.
  • the software correlates the application state to the domain of sensor output, to reduce the set of possible outcomes. For instance, a two-dimensional stage for cartoon characters may be limited to "stage left” and “stage right”.
  • the program updates the user module state from the application state and application position.
  • the application module 906 accepts X, Y, Z, and P data from the driver module 904, and reports the application X, Y, Z, P state to the driver module as state information.
  • the application module 906 reports user state information to pass from the driver module 904 to the user module 900.
  • a mouse module 908 is a special program for emulating a mouse.
  • the mouse module 908 uses a special application module for the report state.
  • Mouse module 908 translates X, Y movement of the pointing device 100 to mouse X, Y motion, and translates Z and P parameters of the pointing device 100 to mouse Left, Right, and double click.
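  • A sketch of such a mouse emulation mapping; the thresholds and the double-click convention are assumptions, not the patent's:

```python
def emulate_mouse(mx, my, mz, p, z_click=0.5, p_click=0.5):
    """Translate device X, Y motion to mouse X, Y motion, and Z motion
    plus pressure to left, right, and double clicks."""
    event = {"dx": mx, "dy": my,
             "left": False, "right": False, "double": False}
    if mz > z_click:
        event["left"] = True         # a jab in z acts as a left click
    if p > p_click:
        event["right"] = True        # a squeeze acts as a right click
    if event["left"] and event["right"]:
        event["double"] = True       # both together act as a double click
    return event
```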
  • FIGS. 11-13 show the transformations that are performed by various exemplary software modules, depending on the specific types of sensors 902 included in the pointing device 100.
  • FIG. 11 covers the example described above, in which the motion detector measures x, y, and z accelerations, and pressure is also measured.
  • the sensors 902 include motion sensors 1100 and pressure sensor 1110.
  • a program Sm 1120 translates the raw output of sensor 1100 to an idealized motion vector, Mxi, Myi, Mzi, 1140 that is provided to the driver 904.
  • a second translator Tm 1160 translates the idealized motion to an application motion vector Mx, My, Mz 1180, which is provided to the application 906.
  • a program Sp 1130 translates the raw output of sensor 1110 to an idealized pressure Pi, 1150 that is provided to the driver 904.
  • a second translator Tp 1170 translates the idealized pressure Pi to an application pressure P 1190, which is provided to the application 906.
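  • The two-stage translation of FIG. 11 might be expressed as composable translators, sketched below; the identity placeholder functions stand in for real calibration logic:

```python
from typing import Callable

Vec3 = tuple[float, float, float]

def make_pipeline(Sm: Callable[[list[float]], Vec3],
                  Tm: Callable[[Vec3], Vec3],
                  Sp: Callable[[float], float],
                  Tp: Callable[[float], float]):
    """Compose: raw motion -> idealized (Mxi, Myi, Mzi) -> application
    (Mx, My, Mz), and raw pressure -> idealized Pi -> application P."""
    def step(raw_motion: list[float], raw_pressure: float):
        mi = Sm(raw_motion)        # sensor module 902 -> driver 904
        pi = Sp(raw_pressure)
        return Tm(mi), Tp(pi)      # driver 904 -> application 906
    return step

# Identity placeholders; real translators would carry calibration
# coefficients learned per user.
step = make_pipeline(lambda r: (r[0], r[1], r[2]),
                     lambda m: m,
                     lambda r: r,
                     lambda p: p)
```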
  • FIG. 12 covers a second example, in which an origami detector measures x, y, and z accelerations, and pressure.
  • the sensors 902' include unweighted sensors 1200 and weighted sensors 1210.
  • a program Sm 1220 discriminates between unweighted and weighted sensors to produce an idealized motion vector, Mxi, Myi, Mzi, 1240 that is provided to the driver 904'.
  • a second translator Tm 1260 translates the idealized motion to an application motion vector Mx, My, Mz 1280, which is provided to the application 906'.
  • a program Sp 1230 discriminates between unweighted and weighted sensors to produce an idealized pressure Pi, 1250 that is provided to the driver 904'.
  • a second translator Tp 1270 translates the idealized pressure Pi to an application pressure P 1290, which is provided to the application 906'.
  • FIG. 13 covers a third example, in which a tetrahedral detector measures x, y, and z accelerations; x, y and z rotations; and pressure.
  • the sensors 902" include tetrahedral sensor 1300.
  • a program Sm 1310 translates the output of sensor 1300 to an idealized motion vector, Mxi, Myi, Mzi, 1340 that is provided to the driver 904".
  • a second translator Tm 1370 translates the idealized motion to an application motion vector Mx, My, Mz 1371, which is provided to the application 906".
  • a third translator Sr 1320 translates the sensor information to an idealized rotation vector Rxi, Ryi, Rzi.
  • a fourth translator Tr 1380 translates the idealized rotation Rxi, Ryi, Rzi to an application rotation Rx, Ry, Rz 1350, which is provided to the application 906".
  • a fifth translator Sp 1330 translates the output of sensor 1300 to an idealized pressure Pi, 1360 that is provided to the driver 904".
  • a sixth translator Tp 1390 translates the idealized pressure Pi to an application pressure P 1391, which is provided to the application 906".
  • FIG. 14 is a flow chart diagram of an exemplary method for operating the pointing device 100.
  • the system waits for the user to take an action, i.e., a motion or a squeeze.
  • the system returns to the wait state 1419 after each action, and can transition from the wait state 1419 to any of the states, "grasp” 1420, "orient” 1432, “squeeze” 1437, “move” 1442 or “release” 1447.
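  • The wait-state dispatch might be sketched as follows; the device.next_action() interface is hypothetical, and event detection is abstracted away:

```python
STATES = ("grasp", "orient", "squeeze", "move", "release")

def run(device, handlers):
    """Loop in the wait state (1419), dispatch each detected action to
    its handler, and return to waiting; the driver sleeps on release."""
    while True:
        action = device.next_action()   # hypothetical: blocks until a
                                        # motion or squeeze is detected
        if action not in STATES:
            continue
        handlers[action](device)        # e.g. handlers["squeeze"]
        if action == "release":
            break
```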
  • the user grasps the device.
  • the sensor 902 registers the motion.
  • the driver module 904 wakes up in "grasp mode".
  • at step 1423, if this is the first session for the pointing device 100, then step 1424 is executed, and the device is calibrated. Control is transferred to step 1425. At step 1425, if this is the first time the user is using the pointing device 100, then at step 1426, the user registers himself or herself. An association is thus formed between the user and the particular pointing device, making use of the (optional) unique identifier chip in the pointing device 100. Control is transferred to step 1427. At step 1427, if a login is required, then step 1428 is executed. At step 1428, the user logs in, either by using the keyboard, or by a unique motion with the pointing device 100 (such as writing the user's signature). Control is transferred to step 1429.
  • at step 1429, the system checks whether there are user-specific coefficients available on the system for translating this specific user's style of motion, rotation and pressure from raw sensor data to idealized measurement data. If data are available, then step 1430 is executed. At step 1430, the default coefficients are replaced by the previously determined user-specific coefficients. Control is transferred to step 1431. At step 1431, the user login is translated to rotation and orientation coefficients. The system returns to the wait state 1419.
  • at step 1432, the user orients the device.
  • a rotation calibration is initiated by asking the user to orient the device in one or more predetermined positions.
  • at step 1434, a stream of data points is recorded.
  • at step 1435, conforming X, Y and Z rotations are constructed.
  • at step 1436, a determination is made whether sufficient data have been collected to conform the rotation measurements to the actual rotations. If not, then control is returned to step 1433. Otherwise, the system returns to the wait state 1419.
  • at step 1437, the user squeezes the pointing device 100.
  • the driver module 904 updates the idealized pressure Pi, and translates the same to the normalized pressure P.
  • at step 1439, the pressure is provided to the application, which processes the pressure.
  • at step 1440, if the application is finished with processing the squeeze operation, then control returns to step 1419 (the wait state). If not, then step 1441 is executed, to determine whether driver module 904 detects a lull or pause in the squeezing. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1438 is executed again.
  • at step 1442, the user moves the pointing device 100.
  • the driver module 904 updates the idealized motion vector Mxi, Myi, Mzi, and translates the same into the normalized motion vector Mx, My, Mz.
  • the application program processes the normalized motion vector Mx, My, Mz.
  • at step 1445, if the application is done with processing the movement of the pointing device 100, then control transfers to step 1419 (wait state). Otherwise, step 1446 is executed, to determine whether the driver detects a lull in the motion. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1443 is executed again.
  • the user releases the pointing device 100.
  • the driver module 904 enters a sleep state.
  • the application is notified that the pointing device has entered the sleep state.
  • Following is a set of pseudo-code that may be used to construct a software system suitable for handling input signals from the exemplary pointing device 100:

    Legend:
      A, B, C, D, E, F, G : sensor strips, each oriented along a sensing plane
      Tm : translate idealized motion to application motion
      Tp : translate idealized pressure to application pressure
      Tr : translate idealized rotational motion to application rotational motion

    Three-axis motion sensor with pressure sensor:
      Tm : (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)
      Tp : Pi maps to P

    Origami sensor with A, B, C flaps, each flap having an un-weighted (u) and weighted (w) section:
      Tm : (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)

    Tetrahedral sensor:
      Tm : (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)
      Tr : (Rxi, Ryi, Rzi) rotationally maps to (Rx, Ry, Rz)
      Tp : Pi maps to P

    Registration:
      after calibration, ask user to create a login signature
      determine cultural orientation (left-right vs. right-left, etc.)
      accept user's input while software collects A, B, C, D points
      translate input to Ox, Oy, Oz over time

    Login:
      ask user to reproduce login signature
      set orientation based upon cultural left-right vs. right-left context
      (replaces manual orientation to draw a shape on the X-Y axis)
  • the output of the pointing device may be mapped to the parameters of a visual synthesizer, such as the shifting of a color palette, the drawing or motion of an object.
  • the output of device 100 may be mapped to parameters of a sound synthesizer, such as pitch, timbre, amplitude, or placement in three-dimensional sonic space.
  • Although the exemplary virtual puppet application is configured so that one pointing device 100 controls a single virtual puppet 710, the software can be configured to operate more than one virtual puppet (or other object) using a single input pointing device 100.
  • For example, if the pointing device includes sensors for measuring x, y, and z accelerations and pressure P, the first puppet 710 may be controlled by the x and y accelerations, and the second puppet 711 may be controlled by z and P.
  • the first and second objects need not be identical objects, or even objects of the same type.
  • the first object may be a virtual puppet
  • the second object may be the background scenery or sound.
  • by x and y, the puppet 710 is controlled, and by z and P, the user can vary the background from day to night, or from silence to the sound of thunder.
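  • Splitting the four channels between two objects might then be as simple as the following sketch; the puppet and background interfaces are hypothetical:

```python
def route(mx, my, mz, p, puppet, background):
    """One device, two targets: x and y drive the puppet, while z and
    pressure vary the background scenery or sound."""
    puppet.move(mx, my)
    background.set_time_of_day(mz)   # e.g. day to night
    background.set_sound_level(p)    # e.g. silence to thunder
```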
  • software applications can use the Mouse API for selecting or navigating, or use a Driver API.
  • the driver for Digitizer (or Graphics) tablets may allow more than one pen to be used at the same time.
  • Emulating a mouse can be accomplished by installing a driver; going beyond a mouse can be accomplished by modifying the application program.
  • One can also overlay other input devices, such as a graphics tablet, with yet another driver.
  • the output signals from the exemplary pointing device 100 may be mapped to those of conventional input devices in an emulation mode.
  • the outputs may be mapped to the x and y positions of a conventional mouse and the left or right click signals from depressing the mouse buttons.
  • Mapping the x, y and z (or x, y and P) parameters into x and y parameters only may require the computer to determine the orientation of the pointing device 100, such as whether it is moving in a predominantly x-y plane, a y-z plane, or a z-x plane, or combinations of all three.
  • individual user profiles may be used to determine the current orientation.
  • a second pointing device may be used to expand the number of parameters for an application.
  • one pointing device 100 may control various aspects of the video signal, while another pointing device may be used to control the audio signals.
  • a microphone may be incorporated into the pointing device, for operation of voice activated software, or for storage of sounds.
  • the exemplary embodiment described above has the virtual puppet application program running locally in the computer 701, the application program may be executed in other processors located remotely, either as a sole process or in parallel with other instances of the application program on other processors.
  • the local computer can communicate with other computers via a telecommunication network, which may be a LAN, WAN, Internet, or wireless protocols.
  • the present invention may be embodied in the form of computer-implemented processes and apparatus for practicing those processes.
  • the present invention may also be embodied in the form of computer program code embodied in tangible media, such as floppy diskettes, read only memories (ROMs), CD-ROMs, hard drives, "ZIP™" drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention.
  • the present invention may also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over the electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention.
  • computer program code segments configure the processor to create specific logic circuits.

Abstract

A hand-held computer input pointing device (100) has at least one motion detector (314), the at least one motion detector being capable of detecting motion in at least three dimensions. At least one pressure sensor (316) is capable of sensing pressure quantitatively. The input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad. The pointing device may, for example, be used to control a computer (701) executing a program to display a movable virtual puppet (710).

Description

HUMAN GESTURAL INPUT DEVICE WITH MOTION AND PRESSURE
FIELD OF THE j-NVENTION The present invention relates the field of input devices for computers.
DESCRIPTION OF THE RELATED ART
Computer input pointing devices are well known. A typical computer input pointing device contains at least two sensors for sensing motion in at least two directions, such as x and y (forward-backward and left-right). The pointing device also has at least one actuator for causing the pointing device to transmit a command signal (typically referred to as a "click") to a computer."
The most common pointing device for a desktop computer or workstation is a mouse. The mouse may have a ball on its underside. Motion of the ball causes rotation of one or more of a plurality of rotation sensors adjacent to the ball. The number of rotations determine the magnitude of the motion in the x or y direction. The most common mouse type includes two buttons for sending commands known as "left-click" and "right-click. Other types of mice include optical sensors that count markings on a specially marked mouse pad to determine an amount of movement.
Another common pointing device used primarily for computer games is a joystick. The joystick has a lever that is tilted in a forward-backward or left-right direction, each of which is sensed independently. A button is typically provided on the end of the lever for transmitting a command signal. Some joysticks also allow rotation of the lever, which is also sensed.
A variety of alternative pointing devices have been developed. Alternative pointing devices are typically used on laptop computers. An early pointing device ύsδd on laptop computers is the track ball. The track ball functions like an upside-down mouse. Instead of moving the device, the ball is rotated directly.
Many laptop computers include a miniature joystick that is positioned on the home row of the keyboard. Another common pointing device in laptop computers is a touch pad. The touch pad is a rectangular device that is sensitive to touch. Left and right sliding motion of a finger on the touch pad is detected. The touch pad also senses when it is struck, and produces a "click" signal. Although conventional pointing devices are suitable for locating a point on an x-y grid, and transmitting a simple single-valued command signal, conventional pointing devices leave much to be desired for controlling application programs that require more complex inputs.
SUMMARY OF THE INVENTION
One aspect of the present invention is a hand-held computer input pointing device, comprising at least one motion detector. The at least one motion detector is capable of detecting motion in at least three dimensions. At least one pressure sensor is capable of sensing pressure quantitatively. The input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.
Another aspect of the invention is a method for operating a computer having an output device, comprising the steps of: receiving a plurality of signals from a computer input pointing device, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to the output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
Still another aspect of the invention is a method for operating a computer input pointing device, comprising the steps of: moving the pointing device; transmitting at least three motion signals from the pointing device to a computer, the motion signals including signals representing translations in one or more dimension, or rotations in one or more dimensions, or a combination of translations and rotations; squeezing at least one portion of the pointing device; and transmitting quantifiable pressure signals from the pointing device to the computer.
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is an isometric view of an exemplary squeezable pointing device according to the present invention.
FIG. 2 is an exploded view of the pointing device of FIG. 1.
FIG. 3A is an enlarged view of the origami sensor of FIG. 2.
FIG. 3B is a plan view of the sensor of FIG. 3 A, prior to folding.
FIG. 4A is a plan view of a tetrahedral origami sensor in a flattened state.
FIG. 4B is an isometric view of the sensor of FIG. 4A, folded. FIG. 5A is a plan view of an alternative cubic origami sensor in a flattened state.
FIG. 5B is an isometric view of the sensor of FIG. 5A, folded.
FIG. 6 is an isometric view of a tetrahedral edge sensor.
FIG. 7 is an elevation view of a computer system including the pointing device of FIG. 1.
FIG. 8 is a diagram of a range of motion tree for an application program executed on the computer shown in FIG. 7.
FIG. 9 shows the software layers for the application executed on the computer shown in FIG. 7.
FIG. 10 is a flow chart of a method for operating the virtual puppet of FIG. 7.
FIGS. 11-13 are block diagrams of the software layer structure shown in FIG. 9.
FIG. 14 is a flow chart of a method for operating the pointing device of FIG. 2.
DETAILED DESCRIPTION
FIG. 1 shows an exemplary computer input pointing device 100 according to the present invention. Pointing device 100 is capable of detecting motion and providing a quantifiable measure of squeeze pressure. The input device 100 is operable in a hand of a user, without contacting a base, work surface, or pad. Thus, the input device 100 is "free floating," in that it may be operated in any position without regard to the presence of a base, work surface, or pad. Pointing device 100 has a squeezable, deformable elastomeric cover 102. Inside the cover 102, the pointing device 100 has an internal motion detector that may measure acceleration in x, y and z planes, or measure pitch, yaw and roll. Squeezing the package increases pressure that translates into a pressure signal that is sent by the pointing device 100 to the computer (shown in FIG. 7) along with the motion data.
FIG. 2 is an exploded view of the computer input pointing device 100, with the squeezable elastomeric covering 102 partially removed. The squeezable covering 102 may be formed from natural or synthetic rubber or a variety of polymeric materials, which may be cut or injection molded. Cover 102 has an internal cavity 104 for containing sensors. An optional sensor enclosure 106 may be provided inside of cover 102. The sensor enclosure 106 may be sealed to protect the sensors from dust and moisture, allowing use of an unsealed cover 102. The deformable cover 102, with or without the sensor enclosure 106, allows the sensor to move freely, and compresses internal sensor. In one embodiment, the squeezable cover is made from a "SQEESH™" ball by the Toysmim Company of Kent, WA USA. "SQEESH™" balls are available in a variety of shapes and surface types, including smooth spherical, spherical with craters, and star shaped. A variety of shapes may be used to provide users with the grip of their choice or hints as to what object that the device controls.
Pointing device 100 has at least one motion detector 108, which is capable of detecting motion in at least three dimensions. Pointing device 100 also includes at least one pressure sensor capable of sensing pressure quantitatively. In the example, the pressure sensing capability is also provided by the sensor 108, which is described in greater detail below. The at least one motion detector may, for example, include an accelerometer or a piezoelectric device.
As best seen in FIG. 3 A, the exemplary piezoelectric device 108 includes at least three piezoelectric strips 301-303 or flaps oriented in three orthogonal directions. More specifically, the exemplary device 108 is a single film sensor having three strips 301-303 folded origami style, as shown in FIG. 3A, to form a piezo thin film with six sensors.
FIG. 3B is a view of the device 108 with six sensors 314, 316. The device is formed on a substrate 300 and is shown laid flat before the origami folds along dashed lines 312. Optional weight W1-W3 may be attached to the end or center of each flap 301-303, respectively. Preferably, each of the three piezoelectric strips 301-303 has at least two sections, including a weighted section 314 and an unweighted section 316; so as to discriminate motion load from pressure load.
The piezo film substrate 300 is plastic, and allows two strips of piezo material to measure bending of the substrate from the relative tension between the two strips. One piezo sensor 316 is provided for pressure discrimination. Pressure on the deformable covering 102 bends the material of the piezo pressure sensor 316, producing a measurable voltage. Another piezo sensor 314, for motion discrimination, has a weight W1-W3 placed at the middle or end of the sensor. If each of the three piezoelectric strips 314 has a respective weight, moving the pointing device 100 induces an inertial force: the weight W1-W3 moves against the material of the motion sensor 314, causing it to bend at a different rate than the pressure sensor 316.
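For illustration only, the following Python sketch shows one way the weighted/unweighted strip pairs could be separated into pressure and motion estimates. The strip names (Au/Aw, etc.) follow the pseudo-code glossary later in this document; the simple averaging/subtraction model is an assumption, not the patent's specified algorithm.

```python
# A minimal sketch (assumed model) of discriminating squeeze pressure
# from motion using the weighted/unweighted strip pairs of FIG. 3B.
def split_pressure_motion(samples):
    """samples: dict of per-strip voltages, e.g.
    {"Au": .., "Aw": .., "Bu": .., "Bw": .., "Cu": .., "Cw": ..}
    Unweighted sections (u) bend mainly under squeeze pressure;
    weighted sections (w) bend under both pressure and inertia."""
    # Pressure: unweighted strips respond to cover deformation only.
    pressure = (samples["Au"] + samples["Bu"] + samples["Cu"]) / 3.0
    # Motion per axis: subtract the pressure contribution from each
    # weighted strip, leaving the inertial (acceleration) component.
    motion = {
        "x": samples["Aw"] - samples["Au"],
        "y": samples["Bw"] - samples["Bu"],
        "z": samples["Cw"] - samples["Cu"],
    }
    return pressure, motion
```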
Cuts 318 on the exemplary piezo film substrate 300 allow each sensor flap 314, 316 to bend separately. The dotted lines 312 show folds where the flat sensor strip 300 is origami folded in 90 degree bends. The folded origami flaps 301-303 are affixed to a support assembly 320 (shown in FIG. 3A). The support assembly 320 holds the origami flaps 301-303 in place. The leads L from the thin film sensor are connected to optional internal multiplexing circuitry 110 that connects the sensor circuitry to an external lead 112.
Referring again to FIG. 1, the sensors output data representing four parameters (three motion and one pressure). Preferably, a multiplexer 110 receives output signals from the motion detector and the pressure sensor 108, and outputs a multiplexed signal for transmission to a computer (shown in FIG. 7). The output of the multiplexer is connected to a lead 112. The lead 112 extends through a channel 103 in the elastomeric covering 102. The opposite end of the lead 112 has a suitable connector 114 attached to it, which may be a 0.32-centimeter (1/8 inch) miniplug, a Universal Serial Bus (USB) connector, a serial (RS-232) port, an "ETHERNET™" port, a MIDI port for connection to a musical instrument, a PDA port, a cellular phone port or other suitable connector.
In a system having a plurality of similar pointing devices 100, a hub capable of accepting inputs from multiple input devices may be provided to interface the plurality of pointing devices 100 to a single computer. For example, if there are multiple pointing devices having USB connectors, a USB hub 116 (connected to lead 118 and USB connector 120) may be used to connect the pointing devices to a single computer. Alternatively, the pointing device may omit the multiplexer, and output four analog signals from four respective output terminals. The pointing device 100 may have a wireless transmitter for communicating with the computer.
The lead 112 from the sensing package carries either a single data line (with time or frequency multiplexed data) or multiple data lines, each carrying a respective sensor's output. The connector (such as a 1/8" miniplug) 114 optionally connects directly to the computer (FIG. 7) through a microphone input, eliminating the need for a hub. The pointing device 100 may use the audio software driver, or communicate through the optional hub 116 for controlling multiple puppets (as described below). The connector 114 may instead connect to a different computer port, such as a USB or serial port, also eliminating the need for a hub. Although FIG. 2 shows a two-input version, the system may include any number of inputs.
Any of the sensor types described herein may be used with clock shifting to the next lead. An additional clock cycle may be provided for subsequent processing of an optional unique identification number (described below). The sensors provide an analog signal that may be modulated with a single carrier frequency, or the signal from each lead may be modulated with a respectively different carrier frequency.
An optional analog to digital converter (ADC) may be provided for each of the analog sensor leads. In this case, a clock shift to the next lead sends a digital packet. For example, with eight bits per lead, six sensors yield six bytes per sample cycle.
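As a minimal sketch of this framing, assuming one 8-bit ADC reading per lead:

```python
# Hedged sketch of the six-byte sample packet described above:
# one 8-bit ADC reading per lead for the six-sensor origami film.
def pack_sample(readings):
    """readings: six integers in 0..255, one per sensor lead."""
    if len(readings) != 6 or any(not 0 <= r <= 255 for r in readings):
        raise ValueError("expected six 8-bit readings")
    return bytes(readings)  # six bytes per sample cycle

packet = pack_sample([12, 200, 33, 90, 7, 150])
```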
An optional identifier (ID) circuit may be provided. The ID circuit may be a register or a read only memory (ROM) device, EEPROM, or the like. The unique identifier may be hard encoded at the factory (e.g., using a ROM), or the identifier may be downloaded from a computer into a flash EEPROM in the pointing device 100. As explained below with reference to FIG. 7, by providing a unique identifier for each pointing device unit 100, more than one unit may be used at the same time. The device identifier can function like an encryption key, to protect the user's preferences or to securely identify the user.
The output of the ID circuit and the output from one of the sensors are output to the multiplexing circuitry 110. The analog input may be time-domain multiplexed with the ID waveform. Alternatively, if the analog sensor signals are converted to digital form before transmission to the multiplexer, a digital input packet may be combined with ID information to form a larger packet including both. The pointing device 100 may contain additional identifier circuitry to add a port identifier number separate from the pointing device identifier number.
The analog waveform may have a separate carrier frequency or may be time domain multiplexed or interleaved with each sample, either on a bit, byte, or packet basis. Alternatively, if a digital packet format is used, the sensor data and ID information may be added to a multiplexing packet to form a larger packet including both types of data.
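The framing below is a hypothetical illustration of the "larger packet including both" sensor data and identifier; the 4-byte big-endian ID width is an assumption, not specified in the text.

```python
# Illustrative framing (assumed, not the patent's specification):
# prepend the unique device identifier to a digital sample packet.
def frame_with_id(device_id: int, sample: bytes) -> bytes:
    return device_id.to_bytes(4, "big") + sample

frame = frame_with_id(0x1234ABCD, bytes([12, 200, 33, 90, 7, 150]))
```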
The pointing device 100 may obtain power from the computer (FIG. 7), from an internal battery, or by drawing power directly from the energy generated by the sensors through the piezoelectric effect.
ALTERNATIVE SENSORS
The piezoelectric device may be an origami sensor having cut coils, and may have either a cubic or tetrahedral shape. Preferably, the cut coils on each face of the motion detector are attached to a single weight at a center of the motion detector. Optionally, each cut coil on each face of the motion detector has its own weight attached.
FIGS. 5A and 5B show a sensor system 500 including origami cutouts, which may be used in place of the sensor 108 shown in FIG. 2. A thin sensor film 500 having six faces 501-506 is folded to form a cube (although a tetrahedron, shown in FIGS. 4A and 4B, or other three-dimensional shapes may be used in alternative embodiments). A spiral pattern 511-516 is cut into each face 501-506 of the thin sensor film, respectively. Each coil can be pressed towards the center (e.g., C1, C3 and C6) and attached to a weight that is suspended at the center of the folded shape.
When the pointing device is moved, the weight applies forces to each of the coils, inducing a voltage. For the cube 500, each face 501-506 has a corresponding opposite face that measures the force with an inverse effect on voltage.
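A small sketch of this opposite-face relationship follows, assuming face pairs (501, 502), (503, 504) and (505, 506) correspond to the x, y and z axes; the pairing itself is an assumption made only for illustration.

```python
# Sketch of differential readout for the cube of FIGS. 5A/5B: opposite
# faces see inverse voltages for a given motion, so a per-pair
# difference cancels the common-mode (squeeze) component.
def cube_axes(v):
    """v: dict of face voltages keyed "501".."506"."""
    return {
        "x": (v["501"] - v["502"]) / 2.0,
        "y": (v["503"] - v["504"]) / 2.0,
        "z": (v["505"] - v["506"]) / 2.0,
    }
```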
For the tetrahedron shown in FIGS. 4A and 4B, each side faces three other sides, measuring an inverse effect on their faces. Additional circuitry (not shown) may be included in the exemplary sensor configuration. An analog sample and hold is provided for each of the analog sensor leads. For the exemplary origami sensor of FIGS. 2, 3A and 3B, the leads are:
1. A free - for pressure discrimination
2. A weighted - for motion discrimination
3. B free - pressure
4. B weighted - motion
5. C free - pressure
6. C weighted - motion
For a tetrahedral sensor (FIGS. 4A and 4B), the leads are:
1. A tension
2. B tension
3. C tension
4. D tension
For a cubic sensor (FIGS. 5A and 5B), the leads are:
1. A0 tension
2. A1 tension
3. B0 tension
4. B1 tension
5. C0 tension
6. C1 tension
FIG. 6 shows a tetrahedral edge sensor 600. Sensor 600 has six edges to detect pressure directly from a squeeze of the deformable elastomeric material of cover 102. Optionally, sensor 600 may have weighted edges 602 to detect motion through inertial force whenever the pointing device 100 is moved. The weights 610 increase the momentum of the sensors 602, so that the deformation of the sensors is exaggerated. Alternatively, weights may be placed at each node 608.
Sensor 600 is used in conjunction with an application driver that can discriminate pressure from motion. The driver can also discriminate pitch, yaw, and roll. A plurality of signal leads 604 are provided, one for each edge of the tetrahedron 600. The outputs 604 of the sensor either connect to multiplexing/conversion circuitry 110 or lead out to separate analog lines. There is an input lead from each edge of the tetrahedron. Optionally, each connecting node 608 broadcasts a wireless signal of its state, without requiring any lead wire. As shown in FIG. 6 by dotted lines 606, non-sensing leads connect each node 608 to another edge 602. The non-sensing leads 606 either follow a sensing edge 602 to another node 608 or run directly to another node 608. The connecting nodes 608 for leads 606 and sensing edges 602 may simply be vertices of an origami tetrahedron, or connecting nodes for self-supporting edge sensors.
In one embodiment, a sensor similar to that used in the "GYROPOINT™" mouse may be used.
In another embodiment, a sensor such as the "ACH-04-08-05" accelerometer/shock sensor from Measurement Specialties, Inc. of Valley Forge, PA may be used. The ACH-04-08-05 sensor has three piezoelectric sensing elements oriented to measure acceleration in the x, y and z linear axes, and uses very low power. Alternatively, an "ACH-04-08-01" sensor by Measurement Specialties, Inc. may be used to measure linear acceleration in the x and y directions and rotation about the z axis.
In addition to the motion detection capability, pointing device 100 also includes a pressure sensing capability. An internal pressure sensor is included, which may be, for example, a thin film piezo sensor that detects bending of a strip, or a sensor such as the "MTC Express™" from Tactex Controls Inc. of Cerritos, CA. This control surface is pressure sensitive, and also senses multiple points of contact, using a fiber optic-based, pressure-sensitive smart fabric called "KINOTEX™." This fabric is constructed of cellular urethane or silicone sandwiched between protective membranes. The fabric has an embedded fiber optic array that generates an optical signal when the material is touched, or when force is applied. The coordinates and area of the pressure, and its magnitude, can be determined from the received signals. The material can generate a pressure profile of the shape of a hand touching it.
As noted above, the pointing device 100 may include a circuit that outputs a unique identifier for each pointing device. There are many applications for this feature. For example, it allows the pointing device 100 to serve a user identification and/or authentication function. A user could carry his or her pointing device, with its unique identifier output signal, and connect it to any desired computer having an Internet connection. The application program could then request a remote server to download previously established software preferences, or upload the user's current software preferences. This enables a person to log on at any computer connected to the Internet, access his or her software (e.g., the virtual puppet and stage application), or download web bookmarks. Alternatively, the user could use the pointing device 100 as a key to access copy-protected software, functioning like a parallel port dongle. Thus, the use of a pointing device 100 with a built-in unique identifier enhances the user's mobility and access to services. Further, the pointing device may be used for authentication at any computer to which it is connected. Because the unit uniquely identifies itself, it can be bound to an individual user. The unique identifier may be used in combination with a password, IP address, or a user signature. For example, the user may sign his or her name using the pointing device to control movement of a virtual writing implement. The user's handwriting may be recognized using a conventional handwriting recognition algorithm, such as any of those described in U.S. Patents 5,768,423, 5,757,959, 5,710,916, 5,649,023 or 5,553,284, all of which are expressly incorporated by reference herein in their entireties. The combination of the unique device identifier and the user signature provides reliable authentication.
Further still, the unique identifier may be used to make the user's preferences more portable. For example, the identifier may be associated with the design (i.e., shape, color, and the like) of the cover 102. The identifier for the pointing device may also be associated with different shapes and colors of cursors. The cursor that appears on the screen may have the same shape or color as the cover 102 of the input device 100, which is associated with a known identifier. For example, a blue spherical pointing device may be associated with a round, blue cursor, a puppet shaped pointing device may be associated with a puppet shaped cursor, an alien-shaped pointing device may be associated with an alien-shaped cursor, and so on.
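For illustration, a toy lookup binding device identifiers to cursor styles; all identifiers and table entries here are invented, not taken from the patent.

```python
# Hypothetical table associating a device identifier with the cursor
# appearance matching the cover 102, as described above.
CURSOR_STYLES = {
    0x00A1: {"shape": "sphere", "color": "blue"},
    0x00B7: {"shape": "puppet", "color": "green"},
}

def cursor_for(device_id):
    # Fall back to a generic cursor for unknown devices.
    return CURSOR_STYLES.get(device_id, {"shape": "arrow", "color": "black"})
```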
Alternatively, the input device packaging may be completely different from the shape and color of the computer cursor, or puppet object. If the puppet application program is stored in a server that is accessible via the Internet, or other local area network (LAN) or wide area network (WAN) or wireless services such as CDMA, the user may select from any of a plurality of different puppets from a gallery. This is one of the user preferences that may follow her around from client computer to client computer.
VIRTUAL PUPPET SYSTEM
FIG. 7 shows a computer system 700 according to another aspect of the invention. The computer system 700 includes a computer 701 having a processor 702, a memory 706, a display 708, and an input port 712.
At least one computer input pointing device 100, as described above with reference to FIG. 2, is coupled to the input port 712 of the computer 701. The pointing device 100 has at least one motion detector 108, capable of detecting motion in at least three dimensions, and at least one pressure sensor capable of sensing pressure quantitatively. A multiplexer 110 receives output signals from the motion detector 108 and the pressure sensor, and outputs a signal for transmission to the computer 701. Alternatively, the lead from each sensor may output an analog signal.
The computer 701 further comprises a storage device 704, which may be a hard disk drive, a high capacity removable disk drive (e.g., "ZIP" drive by Iomega Corporation), a read only memory (ROM), CD-ROM drive, a digital versatile disk (DVD) ROM drive, or the like. Either the memory 706 or the storage device 704 has stored therein computer program code for causing the computer to show an object 710 on the display 708. The object has a plurality of portions 710a-710d that are selectively movable in response to motion or squeezing of the pointing device 100. The exemplary object is an animated character 710 controlled by input parameters from the pointing device. Such an animated character is referred to herein as a "virtual puppet". The virtual puppet has a plurality of independently movable body parts 710a-710d. Although the example only shows movable arms and legs, any part of the virtual puppet 710 may be movable.
According to another aspect of the invention, a method is provided for controlling a computer. The method includes the steps of: receiving a plurality of signals from a computer input pointing device 100, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to an output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
FIG. 10 shows an exemplary method for operating the computer 701. The method includes the steps of:
• receiving a plurality of signals from a computer input pointing device 100, the signals representing quantifiable pressure and motion in at least three dimensions;
• displaying a virtual puppet 710 on the display 708, the virtual puppet having at least two portions 710a-710d that are capable of being moved separately from each other; and
• showing movement of the portions 710a-710d of the virtual puppet 710 in response to the pressure and motion signals.
At step 1010, the animated character 710, figure or other object is displayed on the display 708. At step 1020, when the user squeezes pointing device 100, steps 1030-1050 are executed. At step 1030, the driver module 904 updates the idealized pressure value Pi and translates Pi into the normalized pressure value P. At step 1040, the application program processes the normalized pressure P into an animated figure action. At step 1050, the application updates the animated figure according to the action, and displays the result on the display 708.
At step 1060, when the user moves the pointing device 100, steps 1070-1080 are executed. At step 1070, the driver module 904 of the software updates the idealized motion vector Mxi, Myi, Mzi and translates the vector into a normalized motion vector Mx, My, Mz. At step 1080, the application program processes the motion vector Mx, My, Mz into animated figure motion, and outputs the moving figure to the display 708.
An exemplary method for operating the pointing device 100 comprises the steps of:
1. moving the pointing device 100;
2. transmitting at least three motion signals from the pointing device 100 to a computer 701, the motion signals including signals representing translations in one or more dimensions, or rotations in one or more dimensions, or a combination of translations and rotations;
3. squeezing at least one portion of the pointing device 100; and
4. transmitting quantifiable pressure signals from the pointing device 100 to the computer 701.
Preferably, all parameters measured by the sensors 108 in the pointing device 100 are mapped to motions of the virtual puppet 710, such as mouth moving open and closed, the motions of limbs, or the expression of emotional intensity. The pressure measurement is particularly suitable for use in providing quantitative inputs to the computer to create results having continuously variable intensity. Thus, a light squeeze may result in a small kick, whereas a hard squeeze results in a high kick.
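For example, a minimal sketch of this continuously variable mapping, assuming a pressure value already normalized to the range [0, 1]:

```python
# Normalized pressure P scales the puppet's kick height, giving the
# continuously variable intensity described above.
def kick_height(p, max_height=1.0):
    p = min(max(p, 0.0), 1.0)    # clamp the normalized pressure
    return p * max_height        # light squeeze -> small kick, hard -> high
```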
The example of FIG. 7 includes a hub 116 and at least a second computer input pointing device 100 capable of sensing motion in at least three dimensions and a pressure sensor. Both the first and second computer input pointing devices 100 are connected to the hub 116, which in turn is connected to the input port of the computer 701. Each of the first and second computer input pointing devices 100 controls a respective virtual puppet 710 and 711.
As shown in FIG. 8, the computer program code includes a range of motion tree 800 for the virtual puppet 710. The range of motion tree 800 restricts a range of possible motion of the virtual puppet 710 about a current position of the virtual puppet. Preferably, the computer program code includes a respective unique range of motion tree for each of a plurality of users.
The exemplary range of motion tree 800 covers head, left and right shoulder, left and right elbow, left and right wrist, and left and right grip. The tree limits the range of motion from each node. For example, the nodes may be delimited as follows:
1. Head Hxyz
2. Shoulder Sxyz
3. Elbow Exyz
4. Wrist Wxyz
5. Grip Gxyz
The limits may be expressed in rectangular coordinates, such as (x0, y0, z0), (x1, y1, z1). Alternatively, the limits may be expressed as delta ranges. For some objects, it may be more convenient to express the limits in polar coordinates, which have a 1:1 mapping to rectangular coordinates.
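A brief sketch of enforcing such rectangular limits at a node, with invented example values:

```python
# Clamp a node's position into its rectangular range-of-motion limits.
def clamp_to_range(pos, lo, hi):
    """pos, lo, hi: (x, y, z) tuples; clamp each axis into [lo, hi]."""
    return tuple(min(max(p, l), h) for p, l, h in zip(pos, lo, hi))

# e.g. a head node limited to the unit cube:
# clamp_to_range((0.8, -0.2, 1.4), (0, 0, 0), (1, 1, 1)) -> (0.8, 0.0, 1.0)
```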
The range of motion tree 800 can express a hierarchy of motions. For example, in the case of the head, the range of motion may be expressed as: (Hx0, Hx1), (Hy0, Hy1), (Hz0, Hz1) => H*, where H* denotes the range of motion.
For the shoulder, the range of motion may be expressed as: (Sx0, Sx1), (Sy0, Sy1), (Sz0, Sz1) => S*
One of ordinary skill can readily see that similar ranges of motion can be specified for the elbow, wrist, and grip.
All motion then extends the range of the tree: H*.S*.E*.W*.G*
State information restricts the range of possible motion about the last position of the puppet. The state information is mapped to a user model, and further restricts the possible ranges of motion. For example,
Hxyz:H* => (Hx00, Hx11), (Hy00, Hy11), (Hz00, Hz11)
A user model adds probabilities for motion at each node of the tree. For example, the user model may identify whether the user moves her shoulders or shakes her wrists. In an exemplary user model for "User n", the probabilities may be expressed as:
Un: (x0, x1), (y0, y1), (z0, z1)
As an alternative to the tree search, the probabilities of motion may be mapped to a voxel space. The probabilities could be precompiled with user probabilities, where the three-dimensional motion space is broken up into small cubes, analogous to three-dimensional pixels. Each voxel point can be ascribed a probability, and each voxel contains a range of motion probability. Voxels may also include application restrictions on the range of motion probability.
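A sketch of a voxel probability lookup follows; the uniform cubic grid and the nested-list storage layout are assumptions made for illustration.

```python
# Quantize a 3-D position to a voxel index and look up a precompiled
# range-of-motion probability, as described above.
def voxel_index(pos, origin, size, n):
    """Map a point to integer voxel coordinates in an n*n*n grid."""
    return tuple(
        max(0, min(int((p - o) / size), n - 1))
        for p, o in zip(pos, origin)
    )

def motion_probability(prob_grid, pos, origin, size, n):
    i, j, k = voxel_index(pos, origin, size, n)
    return prob_grid[i][j][k]   # likelihood coefficient at this voxel
```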
Another task for the software is identifying the orientation of the pointing device, to allow use of the orientation as a parameter. In one embodiment, conformation of curves is used: the signal curve over time is matched from the three-dimensional sensor space to the three-dimensional application space. This is analogous to a three-dimensional jigsaw puzzle or Chinese wood block puzzle, in which a piece may be rotated in any of three dimensions. Using voxel space, it is possible to constrain the orientation of the sensor motion curve to only those ranges that are possible under the "range of motion tree" 800 and under the application space. It is further possible to find the most likely orientation of the sensor motion curve based on the probability of each motion, represented as likelihood coefficients for each voxel point.
Orientation can be determined by software. The exemplary software uses conformation mapping of curves of X, Y, Z (and optionally P) to the range of motions of either the left or right hand. Mapping could emanate from an explicit model or from trainable classifiers, such as neural nets. An application may request the user to orient the pointing device 100 by holding the device in his or her left or right hand and selecting items on the display 708.
The user either controls an object on the display 708, or navigates in space, where the user acts as the object, with a point of view that shifts. In either case, the range of object motion may be less than the range of motion of the pointing device 100. Thus, the software maps the actual gesture of the input device to the best fitting object motion.
A simple example is controlling a cartoon character that merely has a range of motion of stage-left and stage-right. Here, the range of motion of the puppet is constrained to two dimensions. There are several phases of control that should be considered to gauge its effect in the virtual world. For example: whether the pointing device is operated as a left-handed device or a right-handed device; whether the lead 112 is sticking up or down; the current model or user range of motions; the current state of pressure; and the current desired range of controlled object motion. Controlled objects on the display 708 generally have a limited destination range.
As an alternative to the range of motion tree 800, one can require manual orientation of the pointing device 100. For example, the application can show a picture of the pointing device 100 with the lead 112 pointing down to the floor. One hemisphere of the squeezable, deformable cover may have special markings that face away from the display 708. This technique eliminates the need for conformation of curves, since the orientation is 1:1, puppet to application.
FIG. 9 is a block diagram showing the virtual puppet modules and their connections to each other and the underlying software layer. The user module 900 contains the states of:
1. Grasp - The user is picking up the device.
2. Orient - The user is matching the device orientation to the application and screen feedback.
3. Squeeze - The user is pressing in on the squishy covering.
4. Move - The user is moving the device.
5. Release - The user is putting down the device.
6. State - The module reports the user state to other modules.
A sensor module 902 has processes A, B, C, D corresponding to each sensor output, whether analog or digital.
A driver module 904 translates sensor values to application values. The driver module 904 may use a trainable classifier, such as a neural network, where each sensor output has correlating inputs to the driver's idealized X, Y, Z, P. The driver module 904 correlates user states to ranges of sensor input. The states "grasp," "orient," "squeeze," "move," and "release" each have a characteristic effect on each of the sensors.
For instance, "grasp" may have a pressure curve, as the user grabs and picks up the puppet. As another example, "squeeze" may have a longer continuous pressure curve when compared to grasp. Each instance of state may have separate set of coefficients for a trainable classifier.
The software correlates the application state to the domain of sensor output, to reduce the set of possible outcomes. For instance, a two-dimensional stage for cartoon characters may be limited to "stage left" and "stage right". The program updates the user module state from the application state and application position.
The application module 906 accepts X, Y, Z, and P data from the driver module 904, and reports the application X, Y, Z, P state to the driver module as state information. Optionally, the application module 906 reports user state information to pass from the driver module 904 to the user module 900.
A mouse module 908 is a special program for emulating a mouse. The mouse module 908 uses a special application module for the report state. Mouse module 908 translates X, Y movement of the pointing device 100 to mouse X, Y motion, and translates the Z and P parameters of the pointing device 100 to mouse left click, right click, and double click.
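The sketch below illustrates this kind of mapping; the threshold values and event names are assumptions, not the module's actual interface.

```python
# Hypothetical mapping of device outputs to mouse events.
CLICK_P = 0.6   # normalized pressure treated as a button press

def mouse_events(mx, my, mz, p, prev_p):
    events = [("move", mx, my)]          # X, Y -> cursor motion
    if p >= CLICK_P > prev_p:            # rising pressure edge -> left click
        events.append(("left_click",))
    if mz > 0.5:                         # push along Z -> right click
        events.append(("right_click",))
    return events
```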
FIGS. 11-13 show the transformations that are performed by various exemplary software modules, depending on the specific types of sensors 902 included in the pointing device 100.
FIG. 11 covers the example described above, in which the motion detector measures x, y, and z accelerations, and pressure is also measured. The sensors 902 include motion sensors 1100 and pressure sensor 1110. A program Sm 1120 translates the raw output of sensor 1100 to an idealized motion vector, Mxi, Myi, Mzi, 1140 that is provided to the driver 904. A second translator Tm 1160 translates the idealized motion to an application motion vector Mx, My, Mz 1180, which is provided to the application 906. A program Sp 1130 translates the raw output of sensor 1110 to an idealized pressure Pi, 1150 that is provided to the driver 904. A second translator Tp 1170 translates the idealized pressure Pi to an application pressure P 1190, which is provided to the application 906.
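As a sketch of this two-stage pipeline, with a placeholder scale factor standing in for Sm and an assumed 3x3 rotation matrix standing in for Tm:

```python
# Two-stage translation matching FIG. 11: sensor-to-idealized (Sm),
# then idealized-to-application (Tm). Scale and rotation are assumed.
def Sm(raw):
    # Raw sensor counts -> idealized motion vector (Mxi, Myi, Mzi).
    return tuple(r * 0.01 for r in raw)

def Tm(mi, orientation):
    # Idealized -> application motion (Mx, My, Mz) via a 3x3 rotation
    # matrix supplied as three row tuples.
    return tuple(sum(o * m for o, m in zip(row, mi)) for row in orientation)

identity = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
mx, my, mz = Tm(Sm((120, -40, 5)), identity)
```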
FIG. 12 covers a second example, in which an origami detector measures x, y, and z accelerations, and pressure. The sensors 902' include unweighted sensors 1200 and weighted sensors 1210. A program Sm 1220 discriminates between unweighted and weighted sensors to produce an idealized motion vector, Mxi, Myi, Mzi, 1240 that is provided to the driver 904'. A second translator Tm 1260 translates the idealized motion to an application motion vector Mx, My, Mz 1280, which is provided to the application 906'. A program Sp 1230 discriminates between unweighted and weighted sensors to produce an idealized pressure Pi, 1250 that is provided to the driver 904'. A second translator Tp 1270 translates the idealized pressure Pi to an application pressure P 1290, which is provided to the application 906'.
FIG. 13 covers a third example, in which a tetrahedral detector measures x, y, and z accelerations; x, y and z rotations; and pressure. The sensors 902" include tetrahedral sensor 1300. A program Sm 1310 translates the output of sensor 1300 to an idealized motion vector, Mxi, Myi, Mzi, 1340 that is provided to the driver 904". A second translator Tm 1370 translates the idealized motion to an application motion vector Mx, My, Mz 1371, which is provided to the application 906". A third translator Sr 1320 translates the sensor information to an idealized rotation vector Rxi, Ryi, Rzi. A fourth translator Tr 1380 translates the idealized rotation Rxi, Ryi, Rzi to an application rotation Rx, Ry, Rz 1350, which is provided to the application 906". A fifth translator Sp 1330 translates the output of sensor 1300 to an idealized pressure Pi, 1360 that is provided to the driver 904". A sixth translator Tp 1390 translates the idealized pressure Pi to an application pressure P 1391, which is provided to the application 906".
FIG. 14 is a flow chart diagram of an exemplary method for operating the pointing device 100.
At step 1419, the system waits for the user to take an action, i.e., a motion or a squeeze. The system returns to the wait state 1419 after each action, and can transition from the wait state 1419 to any of the states, "grasp" 1420, "orient" 1432, "squeeze" 1437, "move" 1442 or "release" 1447.
At step 1420, the user grasps the device. At step 1421 the sensor 902 registers the motion. At step 1422, the driver module 904 wakes up in "grasp mode".
At step 1423, if this is the first session for the pointing device 100, then step 1424 is executed, and the device is calibrated. Control is transferred to step 1425. At step 1425, if this is the first time the user is using the pointing device 100, then at step 1426, the user registers himself or herself. An association is thus formed between the user and the particular pointing device, making use of the (optional) unique identifier chip in the pointing device 100. Control is transferred to step 1427. At step 1427, if a login is required, then step 1428 is executed. At step 1428, the user logs in, either by using the keyboard, or by a unique motion with the pointing device 100 (such as writing the user's signature). Control is transferred to step 1429.
At step 1429, the system checks whether there are user-specific coefficients available on the system for translating this specific user's style of motion, rotation and pressure from raw sensor data to idealized measurement data. If data are available, then step 1430 is executed. At step 1430, the default coefficients are replaced by the previously determined user-specific coefficients. Control is transferred to step 1431. At step 1431, the user login is translated to rotation and orientation coefficients. The system returns to the wait state 1419.
At step 1432, the user orients the device. At step 1433, a rotation calibration is initiated by asking the user to orient the device in one or more predetermined positions. At step 1434, a stream of data points is recorded. At step 1435, conforming X, Y and Z rotations are constructed. At step 1436, a determination is made whether sufficient data have been collected to conform the rotation measurements to the actual rotations. If not, then control is returned to step 1433. Otherwise, the system returns to the wait state 1419.
At step 1437, the user squeezes the pointing device 100. At step 1438, the driver module 904 updates the idealized pressure Pi, and translates the same to the normalized pressure P. At step 1439, the pressure is provided to the application, which processes the pressure. At step 1440, if the application is finished with processing the squeeze operation, then control returns to step 1419 (the wait state). If not, then step 1441 is executed, to determine whether driver module 904 detects a lull or pause in the squeezing. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1438 is executed again.
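A compact sketch of this squeeze loop follows; the lull test (pressure nearly unchanged between samples) is an assumed heuristic, and the callables are placeholders for the driver and application hooks.

```python
# Squeeze processing loop corresponding to steps 1437-1441.
def squeeze_loop(read_pi, normalize, apply_pressure, lull=0.01):
    prev = None
    while True:
        pi = read_pi()            # step 1438: read idealized pressure Pi
        p = normalize(pi)         # translate Pi -> normalized P
        done = apply_pressure(p)  # step 1439: application processes P
        if done:
            return                # step 1440: back to the wait state
        if prev is not None and abs(p - prev) < lull:
            return                # step 1441: lull detected -> wait state
        prev = p
```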
At step 1442, the user moves the pointing device 100. At step 1443, the driver module 904 updates the idealized motion vector Mxi, Myi, Mzi, and translates the same into the normalized motion vector Mx, My, Mz. At step 1444, the application program processes the normalized motion vector Mx, My, Mz. At step 1445, if the application is done with processing the movement of the pointing device 100, then control transfers to step 1419 (wait state). Otherwise, step 1446 is executed, to determine whether the driver detects a lull in the motion. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1443 is executed again.
At step 1447, the user releases the pointing device 100. At step 1448, the driver module 904 enters a sleep state. At step 1449, the application is notified that the pointing device has entered the sleep state.
Listed below is a set of pseudo-code that may be used to construct a software system suitable for handling input signals from the exemplary pointing device 100.
Definition of Terms
A, B, C, D, E, F, G sensor strip oriented along a sensing plane
Au, Bu,... unweighted portion of sensor
Aw, Bw,... weighted portion of sensor
Mxi, Myi, Mzi idealized (un-normalized) X, Y, Z values
Mx, My, Mz, Mp normalized X, Y, Z values used by application
Ox, Oy, Oz orientation vector to translate idealized to normalized X, Y, Z values
Pi idealized pressure value
P normalized pressure value
Sm translate sensor outputs to idealized motion
Sr translate sensor outputs to idealized rotational motion
Sp translate sensor outputs to idealized pressure
Tm translate idealized motion to application motion
Tp translate idealized pressure to application pressure
Tr translate idealized rotational motion to application rotational motion
Accelerometer ABC with pressure D
Sm: A maps to Mxi
    B maps to Myi
    C maps to Mzi
Sp: D maps to Pi
Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)
Tp: Pi maps to P

Origami sensor with ABC flaps, each flap having an un-weighted (u) and weighted (w) section
Sp: (Au, Bu, Cu) maps to (Pi)
Sm: (Aw, Bw, Cw, Pi) maps to (Mxi, Myi, Mzi)
Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)
Tp: Pi maps to P
Tetrahedral sensor ABCDEF with six strips

Sp: (ABCDEF) maps to Pi
Sm: (ABCDEF, Rxi, Ryi, Rzi) maps to Mxi
    (ABCDEF, Rxi, Ryi, Rzi) maps to Myi
    (ABCDEF, Rxi, Ryi, Rzi) maps to Mzi
Sr: (ABCDEF, Mxi, Myi, Mzi) maps to Rxi
    (ABCDEF, Mxi, Myi, Mzi) maps to Ryi
    (ABCDEF, Mxi, Myi, Mzi) maps to Rzi
Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)
Tr: (Rxi, Ryi, Rzi) rotationally maps to (Rx, Ry, Rz)
Tp: Pi maps to P
Calibration
    ask user to center device
    user moves device; wait for activity and then a lull or press
    ask user to trace shape on X-Y axis
    user moves device while software collects ABCD points
    translate ABCD data points to Ox, Oy sensor coefficients
    ask user to move in and out on Z axis
    user moves device in and out while software collects ABCD points
    translate ABCD points to Oz sensor coefficients
Registration
    after calibration, ask user to create login signature
    determine cultural orientation (left-right vs. right-left, etc.)
    accept user's input while software collects ABCD points
    translate input to Ox, Oy, Oz over time
Login
    ask user to reproduce login signature
    set orientation based upon cultural left-right vs. right-left context
    (replaces the manual orientation step of drawing a shape on the X-Y axis)
Orientation
    if manual orientation:
        ask user to move input device to center of area,
        with cord facing a known direction (the floor)
        and front face facing a known direction
        set Ox, Oy, Oz to (0, 0, 0)
    else:
        while waiting for input device to stop, or for the pressure
        state to signal a squeeze, save the last few Mxi, Myi, Mzi states
Although the exemplary application program described above is a virtual puppet program, there are many uses for a pointing device 100 according to the present invention. For example, the output of the pointing device may be mapped to the parameters of a visual synthesizer, such as the shifting of a color palette, the drawing or motion of an object. Alternatively, the output of device 100 may be mapped to parameters of a sound synthesizer, such as pitch, timbre, amplitude, or placement in three-dimensional sonic space.
Although the exemplary virtual puppet application is configured so that one pointing device 100 controls a single virtual puppet 710, the software can be configured to operate more than one virtual puppet (or other object) using a single input pointing device 100. For example, assuming that the pointing device includes sensors for measuring x, y, and z accelerations and pressure P, the first puppet 710 may be controlled by the x and y accelerations, and the second puppet 711 may be controlled by z and P. The first and second objects need not be identical objects, or even objects of the same type.
For example, the first object may be a virtual puppet, and the second object may be the background scenery or sound. By x and y movements, the puppet 710 is controlled, and by z and P, the user can vary the background from day to night, or from silence to the sound of thunder.
To use two input devices 100, software applications can use the Mouse API for selecting or navigating, or use a Driver API. For example, the driver for digitizer (or graphics) tablets may allow more than one pen to be used at the same time. Emulating a mouse can be accomplished by installing a driver; going beyond a mouse can be accomplished by modifying the application program. One can also overlay other input devices, such as a graphics tablet, with yet another driver.
Further, the output signals from the exemplary pointing device 100 may be mapped to those of conventional input devices in an emulation mode. For example, the outputs may be mapped to the x and y positions of a conventional mouse and the left or right click signals from depressing the mouse buttons. Mapping the x, y and z (or x, y and P) parameters into x and y parameters only may require the computer to determine the orientation of the pointing device 100, such as moving on a predominantly x-y plane, a y-z plane, or the z-x plane, or combinations of all three. Alternatively, individual user profiles may be used to determine the current orientation.
Although the exemplary use for two pointing devices is to control two different virtual puppets, a second pointing device may be used to expand the number of parameters for an application. For example, in an audiovisual application, one pointing device 100 may control various aspects of the video signal, while another pointing device may be used to control the audio signals.
Although the exemplary pointing device is sensitive to motion and pressure, additional sensors may also be included. For example, a microphone may be incorporated into the pointing device, for operation of voice activated software, or for storage of sounds.
Although the exemplary embodiment described above has the virtual puppet application program running locally in the computer 701, the application program may be executed on other, remotely located processors, either as a sole process or in parallel with other instances of the application program on other processors. The local computer can communicate with other computers via a telecommunication network, such as a LAN, a WAN, the Internet, or a wireless network.
The present invention may be embodied in the form of computer-implemented processes and apparatus for practicing those processes. The present invention may also be embodied in the form of computer program code embodied in tangible media, such as floppy diskettes, read only memories (ROMs), CD-ROMs, hard drives, "ZIP™" drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. The present invention may also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits.
Although the invention has been described in terms of exemplary embodiments, it is not limited thereto. Rather, the appended claims should be construed broadly, to include other variants and embodiments of the invention that may be made by those skilled in the art without departing from the scope and range of equivalents of the invention.

Claims

What is claimed is:
1. A hand-held computer input pointing device, comprising: at least one motion detector, the at least one motion detector being capable of detecting motion in at least three dimensions; and at least one pressure sensor capable of sensing pressure quantitatively, wherein the input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.
2. The pointing device of claim 1, wherein the at least one motion detector includes at least one of the group consisting of an accelerometer and a piezoelectric device.
3. The pointing device of claim 2, wherein the at least one motion detector includes at least three piezoelectric strips oriented in three orthogonal directions.
4. The pointing device of claim 3, wherein each of the three piezoelectric strips has a respective weight thereon to translate motion to a voltage.
5. The pointing device of claim 3, wherein the piezoelectric device is an origami sensor having cut coils, the origami sensor having either a cubic or tetrahedral shape.
6. The pointing device of claim 1, wherein the motion sensor includes a tetrahedral shaped device having six edge sensors.
7. The pointing device of claim 1, wherein the motion detector and pressure sensor are contained within a squeezable housing.
8. The pointing device of claim 7, wherein the housing is an elastomeric ball.
9. The pointing device of claim 1, wherein the motion sensor is capable of sensing pitch, yaw and roll.
10. The pointing device of claim 1, wherein the motion sensor is capable of sensing x- acceleration, y-acceleration and z-acceleration.
11. The pointing device of claim 1, further comprising a memory device that stores and outputs an identifier that is unique to the computer input pointing device.
12. A computer system comprising: a computer having a processor, a memory, an output device, and an input port; and a hand-held computer input pointing device coupled to the input port of the computer, the pointing device comprising: at least one motion detector, the at least one motion detector being capable of detecting motion in at least three dimensions; and at least one pressure sensor capable of sensing pressure quantitatively, wherein the input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.
13. The computer system of claim 12, wherein: the output device is a display; the computer further comprises a storage device, and one of the group consisting of the memory and the storage device has stored therein computer program code for causing the computer to show an object on the display, the object having a plurality of portions that are selectively movable in response to motion or squeezing of the pointing device.
14. The computer system of claim 13, wherein the object is an animated character controlled by input parameters.
15. The computer system of claim 14, further comprising a hub and at least a second computer input pointing device capable of sensing motion in at least three dimensions and a pressure sensor, the first and second computer input pointing devices being connected to the USB hub, wherein each of the first and second computer input pointing devices controls a respective animated character.
16. The computer system of claim 14, wherein the computer program code includes a range of motion tree for the animated character, the range of motion tree restricting a range of possible motion of the animated character about a current position of the animated character.
17. The computer system of claim 16, wherein the computer program code includes a respective unique range of motion tree for each of a plurality of users.
18. The computer system of claim 16, wherein the computer program code includes a module for using output signals from the computer input pointing device to emulate output signals from a mouse.
19. The pointing device of claim 4, wherein each of the three piezoelectric strips has at least two sections, including a weighted section and an unweighted section.
20. The pointing device of claim 5, wherein the cut coils on each face of the motion detector are attached to a single weight at a center of the motion detector.
21. The pointing device of claim 1, further comprising a multiplexer that receives output signals from the motion detector and the pressure sensor, for outputting a multiplexed signal for transmission to a computer, wherein the motion detector, pressure sensor and multiplexer are contained within a squeezable housing.
22. The pointing device of claim 7, wherein the squeezable housing has at least one tactile feature to provide a tactile cue for orienting the pointing device.
23. The pointing device of claim 7, wherein the squeezable housing has at least one visible feature to provide a visual cue for orienting the pointing device.
24. A method for operating a computer having an output device, comprising the steps of: receiving a plurality of signals from a computer input pointing device, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to the output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
25. The method of claim 24, wherein: the output device is a display; the step of outputting includes displaying an animated character on the display, and moving at least two portions of the animated character separately from each other.
26. A method for operating a computer input pointing device, comprising the steps of: moving the pointing device; transmitting at least three motion signals from the pointing device to a computer, the motion signals including signals representing translations in one or more dimension, or rotations in one or more dimensions, or a combination of translations and rotations; squeezing at least one portion of the pointing device; and transmitting quantifiable pressure signals from the pointing device to the computer.
27. The method of claim 26, further comprising the steps of: speaking into a microphone contained within the pointing device; and transmitting an audio signal from the microphone to the computer.
WO2005015377A2 (en) * 2003-07-24 2005-02-17 Adentis Method and system for gestural control of an apparatus
WO2005015377A3 (en) * 2003-07-24 2005-06-30 Adentis Method and system for gestural control of an apparatus
US8819596B2 (en) 2004-02-06 2014-08-26 Nokia Corporation Gesture control system
US9675878B2 (en) 2004-09-29 2017-06-13 Mq Gaming, Llc System and method for playing a virtual game by sensing physical movements
US9011248B2 (en) 2005-08-22 2015-04-21 Nintendo Co., Ltd. Game operating device
US9498728B2 (en) 2005-08-22 2016-11-22 Nintendo Co., Ltd. Game operating device
US10155170B2 (en) 2005-08-22 2018-12-18 Nintendo Co., Ltd. Game operating device with holding portion detachably holding an electronic device
US9700806B2 (en) 2005-08-22 2017-07-11 Nintendo Co., Ltd. Game operating device
US10238978B2 (en) 2005-08-22 2019-03-26 Nintendo Co., Ltd. Game operating device
US10661183B2 (en) 2005-08-22 2020-05-26 Nintendo Co., Ltd. Game operating device
US9498709B2 (en) 2005-08-24 2016-11-22 Nintendo Co., Ltd. Game controller and game system
US9227138B2 (en) 2005-08-24 2016-01-05 Nintendo Co., Ltd. Game controller and game system
US8834271B2 (en) 2005-08-24 2014-09-16 Nintendo Co., Ltd. Game controller and game system
US11027190B2 (en) 2005-08-24 2021-06-08 Nintendo Co., Ltd. Game controller and game system
US8870655B2 (en) 2005-08-24 2014-10-28 Nintendo Co., Ltd. Wireless game controllers
US10137365B2 (en) 2005-08-24 2018-11-27 Nintendo Co., Ltd. Game controller and game system
EP2292305A1 (en) * 2005-08-24 2011-03-09 Nintendo Co., Ltd. Video game controller and video game system
US9044671B2 (en) 2005-08-24 2015-06-02 Nintendo Co., Ltd. Game controller and game system
EP1757345A1 (en) * 2005-08-24 2007-02-28 Nintendo Co., Limited Video game controller and video game system
EP1759745A1 (en) * 2005-08-30 2007-03-07 Nintendo Co., Ltd. Videogame system and storage medium having videogame program stored thereon
USRE45905E1 (en) 2005-09-15 2016-03-01 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
WO2007030947A1 (en) * 2005-09-16 2007-03-22 Anthony Szturm Mapping motion sensors to standard input devices
WO2007077124A1 (en) * 2005-12-31 2007-07-12 Ball-It Oy User operable pointing device such as mouse
US8441434B2 (en) 2005-12-31 2013-05-14 Ball-It Oy User operable pointing device such as mouse
US8727879B2 (en) 2006-01-05 2014-05-20 Nintendo Co., Ltd. Video game with dual motion controllers
US9390229B1 (en) 2006-04-26 2016-07-12 Dp Technologies, Inc. Method and apparatus for a health phone
US8535155B2 (en) 2006-05-01 2013-09-17 Nintendo Co., Ltd. Video game using dual motion sensing controllers
EP1852163A3 (en) * 2006-05-01 2011-05-25 Nintendo Co., Ltd. Game program and game apparatus
EP2481453A1 (en) * 2006-05-01 2012-08-01 Nintendo Co., Ltd. Game program and game apparatus
US10065108B2 (en) 2006-05-01 2018-09-04 Nintendo Co., Ltd. Video game using dual motion sensing controllers
US9278280B2 (en) 2006-05-01 2016-03-08 Nintendo Co., Ltd. Video game using dual motion sensing controllers
US9733702B2 (en) 2006-05-01 2017-08-15 Nintendo Co., Ltd. Video game using dual motion sensing controllers
US9495015B1 (en) 2006-07-11 2016-11-15 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface to determine command availability
WO2008081188A3 (en) * 2007-01-05 2008-11-06 Ali Kord A display system
WO2008081188A2 (en) * 2007-01-05 2008-07-10 Ali Kord A display system
US10744390B1 (en) 2007-02-08 2020-08-18 Dp Technologies, Inc. Human activity monitoring device with activity identification
EP2118840A4 (en) * 2007-03-01 2010-11-10 Sony Comp Entertainment Us Interactive user controlled avatar animations
EP2118840A1 (en) * 2007-03-01 2009-11-18 Sony Computer Entertainment America, Inc. Interactive user controlled avatar animations
US9940161B1 (en) 2007-07-27 2018-04-10 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US10754683B1 (en) 2007-07-27 2020-08-25 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US9183044B2 (en) 2007-07-27 2015-11-10 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US7800044B1 (en) 2007-11-09 2010-09-21 Dp Technologies, Inc. High ambient motion environment detection eliminate accidental activation of a device
US9797920B2 (en) 2008-06-24 2017-10-24 Dp Technologies, Inc. Program setting adjustments based on activity identification
US11249104B2 (en) 2008-06-24 2022-02-15 Huawei Technologies Co., Ltd. Program setting adjustments based on activity identification
CN102131551A (en) * 2008-08-12 2011-07-20 皇家飞利浦电子股份有限公司 Motion detection system
KR101625360B1 (en) 2008-08-12 2016-05-30 코닌클리케 필립스 엔.브이. Motion detection system
WO2010018485A1 (en) * 2008-08-12 2010-02-18 Koninklijke Philips Electronics N.V. Motion detection system
RU2509587C2 (en) * 2008-08-12 2014-03-20 Конинклейке Филипс Электроникс Н.В. Motion detection system
US9358425B2 (en) 2008-08-12 2016-06-07 Koninklijke Philips N.V. Motion detection system
US20110140931A1 (en) * 2008-08-12 2011-06-16 Koninklijke Philips Electronics N.V. Motion detection system
JP2011530756A (en) * 2008-08-12 2011-12-22 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Motion detection system
US8300020B2 (en) 2008-08-15 2012-10-30 Apple Inc. Hybrid inertial and touch sensing input device
WO2010019240A1 (en) * 2008-08-15 2010-02-18 Apple Inc. Hybrid inertial and touch sensing input device
US8872646B2 (en) 2008-10-08 2014-10-28 Dp Technologies, Inc. Method and system for waking up a device due to motion
US10031594B2 (en) 2010-12-16 2018-07-24 International Business Machines Corporation Sphere-like input device
US10031593B2 (en) 2010-12-16 2018-07-24 International Business Machines Corporation Sphere-like input device
WO2012079948A1 (en) * 2010-12-16 2012-06-21 International Business Machines Corporation A human interface device with two three-axis-accelerometers
FR2972819A1 (en) * 2011-03-15 2012-09-21 France Telecom Device for capturing data representative of reaction of e.g. user to stress and/or tiredness state in candidate object, has data processing device, where data including given deformation of material is transmitted to processing device
EP2981876A4 (en) * 2013-04-02 2016-12-07 Nokia Technologies Oy An apparatus
WO2014162170A1 (en) 2013-04-02 2014-10-09 Nokia Corporation An apparatus
US10248205B2 (en) 2013-04-02 2019-04-02 Nokia Technologies Oy Apparatus for recording audio and vibration content of event
CN104142823A (en) * 2014-06-30 2014-11-12 腾讯科技(深圳)有限公司 Input device and control system
US10061400B2 (en) 2014-06-30 2018-08-28 Tencent Technology (Shenzhen) Company Limited Input device and control system
CN106595697A (en) * 2016-12-15 2017-04-26 宁夏农垦贺兰山奶业有限公司 Piezoelectric pedometer and step counting method
CN106595697B (en) * 2016-12-15 2023-07-11 宁夏农垦贺兰山奶业有限公司 Piezoelectric pedometer and step counting method
WO2019239376A1 (en) * 2018-06-13 2019-12-19 Koofy Innovation Limited Input surface system
FR3095957A1 (en) * 2019-05-15 2020-11-20 The Cube Company Electronic device for interactive control of a software application installed on a communication terminal
US20230338827A1 (en) * 2022-04-25 2023-10-26 Sony Interactive Entertainment Inc. Correlating gestures on deformable controller to computer simulation input signals

Also Published As

Publication number Publication date
EP1250698A4 (en) 2002-10-23
EP1250698A2 (en) 2002-10-23
AU4473000A (en) 2000-11-02

Similar Documents

Publication Publication Date Title
WO2000063874A1 (en) Human gestural input device with motion and pressure
WO2000063874A9 (en) Human gestural input device with motion and pressure
CN111356968A (en) Rendering virtual hand gestures based on detected hand input
US8295549B2 (en) Peripheral device having light emitting objects for interfacing with a computer gaming system claim of priority
US20200050342A1 (en) Pervasive 3D Graphical User Interface
KR100742029B1 (en) Hand-held computer interactive device
KR101576979B1 (en) Electric apparatus which determines user input using magnetic field sensor
JP4065035B2 (en) 3D cursor position setting device
Taylor et al. Graspables: grasp-recognition as a user interface
US20160328028A1 (en) System, method and device for foot-operated motion and movement control in virtual reality and simulated environments
JP6737996B2 (en) Handheld controller for computer, control system for computer and computer system
GB2247066A (en) Manual controller for computer graphic object display with six degrees of freedom
WO2002037466A1 (en) Electronic user worn interface device
Mulder Design of virtual three-dimensional instruments for sound control
CN109559720A (en) Electronic musical instrument and control method
US20040041828A1 (en) Adaptive non-contact computer user-interface system and method
WO2009008872A1 (en) Multi-dimensional input device with center push button
US11947399B2 (en) Determining tap locations on a handheld electronic device based on inertial measurements
US20230041294A1 (en) Augmented reality (ar) pen/hand tracking
Parker Buttons, simplicity, and natural interfaces
Schmidt et al. Sensor Virrig - a balance cushion as controller
Calella et al. HandMagic: Towards user interaction with inertial measuring units
Visi et al. Motion controllers, sound, and music in video games: state of the art and research perspectives
Hashimoto et al. A grasping device to sense hand gesture for expressive sound generation
Wang et al. The Pinch Sensor: An Input Device for In-Hand Manipulation with the Index Finger and Thumb

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
WWE WIPO information: entry into national phase

Ref document number: 2000926157

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWW WIPO information: withdrawn in national office

Ref document number: 2000926157

Country of ref document: EP

WWP WIPO information: published in national office

Ref document number: 2000926157

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP