EP1250698A4 - Human gestural input device with motion and pressure - Google Patents

Human gestural input device with motion and pressure

Info

Publication number
EP1250698A4
Authority
EP
European Patent Office
Prior art keywords
pointing device
motion
computer
sensor
pressure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00926157A
Other languages
English (en)
French (fr)
Other versions
EP1250698A2 (de)
Inventor
John Warren Stringer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of EP1250698A2 publication Critical patent/EP1250698A2/de
Publication of EP1250698A4 publication Critical patent/EP1250698A4/de
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10NELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10N30/00Piezoelectric or electrostrictive devices
    • H10N30/30Piezoelectric or electrostrictive devices with mechanical input and electrical output, e.g. functioning as generators or sensors
    • H10N30/302Sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0382Plural input, i.e. interface arrangements in which a plurality of input devices of the same type are in communication with a PC

Definitions

  • the present invention relates to the field of input devices for computers.
  • Computer input pointing devices are well known.
  • a typical computer input pointing device contains at least two sensors for sensing motion in at least two directions, such as x and y (forward-backward and left-right).
  • the pointing device also has at least one actuator for causing the pointing device to transmit a command signal (typically referred to as a "click") to a computer.
  • the most common pointing device for a desktop computer or workstation is a mouse.
  • the mouse may have a ball on its underside. Motion of the ball causes rotation of one or more of a plurality of rotation sensors adjacent to the ball. The number of rotations determines the magnitude of the motion in the x or y direction.
  • the most common mouse type includes two buttons for sending commands known as "left-click" and "right-click."
  • Other types of mice include optical sensors that count markings on a specially marked mouse pad to determine an amount of movement.
  • the joystick has a lever that is tilted in a forward-backward or left-right direction, each of which is sensed independently.
  • a button is typically provided on the end of the lever for transmitting a command signal.
  • Some joysticks also allow rotation of the lever, which is also sensed.
  • Alternative pointing devices are typically used on laptop computers.
  • An early pointing device used on laptop computers is the track ball.
  • the track ball functions like an upside-down mouse. Instead of moving the device, the ball is rotated directly.
  • Many laptop computers include a miniature joystick that is positioned on the home row of the keyboard.
  • Another common pointing device in laptop computers is a touch pad.
  • the touch pad is a rectangular device that is sensitive to touch. Left and right sliding motion of a finger on the touch pad is detected. The touch pad also senses when it is struck, and produces a "click" signal.
  • While conventional pointing devices are suitable for locating a point on an x-y grid and transmitting a simple single-valued command signal, they leave much to be desired for controlling application programs that require more complex inputs.
  • One aspect of the present invention is a hand-held computer input pointing device, comprising at least one motion detector.
  • the at least one motion detector is capable of detecting motion in at least three dimensions.
  • At least one pressure sensor is capable of sensing pressure quantitatively.
  • the input device is operable within a hand of a user to transmit signals from the motion detector and the pressure sensor, without contacting a base, work surface or pad.
  • Another aspect of the invention is a method for operating a computer having an output device, comprising the steps of: receiving a plurality of signals from a computer input pointing device, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to the output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
  • Still another aspect of the invention is a method for operating a computer input pointing device, comprising the steps of: moving the pointing device; transmitting at least three motion signals from the pointing device to a computer, the motion signals including signals representing translations in one or more dimension, or rotations in one or more dimensions, or a combination of translations and rotations; squeezing at least one portion of the pointing device; and transmitting quantifiable pressure signals from the pointing device to the computer.
  • FIG. 1 is an isometric view of an exemplary squeezable pointing device according to the present invention.
  • FIG. 2 is an exploded view of the pointing device of FIG. 1.
  • FIG. 3A is an enlarged view of the origami sensor of FIG. 2.
  • FIG. 3B is a plan view of the sensor of FIG. 3A, prior to folding.
  • FIG. 4A is a plan view of a tetrahedral origami sensor in a flattened state.
  • FIG. 4B is an isometric view of the sensor of FIG. 4A, folded.
  • FIG. 5A is a plan view of an alternative cubic origami sensor in a flattened state.
  • FIG. 5B is an isometric view of the sensor of FIG. 5A, folded.
  • FIG. 6 is an isometric view of a tetrahedral edge sensor.
  • FIG. 7 is an elevation view of a computer system including the pointing device of FIG. 1.
  • FIG. 8 is a diagram of a range of motion tree for an application program executed on the computer shown in FIG. 7.
  • FIG. 9 shows the software layers for the application executed on the computer shown in FIG. 7.
  • FIG. 10 is a flow chart of a method for operating the virtual puppet of FIG. 7.
  • FIGS. 11-13 are block diagrams of the software layer structure shown in FIG. 9.
  • FIG. 14 is a flow chart of a method for operating the pointing device of FIG. 2.
  • FIG. 1 shows an exemplary computer input pointing device 100 according to the present invention.
  • Pointing device 100 is capable of detecting motion and providing a quantifiable measure of squeeze pressure.
  • the input device 100 is operable in a hand of a user, without contacting a base, work surface, or pad. Thus, the input device 100 is "free floating," in that it may be operated in any position without regard to the presence of a base, work surface, or pad.
  • Pointing device 100 has a squeezable, deformable elastomeric cover 102. Inside the cover 102, the pointing device 100 has an internal motion detector that may measure acceleration along the x, y and z axes, or measure pitch, yaw and roll. Squeezing the package increases pressure that translates into a pressure signal that is sent by the pointing device 100 to the computer (shown in FIG. 7) along with the motion data.
  • FIG. 2 is an exploded view of the computer input pointing device 100, with the squeezable elastomeric covering 102 partially removed.
  • the squeezable covering 102 may be formed from natural or synthetic rubber or a variety of polymeric materials, which may be cut or injection molded.
  • Cover 102 has an internal cavity 104 for containing sensors.
  • An optional sensor enclosure 106 may be provided inside of cover 102.
  • the sensor enclosure 106 may be sealed to protect the sensors from dust and moisture, allowing use of an unsealed cover 102.
  • the deformable cover 102, with or without the sensor enclosure 106, allows the sensors to move freely and compresses the internal sensors when squeezed.
  • the squeezable cover is made from a "SQEESH™" ball by the Toysmith Company of Kent, WA, USA.
  • "SQEESHTM” balls are available in a variety of shapes and surface types, including smooth spherical, spherical with craters, and star shaped. A variety of shapes may be used to provide users with the grip of their choice or hints as to what object that the device controls.
  • Pointing device 100 has at least one motion detector 108, which is capable of detecting motion in at least three dimensions.
  • Pointing device 100 also includes at least one pressure sensor capable of sensing pressure quantitatively.
  • the pressure sensing capability is also provided by the sensor 108, which is described in greater detail below.
  • the at least one motion detector may, for example, include an accelerometer or a piezoelectric device.
  • the exemplary piezoelectric device 108 includes at least three piezoelectric strips 301-303 or flaps oriented in three orthogonal directions. More specifically, the exemplary device 108 is a single film sensor having three strips 301-303 folded origami style, as shown in FIG. 3A, to form a piezo thin film with six sensors.
  • FIG. 3B is a view of the device 108 with six sensors 314, 316.
  • the device is formed on a substrate 300 and is shown laid flat before the origami folds along dashed lines 312.
  • Optional weight W1-W3 may be attached to the end or center of each flap 301-303, respectively.
  • each of the three piezoelectric strips 301-303 has at least two sections, including a weighted section 314 and an unweighted section 316, so as to discriminate motion load from pressure load.
  • the piezo film substrate 300 is plastic and allows two strips of piezo material to measure bend of the substrate by the relative tension between two strips.
  • One piezo sensor 316 is provided for pressure discrimination. The pressure on the deformable covering 102 bends the material of the piezo pressure sensor 316, producing a measurable voltage.
  • Another piezo sensor 314 for motion discrimination has a weight W1-W3 placed at the middle or end of the sensor. Because each of the three piezoelectric strips 314 has a respective weight, moving the pointing device 100 induces an inertial force. The weight W1-W3 moves against the material of the motion sensor 314, causing it to bend at a different rate than the pressure sensor 316.
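As an illustration of the weighted/unweighted discrimination described above, the following Python sketch (not from the patent; the gain factor and sample values are assumptions) separates a squeeze estimate from a motion estimate for each flap:

```python
# Minimal sketch: a squeeze bends both sections of a flap similarly,
# while inertial force bends the weighted section much more, so the
# difference isolates motion from pressure.

def discriminate(weighted_v, unweighted_v, gain=1.0):
    """Return (pressure, motion) estimates for one flap.

    weighted_v / unweighted_v: voltages from the weighted (314) and
    unweighted (316) sections. `gain` is a hypothetical calibration
    factor relating the two sections under pure pressure load.
    """
    pressure = unweighted_v                    # pressure bends both sections
    motion = weighted_v - gain * unweighted_v  # excess bend of weighted section
    return pressure, motion

# One sample per axis flap (flaps 301-303 oriented along x, y, z):
samples = {"x": (0.82, 0.10), "y": (0.15, 0.12), "z": (0.11, 0.09)}
for axis, (w, u) in samples.items():
    p, m = discriminate(w, u)
    print(axis, round(p, 2), round(m, 2))
```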
  • Cuts 318 on the exemplary piezo film substrate 300 allow each sensor flap 314, 316 to bend separately.
  • the dotted lines 312 show folds where the flat sensor strip 300 is origami folded in 90 degree bends.
  • the folded origami flaps 301-303 are affixed to a support assembly 320 (shown in FIG. 3A).
  • the support assembly 320 holds the origami flaps 301-303 in place.
  • the leads L from the thin film sensor are connected to optional internal multiplexing circuitry 110 that connects the sensor circuitry to an external lead 112.
  • the sensors output data representing four parameters (three motion and one pressure).
  • a multiplexer 110 receives output signals from the motion detector and the pressure sensor 108, and outputs a multiplexed signal for transmission to a computer (shown in FIG. 7).
  • the output of the multiplexer is connected to a lead 112.
  • the lead 112 extends through a channel 103 in the elastomeric covering 102.
  • the opposite end of the lead 112 has a suitable connector 114 attached to it, which may be a 0.32-centimeter (1/8 inch) miniplug, a Universal Serial Bus (USB) connector, a serial (RS-232) port, an "ETHERNET™" port, a MIDI port for connection to a musical instrument, a PDA port, a cellular phone port or other suitable connector.
  • a hub capable of accepting inputs from multiple input devices may be provided to interface the plurality of pointing devices 100 to a single computer.
  • a USB hub 116 (connected to lead 118 and USB connector 120) may be used to connect the pointing devices to a single computer.
  • the pointing device may omit the multiplexer, and output four analog signals from four respective output terminals.
  • the pointing device 100 may have a wireless transmitter for communicating with the computer.
  • the lead 112 from the sensing package either carries one data line (with time- or frequency-multiplexed data) or multiple data lines, each carrying a respective sensor's output.
  • the connector (such as a 1/8" miniplug) 114 optionally connects directly to the computer (FIG. 7) through a microphone input, eliminating the need for a hub.
  • the pointing device 100 may use the audio software driver or communicate through the optional hub 116 for controlling multiple puppets (as described below).
  • the connector 114 may connect to a different computer port such as a USB or serial port, eliminating the need for a hub.
  • Although FIG. 2 shows a two-input version, the system may include any number of inputs.
  • any of the sensor types described herein may be used with clock shifting to the next lead.
  • An additional clock cycle may be provided for subsequent processing of an optional unique identification number (described below).
  • the sensors provide an analog signal that may be modulated with a single carrier frequency, or the signal from each lead may be modulated with a respectively different carrier frequency.
  • An optional analog to digital converter may be provided for each of the analog sensor leads.
  • ADC analog to digital converter
  • a clock shift to the next lead sends a digital packet; for example, with eight bits per lead, six sensors yield six bytes per sample cycle.
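A minimal sketch of this framing, assuming a plain byte-per-lead layout (the patent does not fix a wire format):

```python
# Each of the six sensor leads is digitized to eight bits; one clock
# shift per lead assembles a six-byte packet per sample cycle.

def pack_sample(readings):
    """Pack six 8-bit ADC readings (0-255) into a 6-byte packet."""
    assert len(readings) == 6
    return bytes(readings)

def unpack_sample(packet):
    """Recover the six per-lead readings from a 6-byte packet."""
    return list(packet)

packet = pack_sample([12, 200, 33, 5, 255, 90])
print(packet.hex(), unpack_sample(packet))
```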
  • the ID circuit may be a register or a read only memory (ROM) device, EEPROM, or the like.
  • the unique identifier may be hard encoded at the factory (e.g., using a ROM), or the identifier may be downloaded from a computer into a flash EEPROM in the pointing device 100. As explained below with reference to FIG. 7, by providing a unique identifier for each pointing device unit 100, more than one unit may be used at the same time.
  • the device identifier can function like an encryption key, to protect the user's preferences or to securely identify the user.
  • the output of the ID circuit and the output from one of the sensors are output to the multiplexing circuitry 110.
  • the analog input may be time-domain multiplexed with the ID waveform.
  • a digital input packet may be combined with ID information to form a larger packet including both.
  • the pointing device 100 may contain additional identifier circuitry to add a port identifier number separate from the pointing device identifier number.
  • the analog waveform may have a separate carrier frequency or may be time domain multiplexed or interleaved with each sample, either on a bit, byte, or packet basis.
  • the sensor data and ID information may be added to a multiplexing packet to form a larger packet including both types of data.
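The following is an illustrative way to combine the two kinds of data, assuming a hypothetical 32-bit device identifier prepended to each sample packet; the patent leaves the exact packet layout open:

```python
# Illustrative framing only: sensor data and the unique device
# identifier are combined into one larger packet.
import struct

DEVICE_ID = 0x0042ACE1  # hypothetical factory-encoded identifier

def frame(sample_packet, device_id=DEVICE_ID):
    """Prepend a big-endian 32-bit device ID to a sample packet."""
    return struct.pack(">I", device_id) + sample_packet

def deframe(packet):
    """Split a framed packet back into (device_id, sample bytes)."""
    (device_id,) = struct.unpack(">I", packet[:4])
    return device_id, packet[4:]

framed = frame(bytes([12, 200, 33, 5, 255, 90]))
print(deframe(framed))
```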
  • the pointing device 100 may obtain power from the computer (FIG. 7), from an internal battery, or by drawing power directly from the energy created by the sensor from the piezoelectric effect.
  • the piezoelectric device may be an origami sensor having cut coils, and may have either a cubic or tetrahedral shape.
  • the cut coils on each face of the motion detector are attached to a single weight at a center of the motion detector.
  • each cut coil on each face of the motion detector has its own weight attached.
  • FIGS. 5A and 5B show a sensor system 500 including origami cutouts, which may be used in place of the sensor 108 shown in FIG. 2.
  • a thin sensor film 500 having six faces 501-506 is folded to form a cube (although a tetrahedron, shown in FIGS. 4A and 4B, or other three-dimensional shapes may be used in alternative embodiments).
  • a spiral pattern 511-516 is cut into each face of thin sensor film 501-506, respectively.
  • Each coil can be pressed towards the center (e.g., C1, C3 and C6) and attached to a weight that is suspended at the center of the folded shape.
  • each face 501-506 has a corresponding opposite face measuring force with an inverse effect on voltage.
  • each side faces three other sides, which measure an inverse effect on their faces.
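A small sketch of how opposing-face readings might be combined, assuming the face pairing shown in the comments (the patent does not specify which faces oppose which):

```python
# Sketch (an assumption, not the patent's circuit): with opposite cube
# faces producing inverse voltage responses, a differential reading per
# axis cancels common-mode squeeze and yields a signed per-axis force.

def axis_forces(face_v):
    """face_v: dict of voltages for faces 501-506; opposite pairs
    (501,502)=x, (503,504)=y, (505,506)=z (pairing is illustrative)."""
    return {
        "x": (face_v[501] - face_v[502]) / 2.0,
        "y": (face_v[503] - face_v[504]) / 2.0,
        "z": (face_v[505] - face_v[506]) / 2.0,
    }

print(axis_forces({501: 0.9, 502: 0.1, 503: 0.4,
                   504: 0.4, 505: 0.2, 506: 0.6}))
```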
  • Additional circuitry may be included in the exemplary sensor configuration.
  • An analog sample and hold is provided for each of the analog sensor leads.
  • the leads are:
  • FIG. 6 shows a tetrahedral edge sensor 600.
  • Sensor 600 has six edges to detect pressure directly from a squeeze of the deformable elastomeric material of cover 102.
  • sensor 600 may have weighted edges 602 to detect motion through inertial force whenever the pointing device 100 is moved.
  • the weights 610 increase the momentum of the sensors 602, so that the deformation of the sensors is exaggerated.
  • weights may be placed at each node 608.
  • Sensor 600 is used in conjunction with an application driver that can discriminate pressure from motion.
  • the driver can discriminate pitch, yaw, and roll.
  • a plurality of signal leads 604 are provided, one for each edge of the tetrahedron 600.
  • the outputs 604 of the sensor either connect to multiplexing/conversion circuitry 110 or lead out to separate analog lines.
  • each connecting node 608 broadcasts a wireless signal of its state without requiring any lead wire.
  • non-sensing leads connect each node 608 to another edge 602.
  • the non-sensing leads 606 either follow a sensing edge 602 to another node 608 or may run directly to another node 608.
  • the connecting nodes 608 for leads 606 and sensing edges 602 may simply be vertices for an origami tetrahedron or connecting nodes for self supporting edge sensors.
  • a sensor similar to that used in the "GYROPOINT™" mouse may be used.
  • a sensor such as the "ACH-04-08-05" accelerometer/shock sensor from Measurement Specialties, Inc. of Valley Forge, PA may be used.
  • the ACH-04-08-05 sensor has three piezoelectric sensing elements oriented to measure acceleration in x, y and z linear axes, and uses very low power.
  • an "ACH-04-08-01" sensor by Measurement Specialties, Inc. may be used to measure linear acceleration in the x and y directions and rotation about the z axis.
  • pointing device 100 also includes a pressure sensing capability.
  • An internal pressure sensor is included, which may be, for example, a thin film piezo sensor that detects a bending of a strip, or a sensor such as the "MTC Express™" from Tactex Controls Inc. of Cerritos, CA.
  • This control surface is pressure sensitive, and also senses multiple points of contact, using a fiber optic-based, pressure sensitive, Smart Fabric called "KINOTEX™".
  • This fabric is constructed of cellular urethane or silicone sandwiched between protective membranes. The fabric has an embedded fiber optic array that generates an optical signal when the material is touched, or when force is applied. The coordinates and area of the pressure, and its magnitude can be determined from the received signals.
  • the material can generate a pressure profile of the shape of a hand touching it.
  • the pointing device 100 may include a circuit that outputs a unique identifier for each pointing device.
  • This feature allows the pointing device 100 to serve a user identification and/or authentication function.
  • a user could carry around his or her pointing device with a unique identifier output signal, and connect it to any desired computer having an Internet connection.
  • the application program could request a remote server to download previously established software preferences or upload the user's current software preferences. This enables a person to log on at any computer connected to the Internet, access his or her software (e.g., virtual puppet and stage application), or download web bookmarks.
  • the user could use the pointing device 100 as a key to access copy protected software, functioning like a parallel port dongle.
  • the use of a pointing device 100 with a built-in unique identifier enhances the user's mobility and access to services.
  • the pointing device may be used for authentication at any computer to which it is connected.
  • Because the unit uniquely identifies itself, it can be used in a way that allows it to bind to an individual user.
  • the unique identifier may be used in combination with a password, IP address or a user signature.
  • the user may sign his or her name using the pointing device to control movement of a virtual writing implement.
  • the user's handwriting may be recognized using a conventional handwriting recognition algorithm, such as any of those described in U.S. Patents 5,768,423; 5,757,959; 5,710,916; 5,649,023; or 5,553,284, all of which are expressly incorporated by reference herein in their entireties.
  • the combination of the unique device identifier and the user signature provides reliable authentication.
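One hedged way such a binding could be implemented (purely illustrative; the patent does not prescribe a scheme) is a salted hash over the device identifier and signature features:

```python
# Sketch: the server stores a salted hash over (device_id, signature
# features), and authentication recomputes and compares it in constant
# time. All names and the scheme itself are assumptions.
import hashlib, hmac

def bind(device_id: str, signature_features: bytes, salt: bytes) -> bytes:
    """Derive a stored credential from device ID + signature template."""
    return hashlib.sha256(salt + device_id.encode() + signature_features).digest()

def authenticate(stored: bytes, device_id: str, features: bytes, salt: bytes) -> bool:
    """Recompute the credential and compare without timing leaks."""
    return hmac.compare_digest(stored, bind(device_id, features, salt))

salt = b"per-user-salt"
token = bind("0042ACE1", b"sig-template", salt)
print(authenticate(token, "0042ACE1", b"sig-template", salt))  # True
```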
  • the unique identifier may be used to make the user's preferences more portable.
  • the identifier may be associated with the design (i.e., shape, color, and the like) of the cover 102.
  • the identifier for the pointing device may also be associated with different shapes and colors of cursors.
  • the cursor that appears on the screen may have the same shape or color as the cover 102 of the input device 100, which is associated with a known identifier.
  • a blue spherical pointing device may be associated with a round, blue cursor
  • a puppet shaped pointing device may be associated with a puppet shaped cursor
  • an alien-shaped pointing device may be associated with an alien-shaped cursor, and so on.
  • the input device packaging may be completely different from the shape and color of the computer cursor, or puppet object.
  • If the puppet application program is stored in a server that is accessible via the Internet, or other local area network (LAN), wide area network (WAN), or wireless services such as CDMA, the user may select from any of a plurality of different puppets from a gallery. This is one of the user preferences that may follow her around from client computer to client computer.
  • FIG. 7 shows a computer system 700 according to another aspect of the invention.
  • the computer system 700 includes a computer 701 having a processor 702, a memory 706, a display 708, and an input port 712.
  • At least one computer input pointing device 100 is coupled to the input port 712 of the computer 701.
  • the pointing device 100 has at least one motion detector 108, capable of detecting motion in at least three dimensions, and at least one pressure sensor capable of sensing pressure quantitatively.
  • a multiplexer 110 receives output signals from the motion detector 108 and the pressure sensor, and outputs a signal for transmission to the computer 701. Alternatively, the lead from each sensor may output an analog signal.
  • the computer 701 further comprises a storage device 704, which may be a hard disk drive, a high capacity removable disk drive (e.g., "ZIP” drive by Iomega Corporation), a read only memory (ROM), CD-ROM drive, a digital versatile disk (DVD) ROM drive, or the like.
  • Either the memory 706 or the storage device 704 has stored therein computer program code for causing the computer to show an object 710 on the display 708.
  • the object has a plurality of portions 710a-710d that are selectively movable in response to motion or squeezing of the pointing device 100.
  • the exemplary object is an animated character 710 controlled by input parameters from the pointing device. Such an animated character is referred to below as a virtual puppet.
  • Another aspect is a method for controlling a computer. The method includes the steps of: receiving a plurality of signals from a computer input pointing device 100, the signals representing quantifiable pressure and motion in at least three dimensions; and outputting a video, audio or tactile output signal to the output device, the output signal having at least two characteristics that are capable of being varied separately from each other.
  • FIG. 10 shows an exemplary method for operating the computer 701. The method includes the steps of:
  • the animated character 710, figure or other object is displayed on the display 708.
  • the driver module 904 updates the idealized pressure value Pi and translates Pi into the normalized pressure value P.
  • the application program processes the normalized pressure P into an animated figure action.
  • the application updates the animated figure according to the action, and displays the result on the display 708.
  • steps 1070-1080 are executed.
  • the driver module 904 of the software updates the idealized motion vector Mxi, Myi, Mzi and translates the vector into a normalized motion vector Mx, My, Mz.
  • the application program processes the motion vector Mx, My, Mz into animated figure motion, and outputs the moving figure to the display 708.
  • An exemplary method for operating the pointing device 100 comprises the steps of:
  • the motion signals including signals representing translations in one or more dimension, or rotations in one or more dimensions, or a combination of translations and rotations;
  • all parameters measured by the sensors 108 in the pointing device 100 are mapped to motions of the virtual puppet 710, such as mouth moving open and closed, the motions of limbs, or the expression of emotional intensity.
  • the pressure measurement is particularly suitable for use in providing quantitative inputs to the computer to create results having continuously variable intensity.
  • a light squeeze may result in a small kick, whereas a hard squeeze results in a high kick.
  • FIG. 7 includes a hub 116 and at least a second computer input pointing device 100 capable of sensing motion in at least three dimensions and a pressure sensor. Both the first and second computer input pointing devices 100 are connected to the hub 116, which in turn is connected to the input port of the computer 701. Each of the first and second computer input pointing devices 100 controls a respective virtual puppet 710 and 711.
  • the computer program code includes a range of motion tree 800 for the virtual puppet 710.
  • the range of motion tree 800 restricts a range of possible motion of the virtual puppet 710 about a current position of the virtual puppet.
  • the computer program code includes a respective unique range of motion tree for each of a plurality of users.
  • the exemplary range of motion tree 800 covers head, left and right shoulder, left and right elbow, left and right wrist, and left and right grip.
  • the tree limits the range of motion from each node.
  • the nodes may be delimited as follows:
  • the limits may be expressed in rectangular coordinates, such as (x0,y0,z0), (x1,y1,z1). Alternatively, the limits may be expressed as delta ranges. For some objects, it may be more convenient to express the limits in polar coordinates, which have a 1:1 mapping to rectangular coordinates.
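A minimal sketch of per-node limits in rectangular coordinates, with hypothetical bounds for two nodes of tree 800:

```python
# Each node of the range-of-motion tree stores (x0, y0, z0)-(x1, y1, z1)
# bounds; a proposed motion is clamped to the node's legal box.

LIMITS = {  # hypothetical bounds, not values from the patent
    "head":       ((-5, -5, -2), (5, 5, 2)),
    "left_wrist": ((-10, -8, -8), (10, 8, 8)),
}

def clamp_motion(node, pos):
    """Clamp a proposed (x, y, z) position to the node's allowed range."""
    (x0, y0, z0), (x1, y1, z1) = LIMITS[node]
    x, y, z = pos
    return (min(max(x, x0), x1), min(max(y, y0), y1), min(max(z, z0), z1))

print(clamp_motion("head", (7.0, -1.0, 0.5)))  # -> (5, -1.0, 0.5)
```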
  • the range of motion tree 800 can express a hierarchy of motions.
  • State information restricts the range of possible motion about the last position of the puppet.
  • the state information is mapped to a user model, and further restricts possible ranges of motion. For example,
  • a user model adds probabilities for motion at each node of the tree. For example, the user model may identify whether the user moves her shoulders or shakes her wrists. In an exemplary user model for "User n”, the probabilities may be expressed as:
  • the probabilities of motion may be mapped to a voxel space.
  • the probabilities could be precompiled with user probabilities, where three-dimensional motion space is broken up into small cubes, analogous to three-dimensional pixels.
  • Each voxel point can be ascribed a probability, and each voxel can contain a range of motion probability.
  • Voxels may also include application restrictions on the range of motion probability.
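An illustrative voxel lookup, with assumed resolution and stand-in probabilities:

```python
# Sketch of the voxel idea: motion space is divided into small cubes,
# and each voxel stores a probability combining the user model with
# application restrictions. Grid size and contents are placeholders.
import numpy as np

VOXEL_SIZE = 0.5                                      # hypothetical edge length
GRID = np.random.default_rng(0).random((20, 20, 20))  # stand-in probabilities

def voxel_probability(pos, origin=(-5.0, -5.0, -5.0)):
    """Map a 3D point to its voxel and return the stored probability."""
    idx = tuple(int((p - o) / VOXEL_SIZE) for p, o in zip(pos, origin))
    if all(0 <= i < n for i, n in zip(idx, GRID.shape)):
        return GRID[idx]
    return 0.0  # outside the modeled motion space

print(voxel_probability((1.2, -0.3, 2.0)))
```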
  • Another task for the software is identifying the orientation of the pointing device, to allow use of the orientation as a parameter.
  • conformation of curves is used as a process of matching the signal curve over time from the three-dimensional sensor space to three-dimensional application space. This is analogous to a three-dimensional jigsaw puzzle or Chinese wood block puzzle, in which a piece may be rotated in any of three dimensions.
  • In voxel space, it is possible to constrain the orientation of the sensor motion curve to only the possible ranges from the "range of motion tree" 800 and from the application space. It is further possible to find a most likely orientation of the sensor motion curve based on probabilities of each motion, represented as likelihood coefficients for each voxel point.
  • Orientation can be determined by software.
  • the exemplary software uses conformation mapping of curves of X, Y, Z (and optionally P) to the range of motions of either the left or right hand. Mapping could emanate from an explicit model or from trainable classifiers, such as neural nets.
  • An application may request the user to orient the pointing device 100 by holding the device in his or her left or right hand and selecting items on the display 708.
  • the user either controls an object on the display 708, or navigates in space, where the user acts as the object, with a point of view that shifts.
  • the range of object motion may be less than the range of motion of the pointing device 100.
  • the software maps the actual gesture of the input device to the best fitting object motion.
  • a simple example is controlling a cartoon character that merely has a range of motion of stage-left and stage-right.
  • the range of motion of the puppet is constrained to two dimensions.
  • Controlled objects on the display 708 generally have a limited destination range.
  • the application can show a picture of the pointing device 100 with the lead 112 pointing down to the floor.
  • One hemisphere of the squeezable, deformable cover may have special markings and faces away from the display 708. This technique eliminates the need for conformation of curves, since the orientation is 1:1, puppet to application.
  • FIG. 9 is a block diagram showing the virtual puppet modules and their connections to each other and the underlying software layer.
  • the user module 900 contains the states of:
  • a sensor module 902 has processes A, B, C, D corresponding to each sensor output, whether analog or digital.
  • a driver module 904 translates sensor values to application values.
  • the driver module 904 may use a trainable classifier, such as a neural network, where each sensor output has correlating inputs to the driver's idealized X, Y, Z, P.
  • the driver module 904 correlates user states to range of sensor input. The states, “grasp,” “orient,” “squeeze,” “move,” and “release” each have a characteristic effect on each of the sensors.
  • “grasp” may have a pressure curve, as the user grabs and picks up the puppet.
  • “squeeze” may have a longer continuous pressure curve when compared to grasp.
  • Each instance of state may have separate set of coefficients for a trainable classifier.
  • the software correlates the application state to the domain of sensor output, to reduce the set of possible outcomes. For instance, a two-dimensional stage for cartoon characters may be limited to "stage left” and “stage right”.
  • the program updates the user module state from the application state and application position.
  • the application module 906 accepts X, Y, Z, and P data from the driver module 904, and reports the application X, Y, Z, P state to the driver module as state information.
  • the application module 906 reports user state information to pass from the driver module 904 to the user module 900.
  • a mouse module 908 is a special program for emulating a mouse.
  • the mouse module 908 uses a special application module for the report state.
  • Mouse module 908 translates X, Y movement of the pointing device 100 to mouse X, Y motion, and translates Z and P parameters of the pointing device 100 to mouse Left, Right, and double click.
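A hedged sketch of this mapping; the thresholds and the double-click window are assumptions, not values from the patent:

```python
# Mouse-emulation sketch for module 908: x/y motion becomes cursor
# motion, while the z axis and squeeze pressure are thresholded into
# click events.

P_CLICK, Z_CLICK, DOUBLE_GAP = 0.6, 0.5, 0.3  # hypothetical thresholds (s)

def emulate(mx, my, mz, p, last_click_t, now):
    """Translate one device sample into a list of mouse events."""
    events = [("move", mx, my)]
    if p > P_CLICK:                        # squeeze -> left click
        kind = "double_click" if now - last_click_t < DOUBLE_GAP else "left_click"
        events.append((kind,))
        last_click_t = now
    elif mz > Z_CLICK:                     # sharp z motion -> right click
        events.append(("right_click",))
    return events, last_click_t

print(emulate(0.2, -0.1, 0.0, 0.8, 0.0, 1.0))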
  • FIGS. 11-13 show the transformations that are performed by various exemplary software modules, depending on the specific types of sensors 902 included in the pointing device 100.
  • FIG. 11 covers the example described above, in which the motion detector measures x, y, and z accelerations, and pressure is also measured.
  • the sensors 902 include motion sensors 1100 and pressure sensor 1110.
  • a program Sm 1120 translates the raw output of sensor 1100 to an idealized motion vector, Mxi, Myi, Mzi, 1140 that is provided to the driver 904.
  • a second translator Tm 1160 translates the idealized motion to an application motion vector Mx, My, Mz 1180, which is provided to the application 906.
  • a program Sp 1130 translates the raw output of sensor 1110 to an idealized pressure Pi, 1150 that is provided to the driver 904.
  • a second translator Tp 1170 translates the idealized pressure Pi to an application pressure P 1190, which is provided to the application 906.
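The FIG. 11 pipeline can be summarized in a few lines; the scale and gain factors below are illustrative placeholders for the calibration data the driver would actually hold:

```python
# Minimal sketch of the FIG. 11 pipeline: Sm/Sp turn raw sensor output
# into idealized values for the driver 904, then Tm/Tp turn idealized
# values into application values for the application 906.

def Sm(raw_xyz, scale=0.01):   # raw counts -> idealized motion Mxi, Myi, Mzi
    return tuple(scale * v for v in raw_xyz)

def Sp(raw_p, scale=0.004):    # raw counts -> idealized pressure Pi
    return scale * raw_p

def Tm(mi, user_gain=1.5):     # idealized -> application motion Mx, My, Mz
    return tuple(user_gain * v for v in mi)

def Tp(pi):                    # idealized -> normalized application pressure P
    return max(0.0, min(1.0, pi))

raw_motion, raw_pressure = (120, -40, 8), 180
print(Tm(Sm(raw_motion)), Tp(Sp(raw_pressure)))
```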
  • FIG. 12 covers a second example, in which an origami detector measures x, y, and z accelerations, and pressure.
  • the sensors 902' include unweighted sensors 1200 and weighted sensors 1210.
  • a program Sm 1220 discriminates between unweighted and weighted sensors to produce an idealized motion vector, Mxi, Myi, Mzi, 1240 that is provided to the driver 904'.
  • a second translator Tm 1260 translates the idealized motion to an application motion vector Mx, My, Mz 1280, which is provided to the application 906'.
  • a program Sp 1230 discriminates between unweighted and weighted sensors to produce an idealized pressure Pi, 1250 that is provided to the driver 904'.
  • a second translator Tp 1270 translates the idealized pressure Pi to an application pressure P 1290, which is provided to the application 906'.
  • FIG. 13 covers a third example, in which a tetrahedral detector measures x, y, and z accelerations; x, y and z rotations; and pressure.
  • the sensors 902" include tetrahedral sensor 1300.
  • a program Sm 1310 translates the output of sensor 1300 to an idealized motion vector, Mxi, Myi, Mzi, 1340 that is provided to the driver 904".
  • a second translator Tm 1370 translates the idealized motion to an application motion vector Mx, My, Mz 1371, which is provided to the application 906".
  • a third translator Sr 1320 translates the sensor information to an idealized rotation vector Rxi, Ryi, Rzi.
  • a fourth translator Tr 1380 translates the idealized rotation Rxi, Ryi, Rzi to an application rotation Rx, Ry, Rz 1350, which is provided to the application 906".
  • a fifth translator Sp 1330 translates the output of sensor 1300 to an idealized pressure Pi, 1360 that is provided to the driver 904".
  • a sixth translator Tp 1390 translates the idealized pressure Pi to an application pressure P 1391, which is provided to the application 906".
  • FIG. 14 is a flow chart diagram of an exemplary method for operating the pointing device 100.
  • the system waits for the user to take an action, i.e., a motion or a squeeze.
  • the system returns to the wait state 1419 after each action, and can transition from the wait state 1419 to any of the states, "grasp" 1420, "orient" 1432, "squeeze" 1437, "move" 1442 or "release" 1447.
  • the user grasps the device.
  • the sensor 902 registers the motion.
  • the driver module 904 wakes up in "grasp mode".
  • At step 1423, if this is the first session for the pointing device 100, then step 1424 is executed, and the device is calibrated. Control is transferred to step 1425. At step 1425, if this is the first time the user is using the pointing device 100, then at step 1426, the user registers himself or herself. An association is thus formed between the user and the particular pointing device, making use of the (optional) unique identifier chip in the pointing device 100. Control is transferred to step 1427. At step 1427, if a login is required, then step 1428 is executed. At step 1428, the user logs in, either by using the keyboard, or by a unique motion with the pointing device 100 (such as writing the user's signature). Control is transferred to step 1429.
  • At step 1429, the system checks whether there are user-specific coefficients available on the system for translating this specific user's style of motion, rotation and pressure from raw sensor data to idealized measurement data. If data are available, then step 1430 is executed. At step 1430, the default coefficients are replaced by the previously determined user-specific coefficients. Control is transferred to step 1431. At step 1431, the user login is translated to rotation and orientation coefficients. The system returns to the wait state 1419.
  • At step 1432, the user orients the device.
  • a rotation calibration is initiated by asking the user to orient the device in one or more predetermined positions.
  • At step 1434, a stream of data points is recorded.
  • At step 1435, conforming X, Y and Z rotations are constructed.
  • At step 1436, a determination is made whether sufficient data have been collected to conform the rotation measurements to the actual rotations. If not, then control is returned to step 1433. Otherwise, the system returns to the wait state 1419.
  • At step 1437, the user squeezes the pointing device 100.
  • the driver module 904 updates the idealized pressure Pi, and translates the same to the normalized pressure P.
  • At step 1439, the pressure is provided to the application, which processes the pressure.
  • At step 1440, if the application is finished with processing the squeeze operation, then control returns to step 1419 (the wait state). If not, then step 1441 is executed, to determine whether driver module 904 detects a lull or pause in the squeezing. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1438 is executed again.
  • At step 1442, the user moves the pointing device 100.
  • the driver module 904 updates the idealized motion vector Mxi, Myi, Mzi, and translates the same into the normalized motion vector Mx, My, Mz.
  • the application program processes the normalized motion vector Mx, My, Mz.
  • At step 1445, if the application is done with processing the movement of the pointing device 100, then control transfers to step 1419 (wait state). Otherwise, step 1446 is executed, to determine whether the driver detects a lull in the motion. If there is a pause, then control is transferred to step 1419 (wait state). Otherwise, step 1443 is executed again.
  • the user releases the pointing device 100.
  • the driver module 904 enters a sleep state.
  • the application is notified that the pointing device has entered the sleep state.
  • The following is a set of pseudo-code that may be used to construct a software system suitable for handling input signals from the exemplary pointing device 100.
  • A, B, C, D, E, F, G: sensor strips, each oriented along a sensing plane
  • Tm: translate idealized motion to application motion
  • Tp: translate idealized pressure to application pressure
  • Tr: translate idealized rotational motion to application rotational motion
  • Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz); Tp: Pi maps to P
  • Origami sensor with A, B, C flaps, each flap having an un-weighted (u) and weighted (w) section: Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz)
  • Tm: (Mxi, Myi, Mzi) rotationally maps to (Mx, My, Mz); Tr: (Rxi, Ryi, Rzi) rotationally maps to (Rx, Ry, Rz); Tp: Pi maps to P
  • Registration: after calibration, ask the user to create a login signature; determine cultural orientation (left-right vs. right-left, etc.); accept the user's input while the software collects A, B, C, D points; translate the input to Ox, Oy, Oz over time
  • Login: ask the user to reproduce the login signature; set orientation based upon the cultural left-right vs. right-left context; this replaces manual orientation to draw a shape on the X-Y axis
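As a sketch of the "rotationally maps" step used throughout the pseudo-code, the following assumes the calibration produced a rotation matrix R (an assumption; the patent does not fix the representation):

```python
# Tm applies the orientation found during calibration to carry
# idealized device axes onto application axes.
import numpy as np

R = np.array([[0.0, -1.0, 0.0],   # hypothetical calibration result:
              [1.0,  0.0, 0.0],   # device x -> app y, device y -> -app x
              [0.0,  0.0, 1.0]])

def Tm(m_ideal):
    """Rotationally map (Mxi, Myi, Mzi) to (Mx, My, Mz)."""
    return tuple(R @ np.asarray(m_ideal))

print(Tm((1.0, 0.0, 0.0)))        # -> (0.0, 1.0, 0.0)
```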
  • the output of the pointing device may be mapped to the parameters of a visual synthesizer, such as the shifting of a color palette, the drawing or motion of an object.
  • the output of device 100 may be mapped to parameters of a sound synthesizer, such as pitch, timbre, amplitude, or placement in three-dimensional sonic space.
  • the exemplary virtual puppet application is configured so that one pointing device 100 controls a single virtual puppet 710
  • the software can be configured to operate more than one virtual puppet (or other object) using a single input pointing device 100.
  • the pointing device includes sensors for measuring x, y, and z accelerations and pressure P
  • the first puppet 710 may be controlled by the x and y accelerations
  • the second puppet 711 may be controlled by z and P.
  • the first and second objects need not be identical objects, or even objects of the same type.
  • the first object may be a virtual puppet
  • the second object may be the background scenery or sound.
  • by the x and y accelerations, the puppet 710 is controlled, and by z and P, the user can vary the background from day to night, or from silence to the sound of thunder.
  • software applications can use the Mouse API for selecting or navigating, or use a Driver API.
  • the driver for Digitizer (or Graphics) tablets may allow more than one pen to be used at the same time.
  • Emulating a mouse can be accomplished by installing a driver and going beyond a mouse can be accomplished by modifying the application program.
  • One can also overlay other input devices, such as a graphics tablet, with yet another driver.
  • the output signals from the exemplary pointing device 100 may be mapped to those of conventional input devices in an emulation mode.
  • the outputs may be mapped to the x and y positions of a conventional mouse and the left or right click signals from depressing the mouse buttons.
  • Mapping the x, y and z (or x, y and P) parameters into x and y parameters only may require the computer to determine the orientation of the pointing device 100, such as moving on a predominantly x-y plane, a y-z plane, or the z-x plane, or combinations of all three.
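One illustrative way to choose the dominant plane (an assumption, not the patent's algorithm) is to drop the axis with the least recent variance:

```python
# Pick the dominant motion plane by comparing recent per-axis variance,
# then keep the two most active axes as the emulated mouse axes.
import numpy as np

def dominant_plane(samples):
    """samples: (N, 3) array of recent (x, y, z) motion values."""
    var = np.var(np.asarray(samples), axis=0)
    drop = int(np.argmin(var))             # least-active axis
    return [i for i in range(3) if i != drop]  # indices of the mouse axes

motion = np.array([[1.0, 0.2, 0.01], [0.8, -0.3, 0.02], [1.2, 0.1, 0.0]])
print(dominant_plane(motion))              # -> [0, 1] (the x-y plane)
```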
  • individual user profiles may be used to determine the current orientation.
  • a second pointing device may be used to expand the number of parameters for an application.
  • one pointing device 100 may control various aspects of the video signal, while another pointing device may be used to control the audio signals.
  • a microphone may be incorporated into the pointing device, for operation of voice activated software, or for storage of sounds.
  • Although the exemplary embodiment described above has the virtual puppet application program running locally in the computer 701, the application program may be executed in other processors located remotely, either as a sole process or in parallel with other instances of the application program on other processors.
  • the local computer can communicate with other computers via a telecommunication network, which may be a LAN, a WAN, the Internet, or a wireless network.
  • the present invention may be embodied in the form of computer-implemented processes and apparatus for practicing those processes.
  • the present invention may also be embodied in the form of computer program code embodied in tangible media, such as floppy diskettes, read only memories (ROMs), CD-ROMs, hard drives, "ZIP™" drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention.
  • the present invention may also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over the electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention.
  • computer program code segments configure the processor to create specific logic circuits.
EP00926157A 1999-04-20 2000-04-20 Human gestural input device with motion and pressure Withdrawn EP1250698A4 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13019199P 1999-04-20 1999-04-20
PCT/US2000/010579 WO2000063874A1 (en) 1999-04-20 2000-04-20 Human gestural input device with motion and pressure
US130191P 2008-05-29

Publications (2)

Publication Number Publication Date
EP1250698A2 EP1250698A2 (de) 2002-10-23
EP1250698A4 true EP1250698A4 (de) 2002-10-23

Family

ID=22443485

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00926157A Withdrawn EP1250698A4 (de) 1999-04-20 2000-04-20 Human gestural input device with motion and pressure

Country Status (3)

Country Link
EP (1) EP1250698A4 (de)
AU (1) AU4473000A (de)
WO (1) WO2000063874A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8257439B2 (en) 2004-12-22 2012-09-04 Ldr Medical Intervertebral disc prosthesis

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7749089B1 (en) 1999-02-26 2010-07-06 Creative Kingdoms, Llc Multi-media interactive play system
GB2358108A (en) 1999-11-29 2001-07-11 Nokia Mobile Phones Ltd Controlling a hand-held communication device
US7445550B2 (en) 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US6761637B2 (en) 2000-02-22 2004-07-13 Creative Kingdoms, Llc Method of game play using RFID tracking device
US7066781B2 (en) 2000-10-20 2006-06-27 Denise Chapman Weston Children's toy with wireless tag/transponder
DE10203209B4 (de) * 2002-01-28 2004-08-12 Siemens Ag Vorrichtung mit einem Eingabeelement zur Steuerung einer Anzeigeeinheit
US6990639B2 (en) 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
US6967566B2 (en) 2002-04-05 2005-11-22 Creative Kingdoms, Llc Live-action interactive adventure game
US20070066396A1 (en) 2002-04-05 2007-03-22 Denise Chapman Weston Retail methods for providing an interactive product to a consumer
KR100932483B1 (ko) * 2002-11-20 2009-12-17 엘지전자 주식회사 이동통신 단말기 및 이것을 이용한 아바타 원격 제어 방법
US9446319B2 (en) 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
FR2858073B1 (fr) * 2003-07-24 2007-08-10 Adentis Procede et systeme de commande gestuelle d'un appareil
FI117308B (fi) 2004-02-06 2006-08-31 Nokia Corp Eleohjausjärjestelmä
JP4805633B2 2005-08-22 2011-11-02 Nintendo Co., Ltd. Game operating device
US7927216B2 (en) 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
US8870655B2 (en) 2005-08-24 2014-10-28 Nintendo Co., Ltd. Wireless game controllers
JP4262726B2 2005-08-24 2009-05-13 Nintendo Co., Ltd. Game controller and game system
JP4907128B2 (ja) * 2005-08-30 2012-03-28 任天堂株式会社 ゲームシステムおよびゲームプログラム
WO2007030947A1 (en) * 2005-09-16 2007-03-22 Anthony Szturm Mapping motion sensors to standard input devices
EP1821180A1 (de) 2005-12-31 2007-08-22 Ball-IT Oy Benutzerseitig bedienbares Zeigegerät wie eine Maus
JP5204381B2 (ja) * 2006-05-01 2013-06-05 任天堂株式会社 ゲームプログラム、ゲーム装置、ゲームシステム及びゲーム処理方法
US9390229B1 (en) 2006-04-26 2016-07-12 Dp Technologies, Inc. Method and apparatus for a health phone
US8902154B1 (en) 2006-07-11 2014-12-02 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface
GB0700163D0 (en) * 2007-01-05 2007-02-14 Kord Ali A Display Booth
US8949070B1 (en) 2007-02-08 2015-02-03 Dp Technologies, Inc. Human activity monitoring device with activity identification
EP2132650A4 (de) * 2007-03-01 2010-10-27 Sony Comp Entertainment Us System und verfahren zur kommunikation mit einer virtuellen welt
US8555282B1 (en) 2007-07-27 2013-10-08 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US7800044B1 (en) 2007-11-09 2010-09-21 Dp Technologies, Inc. High ambient motion environment detection eliminate accidental activation of a device
US8996332B2 (en) 2008-06-24 2015-03-31 Dp Technologies, Inc. Program setting adjustments based on activity identification
JP5759375B2 (ja) * 2008-08-12 2015-08-05 コーニンクレッカ フィリップス エヌ ヴェ 動き検出システム
US8300020B2 (en) 2008-08-15 2012-10-30 Apple Inc. Hybrid inertial and touch sensing input device
US8872646B2 (en) 2008-10-08 2014-10-28 Dp Technologies, Inc. Method and system for waking up a device due to motion
US9529437B2 (en) 2009-05-26 2016-12-27 Dp Technologies, Inc. Method and apparatus for a motion state aware device
WO2012079948A1 (en) * 2010-12-16 2012-06-21 International Business Machines Corporation A human interface device with two three-axis-accelerometers
FR2972819A1 (fr) * 2011-03-15 2012-09-21 France Telecom Dispositif de capture de donnees representatives d'une reaction d'un utilisateur face a une situation, dispositif de traitement et systeme de detection et de gestion associes
US10248205B2 (en) 2013-04-02 2019-04-02 Nokia Technologies Oy Apparatus for recording audio and vibration content of event
CN104142823B 2014-06-30 2016-04-27 Tencent Technology (Shenzhen) Company Limited Input device and control system
CN106595697B (zh) * 2016-12-15 2023-07-11 宁夏农垦贺兰山奶业有限公司 压电式计步器及计步方法
WO2019073490A1 (en) * 2017-10-12 2019-04-18 Glenn Fernandes 3D MOUSE AND ULTRA-FAST KEYBOARD
US20200133386A1 (en) * 2018-06-13 2020-04-30 Koofy Innovation Limited Input surface system
FR3095957A1 (fr) * 2019-05-15 2020-11-20 The Cube Company Dispositif électronique de pilotage interactif d’une application logicielle installée sur un terminal de communication
US20230338827A1 (en) * 2022-04-25 2023-10-26 Sony Interactive Entertainment Inc. Correlating gestures on deformable controller to computer simulation input signals

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997027575A1 (en) * 1996-01-24 1997-07-31 Personal Integrated Network Node Communications Corporation Smart orientation sensing circuit for remote control
WO1997039401A1 (en) * 1996-04-12 1997-10-23 Paul Milgram Finger manipulatable 6 degree-of-freedom input device
US5813406A (en) * 1988-10-14 1998-09-29 The Board Of Trustees Of The Leland Stanford Junior University Strain-sensing goniometers, systems and recognition algorithms

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5757360A (en) * 1995-05-03 1998-05-26 Mitsubishi Electric Information Technology Center America, Inc. Hand held computer control device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5813406A (en) * 1988-10-14 1998-09-29 The Board Of Trustees Of The Leland Stanford Junior University Strain-sensing goniometers, systems and recognition algorithms
WO1997027575A1 (en) * 1996-01-24 1997-07-31 Personal Integrated Network Node Communications Corporation Smart orientation sensing circuit for remote control
US5703623A (en) * 1996-01-24 1997-12-30 Hall; Malcolm G. Smart orientation sensing circuit for remote control
WO1997039401A1 (en) * 1996-04-12 1997-10-23 Paul Milgram Finger manipulatable 6 degree-of-freedom input device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8257439B2 (en) 2004-12-22 2012-09-04 Ldr Medical Intervertebral disc prosthesis

Also Published As

Publication number Publication date
EP1250698A2 (de) 2002-10-23
WO2000063874A1 (en) 2000-10-26
AU4473000A (en) 2000-11-02

Similar Documents

Publication Publication Date Title
WO2000063874A1 (en) Human gestural input device with motion and pressure
WO2000063874A9 (en) Human gestural input device with motion and pressure
CN111356968A (zh) Rendering virtual hand poses based on detected hand input
US8295549B2 (en) Peripheral device having light emitting objects for interfacing with a computer gaming system claim of priority
KR100742029B1 (ko) Hand-held computer interactive device
KR101576979B1 (ko) Electric device for determining user input using a magnetic field sensor
JP4065035B2 (ja) Three-dimensional cursor position setting device
Taylor et al. Graspables: grasp-recognition as a user interface
US20160328028A1 (en) System, method and device for foot-operated motion and movement control in virtual reality and simulated environments
TW200414013A (en) Method and apparatus for a hybrid pointing device used with a data processing system
JP6737996B2 (ja) Handheld controller for computers, control system for computers, and computer system
GB2247066A (en) Manual controller for computer graphic object display with six degrees of freedom
EP1340218A1 (de) Vom benutzer getragene elektronische schnittstelleneinrichtung
Mulder Design of virtual three-dimensional instruments for sound control
CN109559720A (zh) Electronic musical instrument and control method
US20040041828A1 (en) Adaptive non-contact computer user-interface system and method
WO2009008872A1 (en) Multi-dimensional input device with center push button
US11947399B2 (en) Determining tap locations on a handheld electronic device based on inertial measurements
US20230041294A1 (en) Augmented reality (ar) pen/hand tracking
Parker Buttons, simplicity, and natural interfaces
Schmidt et al. Sensor virrig-a balance cushion as controller
Calella et al. HandMagic: Towards user interaction with inertial measuring units
Visi et al. Motion controllers, sound, and music in video games: state of the art and research perspectives
Hashimoto et al. A grasping device to sense hand gesture for expressive sound generation
Wang et al. The Pinch Sensor: An Input Device for In-Hand Manipulation with the Index Finger and Thumb

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20011116

A4 Supplementary search report drawn up and despatched

Effective date: 20020726

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

Kind code of ref document: A4

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20021011