US20060277466A1 - Bimodal user interaction with a simulated object - Google Patents
- Publication number
- US20060277466A1 (application Ser. No. 11/433,173)
- Authority
- US
- United States
- Prior art keywords
- user
- simulated
- properties
- holding
- forces
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Definitions
- the present invention relates to methods and apparatuses for user interaction with computer-simulated objects, and more specifically to force feedback in user interaction where object behavior differs depending on whether or not a user is holding or controlling the object.
- new interface methods are needed to fully utilize the new modes of human-computer communication that such interfaces enable.
- new methods of interaction can use the additional human-computer communication paths to supplement or supplant conventional communication paths, freeing up traditional keyboard input and visual feedback bandwidth.
- force feedback, or haptics can be especially useful in allowing a user to feel parts of the interface, reducing the need for a user to visually manage interface characteristics that can be managed by feel.
- Users interfacing with non-computer tasks routinely exploit the combination of visual and haptic feedback (seeing one side of a task while feeling the other); bringing this sensory combination into human-computer interfaces can make such interfaces more efficient and more intuitive for the user. Accordingly, there is a need for new methods of human-computer interfacing that make appropriate use of haptic and visual feedback.
- many contemporary computer games require the user to throw or otherwise propel an object.
- the games typically allow the user to specify a throw by pressing a button or combination of buttons.
- the timing of the button press, often relative to the timing of other actions occurring in the game, controls the effect of the throw (e.g., the accuracy or distance of the throw).
- Some games display a slider or power bar that indicates direction or force of a throw; the user must press the appropriate button when the slider or bar is at the right value for the desired throw.
- the user can thereby control aspects of the throw, but not with any of the skills learned in real world throwing.
- the direction of the user's hand motion and the force applied by the user near the release of the throw generally are not significant to the throwing action in the game.
- the object being thrown is generally represented within the game independent of whether it is being held or has been released, forcing the user to adjust the control of the object to the constraints of the simulations in the game.
- the present invention can provide a method of providing user interaction with a computer representation of a simulated object, where the user can control the object in three dimensions.
- the method can provide for two distinct states: a “holding” state, and a “released” state.
- the holding state roughly corresponds to the user holding the simulated object (although other metaphors such as holding a spring attached to the object, or controlling the object at a distance can also be suitable).
- the released state roughly corresponds to the user not holding the object.
- a simple example of the two states can include the holding, then throwing of a ball.
- the method provides force feedback to the user representative of forces that the user might experience if the user were holding an actual object. The forces are not applied when in the released state.
- the present invention can allow the user to direct transitions from the holding state to the released state (e.g., releasing the ball at the end of the throwing motion), from the released state to the holding state (e.g., picking up a ball), or both.
- the present invention can also provide forces that represent both pushing and pulling the simulated object.
- the present invention can also accommodate different haptic and visual expectations of the user by providing different interaction of the object within a simulated space in the two modes. For example, a ball can be simulated with a large mass when being held by the user, to provide significant force feedback communication to the user. Upon release, however, the ball's mass internal to the simulation can be adjusted to a smaller value, to allow the ball to move and interact with other objects on a scale more fitting to the visual display capabilities.
- the present invention can also provide a method of representing a simulated object within a three dimensional computer simulation.
- the object can be represented within the simulation with a first set of properties when the user is directly controlling the object (e.g., holding the object), and with a second set of properties when the user is not directly controlling the object (e.g., after the user releases the object).
- the properties different between the two sets can comprise properties of the simulation (e.g., time or gravity forces), properties of the simulated object (e.g., mass or inertia), or a combination thereof.
- the simulation can comprise a simulation of real-world physics interactions, such as is supported by contemporary hardware accelerators, with the physics principles, the object properties, the environment properties, or a combination thereof, changed when the user initiates or terminates direct control of the object.
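The two property sets described above can be sketched as a simple state switch. This is a minimal illustration, not the patent's implementation; the property names and values (mass, gravity, time scale) are assumptions chosen for the ball-throwing example.

```python
# Two property sets: one tuned for haptic feel while holding,
# one tuned for the object's on-screen behavior after release.
# All values are illustrative assumptions.
HELD_PROPS = {"mass": 5.0, "gravity": 0.0, "time_scale": 1.0}
RELEASED_PROPS = {"mass": 0.6, "gravity": -9.8, "time_scale": 0.5}

class SimulatedObject:
    def __init__(self):
        self.holding = True
        self.props = dict(HELD_PROPS)

    def release(self):
        """User-directed transition from holding to released state."""
        self.holding = False
        self.props = dict(RELEASED_PROPS)  # second property set takes effect

    def grab(self):
        """Transition back to holding, e.g. picking the ball up again."""
        self.holding = True
        self.props = dict(HELD_PROPS)
```

The point of the switch is that the simulation's physics and the haptic rendering never need to share one compromise set of values; each transition simply swaps which set is active.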
- the present invention can be applied to computer game applications, where the present invention can provide for enhanced user experience in propelling an object.
- the present invention can provide for a set of object and interaction forces that optimize the user experience of holding the object.
- the present invention can provide for a set of object and simulation properties that optimize the simulated object's behavior within the game environment.
- the present invention can be applied to games such as football, basketball, bowling, darts, and soccer.
- Haptics is the field that studies the sense of touch. In computing, haptics refers to giving a User a sense of touch through a Haptic Device.
- a Haptic Device (or Device) is the mechanism that allows a User to feel virtual objects and sensations. The forces created from a Haptic Device can be controlled through motors or any other way of transferring sensations to a User.
- the position of a Device typically refers to the position of a handle on the Device that is held by the User. Any of the algorithms described can vary depending on where the handle of the Device is within its workspace.
- Haptic Devices can have any number of Degrees of Freedom (DOF), and can have a different number of DOF for tracking than for forces.
- a Haptic Device can track 3 DOF (x, y, and z), and output forces in 3 DOF (x, y, and z), in which case the tracked DOF are the same as the forces DOF.
- a Haptic Device can track 6 DOF (x, y, and z, and rotation about x, y, and z), but only output forces in 3 DOF (x, y, and z), in which case the tracked DOF are a superset of the forces DOF.
- any of a Device's DOF can be controlled by direct movements not relative to a fixed point in space (like a standard computer mouse), controlled by direct movements relative to a fixed point in space (like a mechanical tracker, mechanically grounded to a table it is resting on, where it can only move within a limited workspace), or controlled by forces against springs, movements around pivot points, twisting or turning a handle, or another type of limited movements (joystick, spaceball, etc).
- a User is a person utilizing a Haptic Device to play a game or utilize some other type of application that gives a sense of touch.
- the User can experience a simulation or game in ways that are consistent with a Character (described below) such that the User feels, sees, and does what the Character does.
- the User can also have any portion or all of the interactions with the simulation be separate from the Character. For example, the User's view (i.e. what is seen on the monitor) does not have to be lined up with a Character's view (i.e. what a Character would see given the environment and the location of the Character's eyes), whether the Character is currently being controlled or not.
- a Character is a person or object controlled by a User in a video game or application.
- a Character can also be a first person view and representation of the User. Characters can be simple representations described only by graphics, or they can have complex characteristics such as Inverse Kinematic equations, body mass, muscles, energy, damage levels, artificial intelligence, or can represent any type of person, animal, or object in real life in varying degrees of realism.
- a Character can be a complex system like a human, or it can simply be a simple geometric object like a marble in a marble controlling game. Characters and their information can be described and contained on a single computer, on multiple computers, and in online environments. Characters can interact with other Characters.
- a Character can be controlled by the position of a Device or a Cursor, and a Character can be any Cursor or any object.
- a Cursor is a virtual object controlled by a User controlling a Haptic Device. As the User moves the Haptic Device, the Cursor can move in some relationship to the movement of the Device. Typically, though not always, the Cursor moves proportionally to the movement of the Haptic Device along each axis (x, y, z). Those movements, however, can be scaled, rotated, skewed, or modified by any other function, and can be modified in these ways differently along different axes. For example, a Cursor can be controlled through a pivot point, where a movement of the Haptic Device to the right would move the Cursor to the left (the amount of movement can depend on a simulation of where the pivot point is located with respect to the Cursor).
- a Cursor can be a point, a sphere, any other geometric shape, a polygonal surface, a volumetric representation, a solids model, a spline based object, or can be described in any other mathematical way.
- a Cursor can also be a combination of any number of those objects.
- the graphical, haptic, and sound representations of a Cursor can be different from each other.
- a Cursor can be a perfect haptic sphere, but a polygonal graphical sphere.
- a Cursor can be controlled directly, or can be controlled through the interactions of one or more virtual objects that interact with another virtual object or other virtual objects.
- a Haptic Device can control a point that is connected to a sphere with a spring, where the sphere is used as the Cursor.
- a Cursor's movements can be constrained in the visual, haptic, audio, or other sensory domain, preventing the Cursor, or its use, from moving into a specified area. Cursor movements can be constrained by objects, algorithms, or physical stops on a Device as examples.
- the position of a Cursor and the forces created can be modified with any type of linear or non-linear transformation (for example, scaling in the x direction).
- Position can be modified through transformations, and the forces created can be modified through an inverse function to modify the perception of Cursor movements, to modify objects (such as scale, rotation, etc), or to give interesting effects to the User.
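The transform-plus-inverse idea above can be sketched with a hypothetical per-axis scaling: the cursor moves twice as far as the Device along x, and forces computed at the Cursor are passed through the inverse transformation before being sent to the Device. The scale factors are illustrative assumptions.

```python
# Per-axis scale applied to Device motion; assumed values for illustration.
SCALE = (2.0, 1.0, 1.0)

def device_to_cursor(device_pos):
    """Forward transformation applied to the Device position."""
    return tuple(p * s for p, s in zip(device_pos, SCALE))

def force_to_device(cursor_force):
    """Inverse transformation applied to forces computed at the Cursor,
    so the haptic rendering stays consistent with the scaled motion."""
    return tuple(f / s for f, s in zip(cursor_force, SCALE))
```

Applying the inverse on the force path keeps the perceived stiffness of touched objects consistent even though cursor motion is amplified.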
- a Cursor can have visual, haptic, and sound representations, and any properties of any of those three Cursor modalities can be different. For example, a Cursor can have different sound, graphic, and haptic representations. A Cursor does not need to be shown visually. Also, a Cursor can vary in any of the ways described above differently at different times. For example, a Cursor can have a consistent haptic and graphic position when not touching objects, but a different haptic and graphic position when objects are touched.
- a Cursor can be shown graphically when preparing to perform an action (like beginning a snowboard run, beginning a golf swing, or selecting an object), and then not shown graphically when performing the action (snowboarding, swinging a golf club, or holding an object, respectively).
- a Cursor can also be a representation of a User's interaction with a Character or an action, rather than a specific object used to touch other objects.
- a Cursor can be the object used to implement a golf swing, and control the weight and feel of the club, even though the User never actually sees the Cursor directly.
- a Cursor can change haptic and graphic characteristics as a function of time, or as a function of another variable.
- a Cursor can be any object, any Character, or control any part of a Character or object.
- a Cursor can be in the shape of a human hand, foot, or any other part or whole of a human, animal, or cartoon.
- the shape, function, and motion of a Cursor can be related to that of an equivalent or similar object in the real world.
- a Cursor shaped like a human hand can change wrist, hand, and finger positioning in order to gesture, grasp objects, or otherwise interact with objects similarly to how hands interact with real objects.
- An interface in many games must provide the user with a method of indicating discharge of an object, for example release of a thrown ball.
- Conventional game interfaces use buttons or switches, which are unrelated to the usual methods of releasing objects and consequently do not make for a very realistic interface.
- objects are thrown by imparting sufficient momentum to them.
- a haptic interface can accommodate interaction that allows intuitive release.
- One or more force membranes can be presented to the user, where a force membrane is a region of the haptic space accessible by the user that imposes a force opposing motion toward the membrane as the user approaches the membrane.
- a membrane placed in the direction of the intended target can discourage the user from releasing the object accidentally, but can allow intentional release by application of sufficient force by the user to exceed the membrane's threshold.
- consider a user throwing a football: the user brings the haptic interaction device back (as if to cock the arm back to throw) past a membrane, then pushes it forward (feeling the weight and drag of the football haptically), and if the user pushes the football forward fast enough to give it the required momentum, the football is thrown.
- Motion of the object in a throwing direction can be accompanied by a combination of dynamics forces and viscosity to guide the user's movement. These forces can make directing the thrown object much easier.
- the forces related to the membrane can drop abruptly when the object is thrown, or can be decreased over time, depending on the desired interface characteristics.
- such a release mechanism can be used to throw balls or other objects (e.g., by pushing the object forward through a force barrier disposed between the user location and the target), to drop objects (e.g., by pushing the object downward through a force barrier between the user location and the floor), and to discharge weapons or blows (e.g., by pushing a representation of a weapon or character portion through a force barrier between the weapon or character portion and the target).
- a membrane can apply an increasing force, and the object released when the user-applied force reaches a certain relation to the membrane's force (e.g., equals the maximum force, or is double the present force). Release can also be triggered by gesture recognition: a hand moving forward rapidly, then quickly slowing, can indicate a desired throw.
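A one-dimensional version of the force-membrane release can be sketched as follows. The geometry (a membrane at a fixed z in the workspace, throwing along +z) and the stiffness and break-force constants are assumptions for illustration, not values from the patent.

```python
# Assumed membrane geometry and constants.
MEMBRANE_Z = 0.10   # membrane location along the throw axis (m)
STIFFNESS = 800.0   # opposing spring stiffness while pushing into it (N/m)
BREAK_FORCE = 6.0   # opposing force at which the object pops through (N)

def membrane_force(cursor_z, holding):
    """Force opposing motion into the membrane; zero once released."""
    if not holding or cursor_z < MEMBRANE_Z:
        return 0.0
    return -STIFFNESS * (cursor_z - MEMBRANE_Z)

def check_release(cursor_z, holding):
    """Release when the membrane's opposing force exceeds the threshold,
    i.e. the user has pushed hard enough for an intentional throw."""
    return holding and abs(membrane_force(cursor_z, holding)) > BREAK_FORCE
```

Below the threshold the membrane merely resists, discouraging accidental release; past it, the object transitions to the released state and the opposing force can be dropped abruptly or faded out.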
- the direction of the object can be determined in various ways, some of which are discussed in more detail in U.S. provisional application 60/681,007, “Computer Interface Methods and Apparatuses,” filed May 13, 2005, incorporated herein by reference.
- the position, at release, pre-release, or both can be used to set direction;
- the object can be modeled as attached by a spring to the cursor, and the direction of throw determined from the relative positions.
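Under the spring-attachment model just mentioned, one simple way to derive a throw direction is the unit vector from the trailing object to the cursor at release. This is a sketch of that one reading, with hypothetical names:

```python
import math

def throw_direction(object_pos, cursor_pos):
    """Unit vector from the spring-attached object toward the cursor,
    taken as the direction of throw at release (assumed model)."""
    d = [c - o for c, o in zip(cursor_pos, object_pos)]
    n = math.sqrt(sum(x * x for x in d))
    return [x / n for x in d] if n else [0.0, 0.0, 0.0]
```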
- a visual indication can be combined with the haptic information to indicate status of the release; for example, an image of an arm about to throw can be displayed in connection with the image of the ball when the ball is ready to be thrown (pulled through the first membrane in the previous example).
- a haptic cue can also be used, for example a vibration in the device or perceptible bump in its motion.
- a bimodal interface can provide for two separate modes of object simulation, optionally selectable responsive to direction from the user.
- in a first mode, the user is directly controlling a simulated object. This can correspond to, as examples, the user holding the simulated object, the user holding a spring or other intermediate structure that in turn holds the object, or the user controlling the object at a distance (e.g., according to a magical levitation spell or a space-type tractor beam).
- in a second mode, the user is not directly controlling the simulated object. This can correspond to, as examples, the user having released a thrown object, the user dropping an object, some interaction with the simulated environment causing the user to lose control of the object, or the distance control being disrupted (e.g., the spell is broken).
- the object can be represented to the user and to the simulation with different interaction properties in the two states, where “interaction properties” that can be different are aspects of the user's perception of the object that change upon transition between the modes, including as examples the properties of the object in the simulation; the properties of the object as perceived by the user (e.g., texture, size); the rate of determining object characteristics (e.g., position or deformation); simulation properties (e.g., time scales, physics models); environment properties (e.g., gravitational constants, relative size of objects); and execution of a corresponding software program on a different thread or a different processor.
- the transition between the states can be at any point where the user associates the state transition with the release of the object, including simultaneous with the release of an object by the user or before or after such release.
- “Holding” mode. In the first mode, the user can generally be allowed to hold the object, manipulate the object, and interact with the object and, through the object, with the simulated environment. After an object is touched or selected, a User can determine that the object should be held. This can be accomplished automatically (for example, an object can be automatically grabbed when it is touched) or the object can be held or grabbed based on a User input such as a button press, a switch press, a voice input, a recognized gesture, or some other type of User input. When an object is held, it can have dynamics properties such as weight, inertia, momentum, or any other physical property. It can be implemented as a weight, and a User can feel the weight and reaction forces as it is moved.
- Objects that are held can have interactions with the environment. For example, a heavy object might need to be dragged across a virtual ground if it is too heavy to be picked up by the Character. In this case, as in real life, the User can feel the weight of the object, but that weight can be less than if the object was not touching the ground. As it is dragged across the ground, forces can be applied to the Cursor or object representing the friction or resistance of the object as it moves across the ground, bumps into things, or snags or gets caught on other virtual objects.
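The dragged-object force model described above can be sketched as a weight reduced by ground support plus friction opposing the drag. The support fraction and friction coefficient are illustrative assumptions, not values from the patent.

```python
G = 9.8  # gravitational acceleration (m/s^2)

def held_object_force(mass, on_ground, drag_vx, mu=0.4, support=0.8):
    """Force (x, y, z) fed back to the user for a held object.

    When the object rests on the ground, the ground carries a `support`
    fraction of the weight, so the user feels less of it, plus kinetic
    friction opposing the drag direction. `mu` and `support` are assumed.
    """
    weight = mass * G
    if not on_ground:
        return (0.0, -weight, 0.0)        # full weight, no friction
    normal = support * weight             # portion carried by the ground
    felt = weight - normal                # reduced weight felt by the user
    sign = (drag_vx > 0) - (drag_vx < 0)  # direction of dragging motion
    return (-mu * normal * sign, -felt, 0.0)
```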
- Objects that are held can also have forces applied to them (or directly to the Cursor) based on other virtual objects that interact with the object that is held. For example, a User might feed a virtual animal. As an apple is picked up the User might feel the weight of the apple, then, when a virtual horse bites the apple, the User can feel the apple being pulled and pushed, and the weight can be adjusted to reflect the material removed by the bite. Objects can be rotated while they are held. To rotate an object, a User can rotate the handle of the Device, or can modify the position of the Device so that the rotation of the object occurs based on a change in position of the Cursor.
- Objects that are held can have other haptic characteristics.
- a User could hold an object that is spinning and feel the resulting forces.
- a virtual gyroscope could create directional forces that the User would feel.
- a User can feel the acceleration of an object being held. For example, if a User holds a virtual firehose, the User might feel the force pushing back from the direction that firehose is spraying based on how much water is coming out, how close another virtual object is that is being sprayed, how tightly the hose is being held, how much pressure there is in the spraying, or other aspects of the simulation of the water being sprayed. If an object has its own forces, based on its representation, the User could feel them.
- if a User holds a virtual popcorn popper, the User could feel the popcorn popping within it.
- Each individual force of a popcorn popping could be relayed to the User, or the forces could be represented through some other representation (such as a random approximation of forces that a popcorn popper would create).
- the forces that a User feels can have any form or representation which can represent any action or event.
- a User might hold a virtual wand to cast spells. The User can feel any type of force through the wand, which can represent any type of spell.
- the User can feel interactions of a virtual held object with another object that hits the object. For example, a User might feel the weight of a baseball bat, and a path constraint of how it can move. Then when the bat hits a baseball, that feeling can be felt by the User through a force applied to the bat or applied directly to the Cursor. If a User is holding onto a matador cape, and a virtual bull runs through the cape, the User would feel the pull of the cape against the Cursor as the bull pulls on the cape.
- the User might also feel the force adjusted if the cape were to rip, the feeling of the cape being pulled out of the Character's hands, the cape getting caught in the bull's horns and being pulled harder, or any other interaction with the object being held.
- a User might hold onto an object that is controlling the Character's movements. For example, the Character might grab onto the bumper of a moving car. Then, the User would feel the pull of the bumper relative to the movement of the Character. The feeling of pull could also be combined with other interactions such as maintaining balance.
- a User can feel an object interact with another object that is caught by the object. For example the User can hold a net and catch a fish. The User can feel the forces of the fish flopping within the net either directly through forces applied to the Cursor, or through forces applied to the net through the fish's actions (and therefore applied to the Cursor, through its interaction with the net).
- a User can control a held object to interact with an environment. For example, A User could hold onto a fishing pole and swing it forward to cast a line out into a lake. Then a User might feel a tugging force representing a fish nibbling at the hook. The User might pull on the fishing pole to hook the fish, and then feel the weight of the fish along the line added to the weight of the pole as the fish is reeled in. While an object is held, it can be shaken. A User can feel the changing inertia of an object that is moved back and forth at different rates. A virtual object might have liquid inside it that can be felt by a User through a force representation that varies based on how the liquid moves within the object. A User might feel an object slipping in his hands.
- the User might feel a force as the rope is pulled. Then the User could feel a lesser force if the rope slips through his hands until the rope is grabbed tighter, at which point the rope stops sliding and the User feels a larger force again.
- the User might also feel force variations representing the texture of an object that is sliding through his hands. An object that is held can be swung as well. The User can feel the weight of an object, and feel the force of the object change based on the relative positions of the Cursor (to which the object is attached) and the object.
- “Released” mode. In the second mode, the User is no longer directly controlling or manipulating the object. The object can then interact with the remainder of the simulated environment as any other simulated object.
- forces and motions that are most effective for efficient and intuitive User control of an object are not the same as forces and motions that are most suitable for the computer simulation.
- the physical constraints of the range of motion of an input device can be larger than the actual display, but smaller than the total simulated space. Accordingly, motions of the User to control an object that are directly mapped to motion in the simulated space can be ill-suited for efficient User interaction.
- forces applicable by the user and time over which such forces are applied can be different in interacting with a haptic input device than forces and times in the computer simulation.
- a user of a haptic input device might have only a few inches of motion to simulate shooting a basketball; while the simulation might represent a more complete range of travel for the shooting action.
- the basketball experienced by the user might be more effective if the mass represented by the force feedback to the haptic device is larger than that in the simulation (moving a larger basketball mass a shorter distance, in this simplified example).
- the time variable in a simulation can be different when the user is holding an object than when the object is released.
- Sophisticated user interaction experiences can be implemented by changing such parameters (e.g., mass, time, scale).
- the user can be provided with a sense of imparting energy to an object by applying a relatively large force; changing the time variable in the simulation when the user releases the object can interface the user's control of the object with the simulated environment in a manner that provides a unique user experience. Even if the properties of the simulated object are unchanged, the rate at which the object's position, velocity, or other characteristics are updated can be different in the two modes.
- the object's position and force interactions can be updated at a rate amenable to haptics interaction (e.g., 1000 Hz).
- the object's position can be updated at a rate amenable to the simulation or to the display characteristics desired (e.g., 60 Hz).
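The decoupled rates can be sketched as one loop ticking at the haptic rate while a coarser accumulator drives simulation/display updates. The rates are the examples given in the text (1000 Hz haptics, 60 Hz simulation); the integer-microsecond bookkeeping is an implementation assumption to avoid drift.

```python
HAPTIC_HZ = 1000  # force-update rate amenable to haptics (from the text)
SIM_HZ = 60       # simulation/display update rate (from the text)

def run(ms):
    """Count haptic and simulation updates over `ms` milliseconds,
    with one haptic tick per millisecond driving the loop."""
    haptic_steps = sim_steps = 0
    accum = 0  # microseconds owed to the simulation clock
    for _ in range(ms):
        haptic_steps += 1                       # force update every tick
        accum += 1_000_000 // HAPTIC_HZ
        if accum >= 1_000_000 // SIM_HZ:        # coarser sim update due
            sim_steps += 1
            accum -= 1_000_000 // SIM_HZ
    return haptic_steps, sim_steps
```

Over one second this yields roughly 1000 haptic updates and 60 simulation updates, letting each side run at the rate that suits it.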
- the behavior of an object is simulated by determining acceleration (a) by dividing the vector sum of forces determined to be acting on the object by the mass assigned to the object.
- the current velocity (V) of the object can be determined by adding the previously determined velocity (Vo) to the acceleration (a) times the time step (t).
- changing the effective time (t) on transition between modes can allow effective user interaction in some applications.
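The acceleration and velocity updates described above, with an effective time step that changes on the transition between holding and released modes, amount to an explicit Euler step. The function and step values below are hypothetical illustrations:

```python
def step(forces, mass, v0, dt):
    """One explicit Euler step: a = sum(F)/m, then v = v0 + a*t."""
    a = sum(forces) / mass
    return v0 + a * dt

DT_HELD = 0.001       # finer effective time step while the user holds the object
DT_RELEASED = 1 / 60  # coarser effective step once the object is released

# Same 2 N net force on a 0.5 kg object; only the effective
# time step changes when the object is released:
v_held = step([1.5, 0.5], 0.5, 0.0, DT_HELD)
v_free = step([1.5, 0.5], 0.5, v_held, DT_RELEASED)
```

Changing `dt` at the mode boundary, as the text suggests, alters how quickly the same forces translate into simulated velocity in each mode.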
- the separation of the two modes can be especially beneficial in a game or other simulation that relies on physics-based simulations of objects within the game.
- the physics simulation, sometimes accelerated with special-purpose hardware, is often designed for efficient implementation of the desired game characteristics. These characteristics are often not directly translatable to those desired for efficient and intuitive haptic interactions.
- the human-computer interface can optimize the haptic interaction; e.g., by presenting to the user's touch an object with mass, inertia, or other properties that give the user intuitive haptic control of the object. These properties can be effectively decoupled from the underlying physics simulation, so that the haptic interaction and physics simulation can be separately optimized.
- the present invention can be especially useful in games where the user has experience or expectations in real world analogs to the game activity. For example, many users have experience throwing a football or shooting a basketball.
- the present invention can allow the haptic control of the throwing or shooting motion to be designed such that the user experiences forces and motions building on the real world experience, while the underlying game simulation can proceed at its slower iteration rate and with its game-based physics properties.
- Another example is a baseball game, where the speed and direction of motion of a haptic device, caused by the user in opposition to resistive forces, can be designed to provide a realistic throwing interaction.
- the change from holding to released mode can be an occasion for acceleration, deceleration, error correction or amplification, or other changes suitable to the change in modes.
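One way the holding-to-released transition could perform amplification and error correction, as described above, is to scale the device velocity into simulation units and snap it toward an ideal trajectory when the user's motion is close enough. The function, scale factor, and tolerance below are hypothetical sketches under those assumptions, not the patent's method:

```python
def release_velocity(device_v, scale=3.0, ideal_v=None, tol=0.5):
    """Hand an object from holding mode to released mode:
    amplify the device velocity into simulation units and,
    optionally, error-correct toward an ideal release velocity
    when every component is within tolerance of it."""
    v = [c * scale for c in device_v]
    if ideal_v is not None and all(abs(a - b) <= tol for a, b in zip(v, ideal_v)):
        return list(ideal_v)  # close enough: snap to the ideal throw
    return v

# A slightly imperfect throw is corrected to the ideal release:
release_velocity([1.0, 2.1], ideal_v=[3.0, 6.0])
```

Deceleration, or deliberate error amplification for difficulty tuning, would use the same hand-off point with a different correction rule.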
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/433,173 US20060277466A1 (en) | 2005-05-13 | 2006-05-13 | Bimodal user interaction with a simulated object |
PCT/US2006/042557 WO2007133251A2 (fr) | 2006-05-13 | 2006-10-30 | Interaction bimodale d'un utilisateur avec un objet virtuel |
US12/783,386 US9804672B2 (en) | 2005-05-13 | 2010-05-19 | Human-computer user interaction |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US68100705P | 2005-05-13 | 2005-05-13 | |
US11/433,173 US20060277466A1 (en) | 2005-05-13 | 2006-05-13 | Bimodal user interaction with a simulated object |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/783,386 Continuation US9804672B2 (en) | 2005-05-13 | 2010-05-19 | Human-computer user interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060277466A1 true US20060277466A1 (en) | 2006-12-07 |
Family
ID=38694353
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/433,173 Abandoned US20060277466A1 (en) | 2005-05-13 | 2006-05-13 | Bimodal user interaction with a simulated object |
US12/783,386 Active 2028-01-16 US9804672B2 (en) | 2005-05-13 | 2010-05-19 | Human-computer user interaction |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/783,386 Active 2028-01-16 US9804672B2 (en) | 2005-05-13 | 2010-05-19 | Human-computer user interaction |
Country Status (2)
Country | Link |
---|---|
US (2) | US20060277466A1 (fr) |
WO (1) | WO2007133251A2 (fr) |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090315839A1 (en) * | 2008-06-24 | 2009-12-24 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
US20100148999A1 (en) * | 2008-12-16 | 2010-06-17 | Casparian Mark A | Keyboard with user configurable granularity scales for pressure sensitive keys |
US20110095877A1 (en) * | 2008-12-16 | 2011-04-28 | Casparian Mark A | Apparatus and methods for mounting haptics actuation circuitry in keyboards |
US20110102326A1 (en) * | 2008-12-16 | 2011-05-05 | Casparian Mark A | Systems and methods for implementing haptics for pressure sensitive keyboards |
US8005656B1 (en) * | 2008-02-06 | 2011-08-23 | Ankory Ran | Apparatus and method for evaluation of design |
US8700829B2 (en) | 2011-09-14 | 2014-04-15 | Dell Products, Lp | Systems and methods for implementing a multi-function mode for pressure sensitive sensors and keyboards |
US20140104320A1 (en) * | 2012-10-17 | 2014-04-17 | Perceptive Pixel, Inc. | Controlling Virtual Objects |
US8711011B2 (en) | 2008-12-16 | 2014-04-29 | Dell Products, Lp | Systems and methods for implementing pressure sensitive keyboards |
US20150169174A1 (en) * | 2013-12-18 | 2015-06-18 | Dassault Systemes DELMIA Corp. | Posture Creation With Tool Pickup |
US9111005B1 (en) | 2014-03-13 | 2015-08-18 | Dell Products Lp | Systems and methods for configuring and controlling variable pressure and variable displacement sensor operations for information handling systems |
US9343248B2 (en) | 2013-08-29 | 2016-05-17 | Dell Products Lp | Systems and methods for implementing spring loaded mechanical key switches with variable displacement sensing |
US9368300B2 (en) | 2013-08-29 | 2016-06-14 | Dell Products Lp | Systems and methods for lighting spring loaded mechanical key switches |
US10025099B2 (en) | 2015-06-10 | 2018-07-17 | Microsoft Technology Licensing, Llc | Adjusted location hologram display |
US20180229078A1 (en) * | 2011-08-29 | 2018-08-16 | Icuemotion Llc | Inertial sensor motion tracking and stroke analysis system |
US10118696B1 (en) | 2016-03-31 | 2018-11-06 | Steven M. Hoffberg | Steerable rotating projectile |
CN108885523A (zh) * | 2016-03-31 | 2018-11-23 | 索尼公司 | 信息处理设备、显示控制方法和程序 |
US10668353B2 (en) | 2014-08-11 | 2020-06-02 | Icuemotion Llc | Codification and cueing system for sport and vocational activities |
US10854104B2 (en) | 2015-08-28 | 2020-12-01 | Icuemotion Llc | System for movement skill analysis and skill augmentation and cueing |
CN113420031A (zh) * | 2021-06-30 | 2021-09-21 | 中国航空油料有限责任公司 | 航油数据发布系统及航油数据发布方法 |
WO2022048403A1 (fr) * | 2020-09-01 | 2022-03-10 | 魔珐(上海)信息科技有限公司 | Procédé, appareil et système d'interaction multimodale sur la base de rôle virtuel, support de stockage et terminal |
CN114433506A (zh) * | 2022-04-11 | 2022-05-06 | 北京霍里思特科技有限公司 | 一种分选机 |
WO2023092229A1 (fr) * | 2021-11-26 | 2023-06-01 | Lululemon Athletica Canada Inc. | Procédé et système pour fournir des expériences d'interaction numériques multisensorielles |
US11712637B1 (en) | 2018-03-23 | 2023-08-01 | Steven M. Hoffberg | Steerable disk or ball |
Families Citing this family (174)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7834855B2 (en) | 2004-08-25 | 2010-11-16 | Apple Inc. | Wide touchpad on a portable computer |
KR100556503B1 (ko) * | 2002-11-26 | 2006-03-03 | 엘지전자 주식회사 | 건조기의 건조 시간제어 방법 |
US20080125224A1 (en) * | 2006-09-26 | 2008-05-29 | Pollatsek David | Method and apparatus for controlling simulated in flight realistic and non realistic object effects by sensing rotation of a hand-held controller |
US10697996B2 (en) * | 2006-09-26 | 2020-06-30 | Nintendo Co., Ltd. | Accelerometer sensing and object control |
US20080207331A1 (en) * | 2007-02-26 | 2008-08-28 | Theodore Beale | Artificial player character for massive multi-player on-line game |
US8326442B2 (en) * | 2007-05-25 | 2012-12-04 | International Business Machines Corporation | Constrained navigation in a three-dimensional (3D) virtual arena |
JP5014898B2 (ja) * | 2007-06-29 | 2012-08-29 | Thk株式会社 | ドライブシミュレータ用ステアリング及びドライブシミュレータ |
US11264139B2 (en) | 2007-11-21 | 2022-03-01 | Edda Technology, Inc. | Method and system for adjusting interactive 3D treatment zone for percutaneous treatment |
US10431001B2 (en) * | 2007-11-21 | 2019-10-01 | Edda Technology, Inc. | Method and system for interactive percutaneous pre-operation surgical planning |
US20090174679A1 (en) | 2008-01-04 | 2009-07-09 | Wayne Carl Westerman | Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface |
CN102186544B (zh) * | 2008-01-17 | 2014-05-14 | 维沃克斯公司 | 用于在采用每个具象的渲染环境的虚拟现实系统中提供实时的每个具象的流数据的可扩展技术 |
JP2009194597A (ja) * | 2008-02-14 | 2009-08-27 | Sony Corp | 送受信システム、送信装置、送信方法、受信装置、受信方法、提示装置、提示方法、プログラム、及び記録媒体 |
US8350843B2 (en) * | 2008-03-13 | 2013-01-08 | International Business Machines Corporation | Virtual hand: a new 3-D haptic interface and system for virtual environments |
US8203529B2 (en) * | 2008-03-13 | 2012-06-19 | International Business Machines Corporation | Tactile input/output device and system to represent and manipulate computer-generated surfaces |
JP2009271909A (ja) * | 2008-04-08 | 2009-11-19 | Canon Inc | 図形描画編集システム、図形描画編集装置及び図形描画編集方法 |
US8749495B2 (en) * | 2008-09-24 | 2014-06-10 | Immersion Corporation | Multiple actuation handheld device |
US8957835B2 (en) | 2008-09-30 | 2015-02-17 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US8310447B2 (en) * | 2008-11-24 | 2012-11-13 | Lsi Corporation | Pointing device housed in a writing device |
US8294047B2 (en) * | 2008-12-08 | 2012-10-23 | Apple Inc. | Selective input signal rejection and modification |
KR101531363B1 (ko) * | 2008-12-10 | 2015-07-06 | 삼성전자주식회사 | 이차원 인터랙티브 디스플레이에서 가상 객체 또는 시점을 제어하는 방법 |
US9489131B2 (en) * | 2009-02-05 | 2016-11-08 | Apple Inc. | Method of presenting a web page for accessibility browsing |
US8364314B2 (en) * | 2009-04-30 | 2013-01-29 | GM Global Technology Operations LLC | Method and apparatus for automatic control of a humanoid robot |
US20100306825A1 (en) * | 2009-05-27 | 2010-12-02 | Lucid Ventures, Inc. | System and method for facilitating user interaction with a simulated object associated with a physical location |
US8610744B2 (en) * | 2009-07-10 | 2013-12-17 | Adobe Systems Incorporated | Methods and apparatus for natural media painting using proximity-based tablet stylus gestures |
US9696842B2 (en) * | 2009-10-06 | 2017-07-04 | Cherif Algreatly | Three-dimensional cube touchscreen with database |
US20110096036A1 (en) * | 2009-10-23 | 2011-04-28 | Mcintosh Jason | Method and device for an acoustic sensor switch |
JP5269745B2 (ja) * | 2009-10-30 | 2013-08-21 | 任天堂株式会社 | オブジェクト制御プログラム、オブジェクト制御装置、オブジェクト制御システム及びオブジェクト制御方法 |
US9678508B2 (en) * | 2009-11-16 | 2017-06-13 | Flanders Electric Motor Service, Inc. | Systems and methods for controlling positions and orientations of autonomous vehicles |
US9256951B2 (en) * | 2009-12-10 | 2016-02-09 | Koninklijke Philips N.V. | System for rapid and accurate quantitative assessment of traumatic brain injury |
EP2339576B1 (fr) * | 2009-12-23 | 2019-08-07 | Google LLC | Entrée multimodale sur un dispositif électronique |
US11416214B2 (en) * | 2009-12-23 | 2022-08-16 | Google Llc | Multi-modal input on an electronic device |
WO2011100220A1 (fr) | 2010-02-09 | 2011-08-18 | The Trustees Of The University Of Pennsylvania | Systèmes et procédés permettant de fournir une rétroaction de vibration dans des systèmes robotiques |
US8683367B1 (en) * | 2010-02-24 | 2014-03-25 | The Boeing Company | Method and system for differentiating among a plurality of design expressions |
JP5087101B2 (ja) * | 2010-03-31 | 2012-11-28 | 株式会社バンダイナムコゲームス | プログラム、情報記憶媒体及び画像生成システム |
KR101640043B1 (ko) * | 2010-04-14 | 2016-07-15 | 삼성전자주식회사 | 가상 세계 처리 장치 및 방법 |
US9557814B2 (en) * | 2010-04-22 | 2017-01-31 | Sony Interactive Entertainment Inc. | Biometric interface for a handheld device |
US10922870B2 (en) * | 2010-06-01 | 2021-02-16 | Vladimir Vaganov | 3D digital painting |
US10217264B2 (en) * | 2010-06-01 | 2019-02-26 | Vladimir Vaganov | 3D digital painting |
US8966400B2 (en) * | 2010-06-07 | 2015-02-24 | Empire Technology Development Llc | User movement interpretation in computer generated reality |
US9274641B2 (en) * | 2010-07-08 | 2016-03-01 | Disney Enterprises, Inc. | Game pieces for use with touch screen devices and related methods |
US20120007808A1 (en) | 2010-07-08 | 2012-01-12 | Disney Enterprises, Inc. | Interactive game pieces using touch screen devices for toy play |
US8988445B2 (en) * | 2010-07-30 | 2015-03-24 | The Trustees Of The University Of Pennsylvania | Systems and methods for capturing and recreating the feel of surfaces |
US8282454B2 (en) | 2010-09-29 | 2012-10-09 | Nintendo Co., Ltd. | Video game systems and methods including moving the protected character based with the movement of unprotected game character(s) |
US9119655B2 (en) | 2012-08-03 | 2015-09-01 | Stryker Corporation | Surgical manipulator capable of controlling a surgical instrument in multiple modes |
US10281915B2 (en) | 2011-01-05 | 2019-05-07 | Sphero, Inc. | Multi-purposed self-propelled device |
US9114838B2 (en) | 2011-01-05 | 2015-08-25 | Sphero, Inc. | Self-propelled device for interpreting input from a controller device |
US9429940B2 (en) | 2011-01-05 | 2016-08-30 | Sphero, Inc. | Self propelled device with magnetic coupling |
US9090214B2 (en) | 2011-01-05 | 2015-07-28 | Orbotix, Inc. | Magnetically coupled accessory for a self-propelled device |
US9218316B2 (en) | 2011-01-05 | 2015-12-22 | Sphero, Inc. | Remotely controlling a self-propelled device in a virtualized environment |
WO2012103241A1 (fr) * | 2011-01-28 | 2012-08-02 | Yair Greenberg | Article produisant une réponse de contact et de déplacement guidés et procédé |
US9990856B2 (en) | 2011-02-08 | 2018-06-05 | The Trustees Of The University Of Pennsylvania | Systems and methods for providing vibration feedback in robotic systems |
WO2012125596A2 (fr) | 2011-03-12 | 2012-09-20 | Parshionikar Uday | Dispositif de commande universel pour dispositifs électroniques, gestion d'expressions de visage et détection de somnolence |
US20120244969A1 (en) | 2011-03-25 | 2012-09-27 | May Patents Ltd. | System and Method for a Motion Sensing Device |
TWI515603B (zh) * | 2011-03-28 | 2016-01-01 | 緯創資通股份有限公司 | 觸控反饋裝置、觸控反饋方法及觸控顯示裝置 |
JP5498437B2 (ja) * | 2011-05-25 | 2014-05-21 | 株式会社ソニー・コンピュータエンタテインメント | 情報処理装置、情報処理方法、情報処理プログラム、情報処理プログラムを記憶したコンピュータ読み取り可能な記録媒体、厚み領域設定装置、厚み領域設定方法、厚み領域設定プログラム、厚み領域設定プログラムを記憶したコンピュータ読み取り可能な記録媒体、仮想空間における面に関するデータ構造 |
FR2976700B1 (fr) * | 2011-06-17 | 2013-07-12 | Inst Nat Rech Inf Automat | Procede de generation d'ordres de commande de coordination d'organes de deplacement d'une plateforme animee et generateur correspondant. |
US9498720B2 (en) * | 2011-09-30 | 2016-11-22 | Microsoft Technology Licensing, Llc | Sharing games using personal audio/visual apparatus |
US9248372B2 (en) * | 2011-10-05 | 2016-02-02 | Wargaming.Net Llp | Using and exporting experience gained in a video game |
US9289685B2 (en) * | 2011-11-18 | 2016-03-22 | Verizon Patent And Licensing Inc. | Method and system for providing virtual throwing of objects |
KR102054370B1 (ko) * | 2011-11-23 | 2019-12-12 | 삼성전자주식회사 | 햅틱 피드백 방법 및 장치, 기계로 읽을 수 있는 저장 매체 및 휴대용 통신 단말 |
US20130198625A1 (en) * | 2012-01-26 | 2013-08-01 | Thomas G Anderson | System For Generating Haptic Feedback and Receiving User Inputs |
KR101754318B1 (ko) | 2012-02-06 | 2017-07-06 | 핫헤드 게임즈 인크. | 가상 경쟁 그룹 관리 시스템 및 방법 |
US9195369B2 (en) * | 2012-02-06 | 2015-11-24 | Hothead Games, Inc. | Virtual opening of boxes and packs of cards |
CN102662567B (zh) * | 2012-03-23 | 2016-01-13 | 腾讯科技(深圳)有限公司 | 触发网页内操作的方法及设备 |
US8698746B1 (en) * | 2012-04-24 | 2014-04-15 | Google Inc. | Automatic calibration curves for a pointing device |
US9183676B2 (en) * | 2012-04-27 | 2015-11-10 | Microsoft Technology Licensing, Llc | Displaying a collision between real and virtual objects |
US20130293580A1 (en) | 2012-05-01 | 2013-11-07 | Zambala Lllp | System and method for selecting targets in an augmented reality environment |
WO2013168413A1 (fr) * | 2012-05-08 | 2013-11-14 | 株式会社カプコン | Programme de jeu, dispositif de jeu et système de jeu |
US9292758B2 (en) | 2012-05-14 | 2016-03-22 | Sphero, Inc. | Augmentation of elements in data content |
US9827487B2 (en) | 2012-05-14 | 2017-11-28 | Sphero, Inc. | Interactive augmented reality using a self-propelled device |
WO2013173389A1 (fr) | 2012-05-14 | 2013-11-21 | Orbotix, Inc. | Fonctionnement d'un dispositif informatique par détection d'objets arrondis dans une image |
EP2856282A4 (fr) * | 2012-05-31 | 2015-12-02 | Nokia Technologies Oy | Appareil d'affichage |
US20140002336A1 (en) * | 2012-06-27 | 2014-01-02 | Greg D. Kaine | Peripheral device for visual and/or tactile feedback |
KR101398086B1 (ko) * | 2012-07-06 | 2014-05-30 | (주)위메이드엔터테인먼트 | 온라인 게임에서의 유저 제스처 입력 처리 방법 |
WO2014009561A2 (fr) * | 2012-07-13 | 2014-01-16 | Softkinetic Software | Procédé et système pour interactions homme-machine gestuelles simultanées à l'aide de points d'intérêt singuliers sur une main |
US10056791B2 (en) | 2012-07-13 | 2018-08-21 | Sphero, Inc. | Self-optimizing power transfer |
US8902159B1 (en) | 2012-07-24 | 2014-12-02 | John Matthews | Ergonomic support apparatus having situational sensory augmentation |
CN112932672A (zh) * | 2012-08-03 | 2021-06-11 | 史赛克公司 | 用于机器人外科手术的系统和方法 |
US9226796B2 (en) | 2012-08-03 | 2016-01-05 | Stryker Corporation | Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path |
US8831840B2 (en) | 2012-08-31 | 2014-09-09 | United Video Properties, Inc. | Methods and systems for producing the environmental conditions of a media asset in a vehicle |
US9621602B2 (en) * | 2012-11-27 | 2017-04-11 | Facebook, Inc. | Identifying and providing physical social actions to a social networking system |
US10346025B2 (en) * | 2013-02-05 | 2019-07-09 | Microsoft Technology Licensing, Llc | Friction field for fluid margin panning in a webpage |
US20180046265A1 (en) * | 2013-06-06 | 2018-02-15 | Idhl Holdings, Inc. | Latency Masking Systems and Methods |
US20160310753A1 (en) * | 2013-06-21 | 2016-10-27 | Brian Bravo | Body tuner system |
US9101838B2 (en) | 2013-07-26 | 2015-08-11 | David J. Dascher | Dual pivot game controller |
US20150041554A1 (en) * | 2013-08-06 | 2015-02-12 | Watergush, Inc. | Social media fountain |
GB201315228D0 (en) * | 2013-08-27 | 2013-10-09 | Univ London Queen Mary | Control methods for expressive musical performance from a keyboard or key-board-like interface |
US9411796B2 (en) * | 2013-09-04 | 2016-08-09 | Adobe Systems Incorporated | Smoothing paths in a graphical interface generated by drawing inputs |
CN103455623B (zh) * | 2013-09-12 | 2017-02-15 | 广东电子工业研究院有限公司 | 一种融合多种语言文献的聚类机制 |
US20150077345A1 (en) * | 2013-09-16 | 2015-03-19 | Microsoft Corporation | Simultaneous Hover and Touch Interface |
US9208765B1 (en) | 2013-09-18 | 2015-12-08 | American Megatrends, Inc. | Audio visual presentation with three-dimensional display devices |
US9411511B1 (en) * | 2013-09-19 | 2016-08-09 | American Megatrends, Inc. | Three-dimensional display devices with out-of-screen virtual keyboards |
US9335831B2 (en) * | 2013-10-14 | 2016-05-10 | Adaptable Keys A/S | Computer keyboard including a control unit and a keyboard screen |
US8825492B1 (en) * | 2013-10-28 | 2014-09-02 | Yousef A. E. S. M. Buhadi | Language-based video game |
US10168873B1 (en) | 2013-10-29 | 2019-01-01 | Leap Motion, Inc. | Virtual interactions for machine control |
US9996797B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Interactions with virtual objects for machine control |
JP5888309B2 (ja) * | 2013-10-31 | 2016-03-22 | カシオ計算機株式会社 | トレーニング支援装置およびシステム、フォーム解析装置および方法、ならびにプログラム |
US10416834B1 (en) | 2013-11-15 | 2019-09-17 | Leap Motion, Inc. | Interaction strength using virtual objects for machine control |
US9829882B2 (en) | 2013-12-20 | 2017-11-28 | Sphero, Inc. | Self-propelled device with center of mass drive system |
EP2891950B1 (fr) | 2014-01-07 | 2018-08-15 | Sony Depthsensing Solutions | Procédé de navigation homme-machine à base de gestes de la main tridimensionnels naturels |
EP3100972A4 (fr) * | 2014-01-27 | 2017-10-18 | Volvo Construction Equipment AB | Commande de stabilisateurs et de bouteurs utilisant un gui |
JP2015166890A (ja) * | 2014-03-03 | 2015-09-24 | ソニー株式会社 | 情報処理装置、情報処理システム、情報処理方法及びプログラム |
CN104134226B (zh) * | 2014-03-12 | 2015-08-19 | 腾讯科技(深圳)有限公司 | 一种虚拟场景中的声音模拟方法、装置及客户端设备 |
US11900734B2 (en) | 2014-06-02 | 2024-02-13 | Accesso Technology Group Plc | Queuing system |
GB201409764D0 (en) | 2014-06-02 | 2014-07-16 | Accesso Technology Group Plc | Queuing system |
US9919208B2 (en) * | 2014-12-11 | 2018-03-20 | Immersion Corporation | Video gameplay haptics |
US10768704B2 (en) | 2015-03-17 | 2020-09-08 | Whirlwind VR, Inc. | System and method for modulating a peripheral device based on an unscripted feed using computer vision |
US9707680B1 (en) | 2015-05-28 | 2017-07-18 | X Development Llc | Suggesting, selecting, and applying task-level movement parameters to implementation of robot motion primitives |
US10101157B2 (en) | 2015-09-14 | 2018-10-16 | Eric Bharucha | Free-space force feedback system |
US9846971B2 (en) | 2016-01-19 | 2017-12-19 | Disney Enterprises, Inc. | Systems and methods for augmenting an appearance of a hilt to simulate a bladed weapon |
US11351472B2 (en) | 2016-01-19 | 2022-06-07 | Disney Enterprises, Inc. | Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon |
US9971408B2 (en) | 2016-01-27 | 2018-05-15 | Ebay Inc. | Simulating touch in a virtual environment |
US11663783B2 (en) | 2016-02-10 | 2023-05-30 | Disney Enterprises, Inc. | Systems and methods for using augmented reality with the internet of things |
AU2017227708A1 (en) | 2016-03-01 | 2018-10-18 | ARIS MD, Inc. | Systems and methods for rendering immersive environments |
US10587834B2 (en) | 2016-03-07 | 2020-03-10 | Disney Enterprises, Inc. | Systems and methods for tracking objects for augmented reality |
JP2017170586A (ja) * | 2016-03-25 | 2017-09-28 | セイコーエプソン株式会社 | エンドエフェクター、ロボット、およびロボット制御装置 |
JP6382884B2 (ja) * | 2016-04-26 | 2018-08-29 | 日本電信電話株式会社 | 推定装置、推定方法、およびプログラム |
US9919213B2 (en) | 2016-05-03 | 2018-03-20 | Hothead Games Inc. | Zoom controls for virtual environment user interfaces |
US10905956B2 (en) | 2016-06-28 | 2021-02-02 | Rec Room Inc. | Systems and methods providing temporary decoupling of user avatar synchronicity for presence enhancing experiences |
US10004991B2 (en) | 2016-06-28 | 2018-06-26 | Hothead Games Inc. | Systems and methods for customized camera views in virtualized environments |
US10010791B2 (en) | 2016-06-28 | 2018-07-03 | Hothead Games Inc. | Systems and methods for customized camera views and customizable objects in virtualized environments |
US20180046352A1 (en) * | 2016-08-09 | 2018-02-15 | Matthew Johnson | Virtual cursor movement |
USD852209S1 (en) * | 2016-08-24 | 2019-06-25 | Beijing Kingsoft Internet Security Software Co., Ltd. | Mobile communication terminal with animated graphical user interface |
USD852210S1 (en) * | 2016-08-24 | 2019-06-25 | Beijing Kingsoft Internet Security Software Co., Ltd. | Mobile communication terminal with graphical user interface |
CN106446452B (zh) * | 2016-10-19 | 2019-03-29 | 南京信息工程大学 | 基于力触觉交互的拱梁弹簧形变模型的建模方法 |
US10493363B2 (en) * | 2016-11-09 | 2019-12-03 | Activision Publishing, Inc. | Reality-based video game elements |
US11205103B2 (en) | 2016-12-09 | 2021-12-21 | The Research Foundation for the State University | Semisupervised autoencoder for sentiment analysis |
KR20190110539A (ko) * | 2016-12-13 | 2019-09-30 | 딥모션, 인크. | 솔버용의 다수의 힘 어레이를 사용한 향상된 가상 현실 시스템 |
US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
CN110869879A (zh) * | 2017-06-30 | 2020-03-06 | 雷蛇(亚太)私人有限公司 | 利用力传感器和触觉致动器的可调节触觉反馈 |
CN107330858B (zh) * | 2017-06-30 | 2020-12-04 | 北京乐蜜科技有限责任公司 | 一种图片处理方法、装置、电子设备及存储介质 |
US10375930B1 (en) | 2017-07-07 | 2019-08-13 | Chad R. James | Animal training device that controls stimulus using proportional pressure-based input |
US11259121B2 (en) | 2017-07-21 | 2022-02-22 | Cirrus Logic, Inc. | Surface speaker |
US10481680B2 (en) | 2018-02-02 | 2019-11-19 | Disney Enterprises, Inc. | Systems and methods to provide a shared augmented reality experience |
US20190294249A1 (en) * | 2018-03-21 | 2019-09-26 | JANUS Research Group, Inc. | Systems and methods for haptic feedback in a virtual reality system |
US10546431B2 (en) | 2018-03-29 | 2020-01-28 | Disney Enterprises, Inc. | Systems and methods to augment an appearance of physical object for an augmented reality experience |
US10832537B2 (en) | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US10726201B1 (en) | 2018-06-29 | 2020-07-28 | Microsoft Technology Licensing, Llc | Creating and handling lambda functions in spreadsheet applications |
US10699068B2 (en) | 2018-06-29 | 2020-06-30 | Microsoft Technology Licensing, Llc | Distribution of lambda functions |
US11023669B2 (en) | 2018-06-29 | 2021-06-01 | Microsoft Technology Licensing, Llc | Rendering lambda functions in spreadsheet applications |
US11423116B2 (en) * | 2018-06-29 | 2022-08-23 | Microsoft Technology Licensing, Llc | Automatically creating lambda functions in spreadsheet applications |
WO2020026301A1 (fr) * | 2018-07-30 | 2020-02-06 | 株式会社ソニー・インタラクティブエンタテインメント | Dispositif de jeu et procédé de commande de jeu de golf |
US11269415B2 (en) | 2018-08-14 | 2022-03-08 | Cirrus Logic, Inc. | Haptic output systems |
US11312015B2 (en) * | 2018-09-10 | 2022-04-26 | Reliabotics LLC | System and method for controlling the contact pressure applied by an articulated robotic arm to a working surface |
US10974132B2 (en) | 2018-10-02 | 2021-04-13 | Disney Enterprises, Inc. | Systems and methods to provide a shared interactive experience across multiple presentation devices based on detection of one or more extraterrestrial bodies |
US11307730B2 (en) * | 2018-10-19 | 2022-04-19 | Wen-Chieh Geoffrey Lee | Pervasive 3D graphical user interface configured for machine learning |
GB201817495D0 (en) | 2018-10-26 | 2018-12-12 | Cirrus Logic Int Semiconductor Ltd | A force sensing system and method |
US11014008B2 (en) | 2019-03-27 | 2021-05-25 | Disney Enterprises, Inc. | Systems and methods for game profile development based on virtual and/or real activities |
CN109955225A (zh) * | 2019-03-28 | 2019-07-02 | 东南大学 | 一种并联式三自由度力反馈手控器及其控制方法 |
US10992297B2 (en) | 2019-03-29 | 2021-04-27 | Cirrus Logic, Inc. | Device comprising force sensors |
US11644370B2 (en) | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load |
US10828672B2 (en) | 2019-03-29 | 2020-11-10 | Cirrus Logic, Inc. | Driver circuitry |
US20200313529A1 (en) | 2019-03-29 | 2020-10-01 | Cirrus Logic International Semiconductor Ltd. | Methods and systems for estimating transducer parameters |
US10955955B2 (en) | 2019-03-29 | 2021-03-23 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors |
US11509292B2 (en) | 2019-03-29 | 2022-11-22 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter |
US10726683B1 (en) | 2019-03-29 | 2020-07-28 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using a two-tone stimulus |
US11004247B2 (en) | 2019-04-02 | 2021-05-11 | Adobe Inc. | Path-constrained drawing with visual properties based on drawing tool |
DK180359B1 (en) | 2019-04-15 | 2021-02-03 | Apple Inc | Accelerated scrolling and selection |
US10916061B2 (en) | 2019-04-24 | 2021-02-09 | Disney Enterprises, Inc. | Systems and methods to synchronize real-world motion of physical objects with presentation of virtual content |
CA3044587C (fr) * | 2019-05-28 | 2023-04-25 | Square Enix Ltd. | Controle du personnage joueur avec fonction de mouvement amelioree |
US10976825B2 (en) | 2019-06-07 | 2021-04-13 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system |
CN114008569A (zh) | 2019-06-21 | 2022-02-01 | 思睿逻辑国际半导体有限公司 | 用于在装置上配置多个虚拟按钮的方法和设备 |
US11216150B2 (en) | 2019-06-28 | 2022-01-04 | Wen-Chieh Geoffrey Lee | Pervasive 3D graphical user interface with vector field functionality |
US10918949B2 (en) | 2019-07-01 | 2021-02-16 | Disney Enterprises, Inc. | Systems and methods to provide a sports-based interactive experience |
TWI716964B (zh) * | 2019-08-09 | 2021-01-21 | 天下數位科技股份有限公司 | 洗牌機之輔助壓牌構造 |
US11408787B2 (en) | 2019-10-15 | 2022-08-09 | Cirrus Logic, Inc. | Control methods for a force sensor system |
CN110794385B (zh) * | 2019-10-18 | 2021-07-13 | 北京空间机电研究所 | 一种激光器零重力指向的评估方法及系统 |
US11380175B2 (en) | 2019-10-24 | 2022-07-05 | Cirrus Logic, Inc. | Reproducibility of haptic waveform |
US11545951B2 (en) | 2019-12-06 | 2023-01-03 | Cirrus Logic, Inc. | Methods and systems for detecting and managing amplifier instability |
US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator |
US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters |
US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive |
US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system |
US11552649B1 (en) | 2021-12-03 | 2023-01-10 | Cirrus Logic, Inc. | Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5741182A (en) * | 1994-06-17 | 1998-04-21 | Sports Sciences, Inc. | Sensing spatial movement |
US5833549A (en) * | 1995-11-14 | 1998-11-10 | Interactive Light, Inc. | Sports trainer and game |
US5846086A (en) * | 1994-07-01 | 1998-12-08 | Massachusetts Institute Of Technology | System for human trajectory learning in virtual environments |
US6033227A (en) * | 1996-02-26 | 2000-03-07 | Nec Corporation | Training apparatus |
US6343987B2 (en) * | 1996-11-07 | 2002-02-05 | Kabushiki Kaisha Sega Enterprises | Image processing device, image processing method and recording medium |
US20020021277A1 (en) * | 2000-04-17 | 2002-02-21 | Kramer James F. | Interface for controlling a graphical image |
US20020034980A1 (en) * | 2000-08-25 | 2002-03-21 | Thomas Lemmons | Interactive game via set top boxes |
US6366272B1 (en) * | 1995-12-01 | 2002-04-02 | Immersion Corporation | Providing interactions between simulated objects using force feedback |
US6402154B1 (en) * | 2000-12-15 | 2002-06-11 | Michael Hess | Simulated football game |
US20040053686A1 (en) * | 2002-09-12 | 2004-03-18 | Pacey Larry J. | Gaming machine performing real-time 3D rendering of gaming events |
US6712692B2 (en) * | 2002-01-03 | 2004-03-30 | International Business Machines Corporation | Using existing videogames for physical training and rehabilitation |
US6722888B1 (en) * | 1995-01-20 | 2004-04-20 | Vincent J. Macri | Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment |
US20040219980A1 (en) * | 2003-04-30 | 2004-11-04 | Nintendo Co., Ltd. | Method and apparatus for dynamically controlling camera parameters based on game play events |
US20050250083A1 (en) * | 1997-10-06 | 2005-11-10 | Macri Vincent J | Method and apparatus for instructors to develop pre-training lessons using controllable images |
US20070060337A1 (en) * | 2005-08-19 | 2007-03-15 | Aruze Corp. | Game program and game system |
US20070134639A1 (en) * | 2005-12-13 | 2007-06-14 | Jason Sada | Simulation process with user-defined factors for interactive user training |
US7259761B2 (en) * | 1998-07-17 | 2007-08-21 | Sensable Technologies, Inc. | Systems and methods for sculpting virtual objects in a haptic virtual reality environment |
US7544137B2 (en) * | 2003-07-30 | 2009-06-09 | Richardson Todd E | Sports simulation system |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4302190A (en) * | 1979-12-19 | 1981-11-24 | The United States Of America As Represented By The Secretary Of The Navy | Rifle recoil simulator |
US4380437A (en) * | 1981-09-04 | 1983-04-19 | Yarborough Jr G Wirth | Small weapons simulator |
US4909260A (en) * | 1987-12-03 | 1990-03-20 | American Health Products, Inc. | Portable belt monitor of physiological functions and sensors therefor |
US5232223A (en) * | 1992-03-24 | 1993-08-03 | Larry Dornbusch | Electronic game controller |
US5403192A (en) * | 1993-05-10 | 1995-04-04 | Cae-Link Corporation | Simulated human lung for anesthesiology simulation |
US5734373A (en) * | 1993-07-16 | 1998-03-31 | Immersion Human Interface Corporation | Method and apparatus for controlling force feedback interface systems utilizing a host computer |
US5542672A (en) | 1995-03-17 | 1996-08-06 | Meredith; Chris | Fishing rod and reel electronic game controller |
US5825308A (en) * | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
US6155926A (en) * | 1995-11-22 | 2000-12-05 | Nintendo Co., Ltd. | Video game system and method with enhanced three-dimensional character and background control |
US6147674A (en) * | 1995-12-01 | 2000-11-14 | Immersion Corporation | Method and apparatus for designing force sensations in force feedback computer applications |
US6219032B1 (en) * | 1995-12-01 | 2001-04-17 | Immersion Corporation | Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface |
US6374255B1 (en) * | 1996-05-21 | 2002-04-16 | Immersion Corporation | Haptic authoring |
US5921780A (en) * | 1996-06-28 | 1999-07-13 | Myers; Nicole J. | Racecar simulator and driver training system and method |
IL119463A (en) * | 1996-10-21 | 2000-08-31 | Kwalwasser Yaakov | Recoil simulator for a weapon |
JP3882287B2 (ja) * | 1997-03-07 | 2007-02-14 | Kabushiki Kaisha Sega | Fishing game device |
US6421048B1 (en) * | 1998-07-17 | 2002-07-16 | Sensable Technologies, Inc. | Systems and methods for interacting with virtual objects in a haptic virtual reality environment |
US6695770B1 (en) * | 1999-04-01 | 2004-02-24 | Dominic Kin Leung Choy | Simulated human interaction systems |
US20040259644A1 (en) * | 1999-08-03 | 2004-12-23 | Mccauley Jack Jean | Method and device for optical gun interaction with a computer system |
US20020010021A1 (en) * | 1999-08-03 | 2002-01-24 | Mccauley Jack Jean | Method and device for optical gun interaction with a computer game system |
US6592461B1 (en) * | 2000-02-04 | 2003-07-15 | Roni Raviv | Multifunctional computer interactive play system |
US6761637B2 (en) * | 2000-02-22 | 2004-07-13 | Creative Kingdoms, Llc | Method of game play using RFID tracking device |
US7445550B2 (en) * | 2000-02-22 | 2008-11-04 | Creative Kingdoms, Llc | Magical wand and interactive play experience |
US6554706B2 (en) * | 2000-05-31 | 2003-04-29 | Gerard Jounghyun Kim | Methods and apparatus of displaying and evaluating motion data in a motion game apparatus |
US20020119822A1 (en) * | 2001-02-28 | 2002-08-29 | Kunzle Adrian E. | Systems and methods wherein a player device continues game play independent of a determination of player input validity |
US7202851B2 (en) | 2001-05-04 | 2007-04-10 | Immersion Medical Inc. | Haptic interface for palpation simulation |
JP3442754B2 (ja) * | 2001-08-10 | 2003-09-02 | Konami Computer Entertainment Tokyo, Inc. | Gun shooting game device, computer control method, and program |
US6817979B2 (en) * | 2002-06-28 | 2004-11-16 | Nokia Corporation | System and method for interacting with a user's virtual physiological model via a mobile terminal |
AU2003275265A1 (en) * | 2002-09-26 | 2004-04-19 | Robert Levine | Medical instruction using a virtual patient |
US20040180719A1 (en) * | 2002-12-04 | 2004-09-16 | Philip Feldman | Game controller support structure and isometric exercise system and method of facilitating user exercise during game interaction |
US20040211104A1 (en) * | 2003-04-28 | 2004-10-28 | Eberle Glen Richard | Universal modular gunstock |
US8992322B2 (en) * | 2003-06-09 | 2015-03-31 | Immersion Corporation | Interactive gaming systems with haptic feedback |
JP3931889B2 (ja) * | 2003-08-19 | 2007-06-20 | Sony Corporation | Image display system, image display device, and image display method |
US20050191601A1 (en) * | 2004-02-26 | 2005-09-01 | Vojtech Dvorak | Training weapon |
US7806759B2 (en) * | 2004-05-14 | 2010-10-05 | Konami Digital Entertainment, Inc. | In-game interface with performance feedback |
WO2006020846A2 (fr) * | 2004-08-11 | 2006-02-23 | THE GOVERNMENT OF THE UNITED STATES OF AMERICA as represented by THE SECRETARY OF THE NAVY Naval Research Laboratory | Simulated locomotion method and corresponding apparatus |
- 2006
  - 2006-05-13 US US11/433,173 patent/US20060277466A1/en not_active Abandoned
  - 2006-10-30 WO PCT/US2006/042557 patent/WO2007133251A2/fr active Application Filing
- 2010
  - 2010-05-19 US US12/783,386 patent/US9804672B2/en active Active
Patent Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5741182A (en) * | 1994-06-17 | 1998-04-21 | Sports Sciences, Inc. | Sensing spatial movement |
US5846086A (en) * | 1994-07-01 | 1998-12-08 | Massachusetts Institute Of Technology | System for human trajectory learning in virtual environments |
US6966778B2 (en) * | 1995-01-20 | 2005-11-22 | Vincent J. Macri | Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment |
US6722888B1 (en) * | 1995-01-20 | 2004-04-20 | Vincent J. Macri | Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment |
US5833549A (en) * | 1995-11-14 | 1998-11-10 | Interactive Light, Inc. | Sports trainer and game |
US6366272B1 (en) * | 1995-12-01 | 2002-04-02 | Immersion Corporation | Providing interactions between simulated objects using force feedback |
US7158112B2 (en) * | 1995-12-01 | 2007-01-02 | Immersion Corporation | Interactions between simulated objects with force feedback |
US20060030383A1 (en) * | 1995-12-01 | 2006-02-09 | Rosenberg Louis B | Force feedback device for simulating combat |
US6033227A (en) * | 1996-02-26 | 2000-03-07 | Nec Corporation | Training apparatus |
US6343987B2 (en) * | 1996-11-07 | 2002-02-05 | Kabushiki Kaisha Sega Enterprises | Image processing device, image processing method and recording medium |
US20050250083A1 (en) * | 1997-10-06 | 2005-11-10 | Macri Vincent J | Method and apparatus for instructors to develop pre-training lessons using controllable images |
US7259761B2 (en) * | 1998-07-17 | 2007-08-21 | Sensable Technologies, Inc. | Systems and methods for sculpting virtual objects in a haptic virtual reality environment |
US20020021277A1 (en) * | 2000-04-17 | 2002-02-21 | Kramer James F. | Interface for controlling a graphical image |
US20020034980A1 (en) * | 2000-08-25 | 2002-03-21 | Thomas Lemmons | Interactive game via set top boxes |
US6402154B1 (en) * | 2000-12-15 | 2002-06-11 | Michael Hess | Simulated football game |
US6712692B2 (en) * | 2002-01-03 | 2004-03-30 | International Business Machines Corporation | Using existing videogames for physical training and rehabilitation |
US20040053686A1 (en) * | 2002-09-12 | 2004-03-18 | Pacey Larry J. | Gaming machine performing real-time 3D rendering of gaming events |
US20040219980A1 (en) * | 2003-04-30 | 2004-11-04 | Nintendo Co., Ltd. | Method and apparatus for dynamically controlling camera parameters based on game play events |
US7544137B2 (en) * | 2003-07-30 | 2009-06-09 | Richardson Todd E | Sports simulation system |
US20070060337A1 (en) * | 2005-08-19 | 2007-03-15 | Aruze Corp. | Game program and game system |
US20070134639A1 (en) * | 2005-12-13 | 2007-06-14 | Jason Sada | Simulation process with user-defined factors for interactive user training |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8005656B1 (en) * | 2008-02-06 | 2011-08-23 | Ankory Ran | Apparatus and method for evaluation of design |
WO2010008680A3 (fr) * | 2008-06-24 | 2010-03-11 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
US8502795B2 (en) | 2008-06-24 | 2013-08-06 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
US20090315839A1 (en) * | 2008-06-24 | 2009-12-24 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
US8154524B2 (en) | 2008-06-24 | 2012-04-10 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
CN102132253A (zh) * | 2008-06-24 | 2011-07-20 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
US20110095877A1 (en) * | 2008-12-16 | 2011-04-28 | Casparian Mark A | Apparatus and methods for mounting haptics actuation circuitry in keyboards |
US20110102326A1 (en) * | 2008-12-16 | 2011-05-05 | Casparian Mark A | Systems and methods for implementing haptics for pressure sensitive keyboards |
US20100148999A1 (en) * | 2008-12-16 | 2010-06-17 | Casparian Mark A | Keyboard with user configurable granularity scales for pressure sensitive keys |
US8674941B2 (en) | 2008-12-16 | 2014-03-18 | Dell Products, Lp | Systems and methods for implementing haptics for pressure sensitive keyboards |
US9342149B2 (en) | 2008-12-16 | 2016-05-17 | Dell Products Lp | Systems and methods for implementing haptics for pressure sensitive keyboards |
US9791941B2 (en) | 2008-12-16 | 2017-10-17 | Dell Products Lp | Keyboard with user configurable granularity scales for pressure sensitive keys |
US8711011B2 (en) | 2008-12-16 | 2014-04-29 | Dell Products, Lp | Systems and methods for implementing pressure sensitive keyboards |
US8760273B2 (en) | 2008-12-16 | 2014-06-24 | Dell Products, Lp | Apparatus and methods for mounting haptics actuation circuitry in keyboards |
US9246487B2 (en) | 2008-12-16 | 2016-01-26 | Dell Products Lp | Keyboard with user configurable granularity scales for pressure sensitive keys |
US20180229078A1 (en) * | 2011-08-29 | 2018-08-16 | Icuemotion Llc | Inertial sensor motion tracking and stroke analysis system |
US10610732B2 (en) * | 2011-08-29 | 2020-04-07 | Icuemotion Llc | Inertial sensor motion tracking and stroke analysis system |
US8700829B2 (en) | 2011-09-14 | 2014-04-15 | Dell Products, Lp | Systems and methods for implementing a multi-function mode for pressure sensitive sensors and keyboards |
US20140104320A1 (en) * | 2012-10-17 | 2014-04-17 | Perceptive Pixel, Inc. | Controlling Virtual Objects |
US9589538B2 (en) * | 2012-10-17 | 2017-03-07 | Perceptive Pixel, Inc. | Controlling virtual objects |
US9368300B2 (en) | 2013-08-29 | 2016-06-14 | Dell Products Lp | Systems and methods for lighting spring loaded mechanical key switches |
US9959996B2 (en) | 2013-08-29 | 2018-05-01 | Dell Products Lp | Systems and methods for lighting spring loaded mechanical key switches |
US9343248B2 (en) | 2013-08-29 | 2016-05-17 | Dell Products Lp | Systems and methods for implementing spring loaded mechanical key switches with variable displacement sensing |
US9256348B2 (en) * | 2013-12-18 | 2016-02-09 | Dassault Systemes Americas Corp. | Posture creation with tool pickup |
US20150169174A1 (en) * | 2013-12-18 | 2015-06-18 | Dassault Systemes DELMIA Corp. | Posture Creation With Tool Pickup |
US9111005B1 (en) | 2014-03-13 | 2015-08-18 | Dell Products Lp | Systems and methods for configuring and controlling variable pressure and variable displacement sensor operations for information handling systems |
US10668353B2 (en) | 2014-08-11 | 2020-06-02 | Icuemotion Llc | Codification and cueing system for sport and vocational activities |
US11455834B2 (en) | 2014-08-11 | 2022-09-27 | Icuemotion Llc | Codification and cueing system for sport and vocational activities |
US10025099B2 (en) | 2015-06-10 | 2018-07-17 | Microsoft Technology Licensing, Llc | Adjusted location hologram display |
US11367364B2 (en) | 2015-08-28 | 2022-06-21 | Icuemotion Llc | Systems and methods for movement skill analysis and skill augmentation |
US11763697B2 (en) | 2015-08-28 | 2023-09-19 | Icuemotion Llc | User interface system for movement skill analysis and skill augmentation |
US10854104B2 (en) | 2015-08-28 | 2020-12-01 | Icuemotion Llc | System for movement skill analysis and skill augmentation and cueing |
US10118696B1 (en) | 2016-03-31 | 2018-11-06 | Steven M. Hoffberg | Steerable rotating projectile |
US11301031B2 (en) | 2016-03-31 | 2022-04-12 | Sony Corporation | Information processing apparatus and display control method to control a display position of a virtual object |
US11230375B1 (en) | 2016-03-31 | 2022-01-25 | Steven M. Hoffberg | Steerable rotating projectile |
CN108885523A (zh) * | 2016-03-31 | 2018-11-23 | Sony Corporation | Information processing device, display control method, and program |
US11712637B1 (en) | 2018-03-23 | 2023-08-01 | Steven M. Hoffberg | Steerable disk or ball |
WO2022048403A1 (fr) * | 2020-09-01 | 2022-03-10 | 魔珐(上海)信息科技有限公司 | Virtual character-based multimodal interaction method, apparatus and system, storage medium, and terminal |
CN113420031A (zh) * | 2021-06-30 | 2021-09-21 | 中国航空油料有限责任公司 | Aviation fuel data publishing system and aviation fuel data publishing method |
WO2023092229A1 (fr) * | 2021-11-26 | 2023-06-01 | Lululemon Athletica Canada Inc. | Method and system for providing multisensory digital interaction experiences |
CN114433506A (zh) * | 2022-04-11 | 2022-05-06 | 北京霍里思特科技有限公司 | Sorting machine |
Also Published As
Publication number | Publication date |
---|---|
WO2007133251A2 (fr) | 2007-11-22 |
US9804672B2 (en) | 2017-10-31 |
WO2007133251A3 (fr) | 2009-05-22 |
US20100261526A1 (en) | 2010-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060277466A1 (en) | Bimodal user interaction with a simulated object | |
CA2254854C (fr) | Method and apparatus for producing simulated physical interactions in computer-generated environments | |
Andrews et al. | Hapticast: a physically-based 3D game with haptic feedback | |
US20040130525A1 (en) | Dynamic touch screen amusement game controller | |
CN110382064A (zh) | Method and system for controlling a game using sensors of a control device | |
US10328339B2 (en) | Input controller and corresponding game mechanics for virtual reality systems | |
US7682250B2 (en) | Method and apparatus for simulating interactive spinning bar gymnastics on a 3D display | |
CN105764582B (zh) | Game device, game system, program, and recording medium | |
JP5124553B2 (ja) | Game program, game device, and game control method | |
KR20090003337A (ko) | Method for automatically adapting a virtual equipment model | |
WO2007032122A1 (fr) | Game control program, game machine, and game control method | |
US20060199626A1 (en) | In-game shot aiming indicator | |
JP5258710B2 (ja) | Game program, game device, and game control method | |
JP5736407B2 (ja) | Game program and game device | |
Bozgeyikli et al. | Tangiball: Dynamic embodied tangible interaction with a ball in virtual reality | |
Hemmingsen | Movement compression, sports and eSports | |
TW201503938A (zh) | 360-degree surround virtual reality fishing system | |
JP3835477B2 (ja) | Program for controlling execution of a game, and game device executing the program | |
JP6863678B2 (ja) | Program and game device | |
Richard et al. | Modeling dynamic interaction in virtual environments and the evaluation of dynamic virtual fixtures | |
JP2014144360A (ja) | Game program and game device | |
US20040113931A1 (en) | Human-computer interfaces incorporating haptics and path-based interaction | |
Pitura | Sword fighting in virtual reality: Where are we and how do we make it real | |
WO2009015500A1 (fr) | Method and device for controlling a movement sequence during a simulated sports game or event | |
JP2020189138A (ja) | Program and game device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: NOVINT TECHNOLOGIES, INC., NEW MEXICO. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ANDERSON, THOMAS G;REEL/FRAME:018544/0947. Effective date: 20061115 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |