US20030080987A1 - Methods and apparatus for providing haptic feedback in interacting with virtual pets - Google Patents
- Publication number
- US20030080987A1 (application US10/283,258)
- Authority
- US
- United States
- Prior art keywords
- user
- code
- virtual pet
- haptic effect
- sensation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
-
- A63F13/10—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06G—ANALOGUE COMPUTERS
- G06G7/00—Devices in which the computing operation is performed by varying electric or magnetic quantities
- G06G7/48—Analogue computers for specific processes, systems or devices, e.g. simulators
-
- A63F13/12—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/825—Fostering virtual characters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/54—Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1037—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/20—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
- A63F2300/204—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/53—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing
- A63F2300/538—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing for performing operations on behalf of the game client, e.g. rendering
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6063—Methods for processing data by generating or executing the game program for sound processing
- A63F2300/6081—Methods for processing data by generating or executing the game program for sound processing generating an output signal, e.g. under timing constraints, for spatialization
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/65—Methods for processing data by generating or executing the game program for computing the condition of a game character
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8058—Virtual breeding, e.g. tamagotchi
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Definitions
- This invention relates generally to haptic systems, and more particularly, to interactive simulations and interface devices that incorporate haptic feedback.
- Embodiments of the invention relate to methods and systems for providing haptic feedback to a user interacting with a simulated (or “virtual”) pet, so as to enhance the realism of the user's relationship with the virtual pet.
- the term “virtual pet” as used herein is construed broadly to refer to any simulated creature or character, which may or may not have a “real-life” counterpart.
- a method of providing haptic feedback to a user interacting with a virtual pet comprises: receiving a signal relating to a biological status of the virtual pet, and outputting, to the user, a haptic effect based on the received signal.
- FIG. 1 illustrates a flowchart depicting an embodiment of a method of the invention
- FIG. 2 shows a block diagram of an embodiment of a haptic system of the invention
- FIG. 3 depicts a block diagram of an alternative embodiment of a haptic system of the invention
- FIG. 4 shows a block diagram of an embodiment of a haptic feedback assembly of the invention
- FIG. 5 illustrates an embodiment of a single purring waveform
- FIG. 6 shows an embodiment of a continuous purring waveform
- FIG. 7 depicts an embodiment of a “healthy” heartbeat waveform
- FIG. 8 shows an embodiment of a “weakened-health” heartbeat waveform
- FIG. 9 illustrates an embodiment of a “near-death” heartbeat waveform
- FIG. 10 depicts an embodiment of an “excited” heartbeat waveform
- FIG. 11 shows an embodiment of a giggling sensation waveform.
- a method of providing haptic feedback to a user interacting with a virtual pet comprises: receiving a signal relating to a biological status of the virtual pet, and outputting, to the user, a haptic effect based on the received signal.
- biological status is construed broadly to refer to a “state of being” of a virtual pet, such as for example a health or emotional state.
- examples of the biological status include, but are not limited to: heartbeat, vitality, purring, giggling, being affectionate, and other personal traits.
- Such states of being are conveyed to a user by way of haptic effects generated based on the biological status of the virtual pet. The user may also experience responses related to feeding and other interactions with the virtual pet by way of appropriate haptic effects.
- the software application for controlling a virtual pet may be located on a local device (e.g., a computer or a hand-held device), where the signal relating to the biological status and associated haptic effect are determined at the local device.
- the software application for controlling a virtual pet may reside remotely, e.g., on a network resource, where the signal relating to the biological status along with associated haptic effect may be generated within the network and sent to a local device for interaction with the user.
- the controller executes the software so as to practice the above method, and the haptic feedback assembly is configured to output the haptic effect thus generated on the user-interface object.
- the haptic system further comprises a display screen for displaying a visual image of the virtual pet. It may additionally include an audio element for providing an audio cue associated with the biological status of the virtual pet. Such visual and audio effects may be produced and coordinated in a manner that complements the haptic sensation experienced by the user.
- the haptic system described above may be embodied in a computer, a cell phone, a personal digital assistant (PDA), a pager, a game console, a stand-alone toy device (e.g., Tamagotchi), or other types of hand-held electronic devices known in the art, which may be further equipped with network capabilities.
- FIG. 1 shows a flowchart 100 depicting a method of providing haptic feedback to a user interacting with a virtual pet, according to an embodiment of the invention. It will be appreciated that the embodiment of FIG. 1 is provided by way of example to illustrate the principles of the invention, and should not be construed as limiting the scope of the invention in any manner. One skilled in the art would also recognize that various changes and modifications can be made herein, without departing from the principles and scope of the invention.
- the flowchart 100 of FIG. 1 comprises receiving a signal relating to a biological status of a virtual pet, as recited in step 110 ; and outputting, to a user, a haptic effect based on the received signal, as recited in step 120 .
- the term “receiving” is defined broadly to refer to receiving a signal relating to a biological status of the virtual pet from (or within) a local device; or receiving a signal relating to a biological status of the virtual pet from an outside (or “remote”) source, such as a network resource.
- the former pertains to a situation where the software application for controlling the virtual pet is located on a local device (such as a computer or a hand-held device), as described in further detail in FIG. 2 below. In this situation, the signal relating to the biological status can be received, for example, by an actuator from a controller located within the local device.
- the latter pertains to a situation where the software application for controlling the virtual pet is remotely located on a network resource, where information related to the virtual pet is transmitted to a local device in contact with the user, as further depicted in FIG. 3.
- the signal relating to the biological status can be received from the network, for example, by an actuator located within the local device.
- biological status refers to a “state of being” (or behavior) of the virtual pet, such as a health or emotional state.
- biological status include, but are not limited to: heartbeat, vitality, purring, giggling, being affectionate, and other personal traits.
- a haptic effect is generated based on the received signal relating to the biological status of the virtual pet, and output to the user.
- the determination of the haptic effect may likewise be performed with a local device (such as a computer or a hand-held device).
- the determination of the haptic effect may also be performed within a network resource coupled to the local device; a signal or indication based on the determination of the haptic effect can be transmitted to the local device, which can output it to the user.
- the haptic effect thus generated serves to convey to the user a tactile or kinesthetic feedback associated with the biological state, hence enhancing the realism of the user-pet interaction.
- the user may also experience responses related to feeding and other interactions with the pet by way of appropriate haptic effects thus generated.
- haptic effect should be construed broadly as encompassing any type of force feedback, such as tactile or kinesthetic feedback, that is deemed appropriate for conveying a particular biological status of the virtual pet and thereby enhancing the realism of the user-pet interaction. See FIG. 4 for further detail.
- the embodiment of FIG. 1 may further comprise displaying a visual image of the virtual pet, as recited in step 130. It may also include generating an audio cue associated with the biological status of the virtual pet, as recited in step 140. Such visual and audio effects may be coordinated so as to complement the haptic sensation experienced by the user.
- the embodiment of FIG. 1 may additionally include modifying/updating the biological status of the virtual pet, as recited in step 150 .
- the user may take an action (e.g., touching or feeding the pet), which alters the biological status of the pet (e.g., causing purring or giggling).
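The receive-and-respond loop of FIG. 1 can be sketched as follows. This is a minimal illustration, not code from the patent; the status names and effect mapping are assumptions.

```python
# Hypothetical sketch of the FIG. 1 method: receive a signal relating to
# the pet's biological status (step 110), output the mapped haptic effect
# (step 120), and update the status after a user action (step 150).
# All names and mappings below are illustrative assumptions.

STATUS_TO_EFFECT = {
    "purring": "periodic_vibration",
    "giggling": "jolt_sequence",
    "healthy_heartbeat": "strong_pulse",
}

def output_haptic_effect(status: str) -> str:
    """Steps 110-120: map a received status signal to a haptic effect."""
    return STATUS_TO_EFFECT.get(status, "none")

def update_status(status: str, user_action: str) -> str:
    """Step 150: a user action (e.g., petting) alters the pet's status."""
    return "purring" if user_action == "pet" else status

status = update_status("healthy_heartbeat", "pet")
print(output_haptic_effect(status))  # periodic_vibration
```

In practice the status could equally be selected from a database or generated dynamically, as the embodiments below describe.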
- FIG. 2 depicts a block diagram of a haptic system 200 , which may be utilized to provide haptic feedback to a user interacting with a virtual pet, according to an embodiment of the invention.
- the haptic system 200 may include a user-interface object 210 , a haptic feedback assembly 220 , a local controller 230 , and memory 240 storing computer-executable software to be executed by the controller 230 .
- the haptic feedback assembly 220 is configured to provide haptic feedback to the user-interface object 210 .
- the haptic feedback assembly 220 may be mechanically integrated with the user-interface object 210 to form a “haptic-enabled” unitary device 250 , as described in further detail with respect to FIG. 4.
- the haptic feedback assembly 220 can be mechanically engaged with the user-interface object 210 in a manner that effectively transmits the force feedback.
- the haptic feedback assembly 220 and the user-interface object 210 are further in communication with the controller 230 , via for example a wired or wireless communication means known in the art.
- the computer-executable software stored in the memory 240 causes the local controller 230 to perform tasks when executing the software. More specifically, the computer-executable software causes the local controller 230 to receive an indicator or signal associated with a biological status of the virtual pet, which may be prompted by an input signal from the user-interface object 210 . The computer-executable software further causes the local controller 230 to generate an indicator or signal associated with a haptic effect based on the received indicator or signal associated with the biological status. The generated indicator or signal associated with the haptic effect causes the haptic feedback assembly 220 to output the haptic effect to the user. The biological status and/or the corresponding haptic effect may be selected, for example, from a database (e.g., stored in the memory 240 ), or generated in a dynamic manner.
- the haptic system 200 of FIG. 2 can optionally include a display screen 260 , in communication with the controller 230 , for displaying a visual image of the virtual pet.
- the haptic system 200 can optionally include an audio element 270, in communication with the controller 230, for providing an audio cue associated with the biological status of the virtual pet.
- the software for generating such visual and/or audio signals may be stored in the memory 240 and executable by the controller 230 .
- the visual, audio, and haptic effects as described may be produced and coordinated by the controller 230 in a manner that best enhances the realism of the user's interaction with the virtual pet.
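The controller's coordination of the three output channels might be sketched as a single lookup-and-dispatch step. The effect records here are illustrative assumptions, standing in for whatever database is stored in memory 240.

```python
# Hypothetical sketch of controller 230 coordinating the haptic feedback
# assembly (220), display screen (260), and audio element (270) from one
# biological-status determination. Record contents are assumptions.
from dataclasses import dataclass

@dataclass
class EffectRecord:
    haptic: str
    visual: str
    audio: str

# An effect database such as might be stored in memory 240.
EFFECT_DB = {
    "purring": EffectRecord("periodic_vibration", "content_expression", "purr.wav"),
    "hungry": EffectRecord("slow_weak_pulse", "sad_expression", "whimper.wav"),
}

def dispatch(status: str) -> list[str]:
    """Emit one coordinated command per output channel."""
    rec = EFFECT_DB[status]
    return [
        f"haptic_assembly:{rec.haptic}",  # to haptic feedback assembly 220
        f"display:{rec.visual}",          # to display screen 260
        f"audio:{rec.audio}",             # to audio element 270
    ]

print(dispatch("purring"))
```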
- FIG. 3 depicts a haptic system 300 pertaining to the networked scenario, according to an embodiment of the invention.
- a network resource 330 may be utilized, in lieu of (or in conjunction with) the local controller 230 and the memory 240 in the embodiment of FIG. 2.
- the network resource 330 may include for example a suitable controller and software executable by the controller (along with an appropriate database).
- Network resource 330 can operate in a manner similar to those described above with respect to FIG. 2.
- the network resource 330 may determine a biological status of the virtual pet, which may be prompted for example by an input signal from the user-interface object 210 .
- the network resource 330 can generate an indicator or signal associated with the determined biological status.
- the network resource 330 may further generate an indicator or signal associated with a haptic effect based on the biological status thus determined.
- the network resource 330 can send the indicator or signal associated with the haptic effect to the haptic feedback assembly 220, which outputs the haptic effect to the user-interface object 210.
- the network resource 330 may also provide an indicator or signal associated with a visual image of the virtual pet. Such an indicator or signal associated with a visual image of the virtual pet can be transmitted and displayed on the display screen 260 .
- the network resource 330 may generate an audio signal associated with the biological status of the virtual pet and transmit the audio signal to the audio element 270 . As described above with respect to FIG. 2, such visual and audio cues may be coordinated in a manner that best complements the haptic sensation experienced by the user.
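The division of labor in the networked variant of FIG. 3 can be illustrated as a simple message exchange: the network resource determines the status and effect, and the local device only renders what it receives. The message fields and JSON framing are assumptions for the sketch, not a protocol defined by the patent.

```python
# Hypothetical message flow for the FIG. 3 embodiment. Field names and
# the JSON encoding are illustrative assumptions.
import json

def network_resource(user_input: str) -> str:
    """Network resource 330: determine the biological status (possibly
    prompted by a user input signal) and the associated haptic effect."""
    status = "purring" if user_input == "pet" else "idle"
    effect = "periodic_vibration" if status == "purring" else "none"
    return json.dumps({"status": status, "haptic_effect": effect})

def local_device(message: str) -> str:
    """Local device: pass the received effect indicator on to the
    haptic feedback assembly 220 for output to the user."""
    return json.loads(message)["haptic_effect"]

print(local_device(network_resource("pet")))  # periodic_vibration
```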
- the haptic system 200 of FIG. 2 may be embodied for example in a personal computer (such as desktop or laptop), a work station, a kiosk, or one of a variety of home video game console systems commonly connected to a television set or other display screen.
- the user-interface object 210 may be for example a mouse, joystick, keyboard, touchpad, direction pad, gamepad, trackball, remote control, or other types of user-interface input devices known in the art.
- the user may interact with the virtual pet (e.g., touching) by way of manipulating a cursor on the display screen 260 (e.g., a monitor), for instance.
- the memory 240 includes, but is not limited to: random access memory (RAM), read-only memory (ROM), hard drives, DVD drives, CD-R/RW drive, floppy diskettes, photomagnetoelectric disks, magnetic tapes, or other data storage medium known in the art.
- the controller 230 may be implemented, e.g., as one or more processors in a computer.
- the haptic system 200 of FIG. 2 may also be embodied in a hand-held device, such as a cell phone, PDA, pager, self-contained electronic toy (e.g., a “Tamagotchi”), handheld video game unit (e.g., a Nintendo Game Boy), and the like.
- the user-interface object may be provided by one or more physical (or soft) keys, scrollwheels, switches, or other types of user-interface input devices.
- a touch screen may be employed to serve as both a user-interface input device and a display means.
- FIG. 4 shows a block diagram of an exemplary embodiment 420 of a haptic feedback assembly, which may be used to configure the haptic feedback assembly 220 of FIG. 2 (or 3 ).
- the haptic feedback assembly 420 may generally include a processor 421, one or more sensors 422 along with an associated sensor interface 423 for detecting the motion of a user-interface object 410, and one or more actuators 424 along with an associated actuator interface 425 for outputting forces on the user-interface object 410.
- the processor 421 may use the output from the sensors to control the actuators, so as to exert appropriate forces on the user-interface object 410 in accordance with its motion.
- the configuration and operation of such a haptic feedback assembly are described in greater detail in U.S. Pat. Nos. 5,734,373, 6,285,351, and 6,300,936, which are incorporated herein by reference.
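The sense-compute-actuate loop of the FIG. 4 assembly can be sketched in a few lines. The spring-like force law below is an illustrative assumption standing in for whatever force law a given effect uses.

```python
# Hypothetical sketch of one control tick in the FIG. 4 assembly:
# sensors (422) report motion, the processor (421) computes a force,
# and the actuators (424) apply it to the user-interface object (410).

def actuator_force(sensed_position: float, rest_position: float = 0.0,
                   stiffness: float = 2.0) -> float:
    """Processor 421: compute an actuator force from sensed motion.
    A simple spring law is assumed here for illustration."""
    return -stiffness * (sensed_position - rest_position)

# One control tick per sensed sample: sense, compute, actuate.
sensed = [0.5, 0.25, -0.1]
forces = [actuator_force(p) for p in sensed]
print(forces)
```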
- the haptic feedback assembly 420 and the user-interface object 410 may be mechanically integrated to form a “haptic-enabled” unitary device, such as the iFeel mouse manufactured by Logitech, Inc., and enabled by the TouchSense™ technology of Immersion Corporation.
- a mouse may be interfaced to a computer running virtual pet software (e.g., Internet-based virtual pet software from Neopets.com).
- Such software enables users to create their own pets, which may be selected from many different types and with a wide variety of characteristics.
- U.S. Pat. Nos. 6,211,861 and 6,429,846, for instance, disclose embodiments of “haptic-enabled” user-interface input devices, and are incorporated herein by reference.
- the haptic feedback assembly 420 may be configured to output any form of force feedback deemed suitable. In some applications, for instance, it may be desirable to effect tactile sensations, such as vibrations, pulses, and textures, on a user. In other applications, kinesthetic sensations may be produced in the degrees of freedom of motion of the user-manipulatable object (e.g., a joystick handle, mouse, or steering wheel), so as to provide more dynamic interactions between the user and the virtual pet.
- U.S. Pat. No. 5,734,373, which is incorporated herein by reference, discloses embodiments of generating tactile and kinesthetic feedback.
- embodiments of the invention may further allow the user to select or customize the haptic feedback that corresponds to a particular status of the virtual pet.
- a haptic effect that simulates a purring sensation may be output to the user by a haptic feedback assembly (e.g., the haptic feedback assembly 220 described above).
- the purring sensation may be triggered in response to the user “petting” the virtual pet with a cursor on the display screen (such as the display screen 260 of FIG. 2 or 3 ), e.g., by moving the cursor back and forth over the image of the displayed pet, or simply by actuating a petting button (e.g., an icon on the display screen, or a button on the user-interface object device 210 of FIG. 2 or 3 ).
- the purring sensation may be delivered to the user when the user engages or contacts the pet with a cursor for a predetermined period of time, thereby simulating the physical experience of a cat that purrs when petted and content, for instance.
- Such purring sensation may be delivered in the form of a periodic vibration by the haptic feedback assembly.
- the magnitude and/or frequency of the purring vibration may vary with time, e.g., depending upon the user interaction.
- FIG. 5 shows an example of a waveform that may be used for generating a purring vibration, where the magnitude ramps and declines over a period of time. Such a purring waveform may be further repeated over time, so as to provide a sequence of purring vibrations as depicted in FIG. 6.
- the vibration cycles in FIG. 6 may also have different characteristics (e.g., magnitudes and/or frequencies). For instance, when a user is petting a virtual pet over an extended period of time, the magnitude (and optionally the frequency) may progressively increase.
- a user may check the heartbeat of his/her virtual pet as a way of check the health condition of the pet.
- the user may enter an input signal to prompt the heartbeat “measure” via a user-interface input device (e.g., the user-interface object 210 described above). Consequently, a data signal or indicator may be transmitted to the haptic feedback assembly that outputs a pulsing sensation to the user.
- the rate or magnitude of the pulsing sensation may be used to indicate the health state of the virtual pet: for instance, a slow (low frequency) and/or weak (low magnitude) pulse may signal an unhealthy pet that needs care.
- FIG. 7 shows an example of a waveform for a relatively “healthy” heartbeat, indicating that the virtual pet is in good health. This waveform causes the output of a relatively high frequency and high magnitude pulsing sensation.
- FIG. 8 shows an alternative waveform for a weakened health heartbeat, which is lower in magnitude and frequency than that shown in FIG. 7. This sensation informs the user the deterioration in the pet's health.
- FIG. 9 shows another waveform for a “near-death” heartbeat, signaling the grave condition of the virtual pet. It will be appreciated that this waveform is in notable contrast with that illustrated in FIG. 7.
- the heartbeat may be used to indicate a state of “exertion” or “excitement” of the virtual pet, e.g., a rapid heartbeat may convey such a state to the user.
- FIG. 10 depicts an example of a waveform for an “excited” heartbeat.
- Such a waveform may also be output, for example, when the user visits the virtual pet after a long period of absence, or when the user rewards the virtual pet, etc.
- a heartbeat of exertion e.g., as a result of battling other virtual pets, or walking in a virtual world, etc.
- a giggling sensation may be delivered to the user by way of the haptic feedback assembly.
- the user may move a cursor back and forth over the image of the virtual pet to mimic the action of tickling.
- a giggling sensation may be delivered to the user as a vibration sensation with varying magnitude and frequency.
- FIG. 11 displays an example of a waveform that may be used to effect a giggling sensation.
- the amplitude of the high-frequency sinusoid wave exhibits a modulated “envelope,” which is shown to start at a high level, ramp down to a lower level, and then ramp back up, and so on.
- This provides a high-frequency pulsing that varies in magnitude over time.
- appropriate visual and audio effects corresponding to the tickling action may also be produced, to complement the giggling sensation.
- a routine activity is “feeding” the pet.
- a tactile feedback may be output to the user to effect a “feeding sensation.”
- a feeding sensation may be in the form of a series of jolts, indicating that the pet is gulping down food, for instance.
- the feeding sensation may be delivered to the user as a continuous vibration, indicating that the pet is drinking liquid, or chewing vigorously.
- the feeding sensation may be also be delivered in coordination with visual images of the pet moving its mouth in chewing or gulping motion, along with corresponding sound effects.
- haptic effects may be further devised to convey other characteristics and abilities of a virtual pet.
- a tactile sensation may be delivered to the user to signal a virtual pet wagging its tail, where the magnitude and frequency of the vibration may be correlated with the graphical image of wagging.
- Appropriate haptic sensations may also be generated, corresponding to a virtual pet wagging its ears, panting, scratching fur or flea bites, stretching, or sleeping.
- a virtual pet may be equipped with an extraordinary power, such as the ability to shoot lightening bolts or breath fire. An appropriate haptic sensation may be devised to convey such power, as well.
- a pet In a virtual pet environment, a pet is often given a set of statistics that document the strength and vitality of the creature. Such statistics may be used when two pets “do battle.” For instance, when one pet owner is trying to decide if his/her pet should battle another pet, he/she may check the strength statistics related to both pets.
- An effective way of getting a sense of the “strength” of a potential opponent is by way of haptic sensation.
- a user may put a cursor over the image of a particular pet and feel a haptic sensation that conveys the strength of the pet.
- the haptic sensation in this case may be delivered in the form of a vibration, characterized by a magnitude that is scaled in accordance with the pet's strength statistics, for instance.
- virtual pets may be characterized by “popularity” statistics.
- a haptic sensation may be associated with a popularity statistic.
- an “unpopular” pet may be assigned with a soft, low frequency tactile sensation; whereas a popular pet may dictate a strong, high frequency tactile sensation.
- haptic sensations may likewise associated with other statistics of virtual pets.
Abstract
Description
- The present application claims the benefit of U.S. Provisional Patent Application No. 60/336,411, entitled “Using Haptic Feedback Peripheral Devices to Enhance Interaction with Computer Simulated Pets,” filed on Oct. 30, 2001, which is incorporated herein by reference.
- This invention relates generally to haptic systems, and more particularly, to interactive simulations and interface devices that incorporate haptic feedback.
- The advent of the Internet and modern communication networks has brought renewed life to simulated (or “virtual”) pets. In addition to stand-alone electronic pet toys (e.g., those known as “Tamagotchi”; see U.S. Pat. No. 5,966,526 for example), a user nowadays can also create his/her own simulated (or “virtual”) pet, or order a virtual pet online, and rear the pet as he/she desires. Such virtual pets are typically programmed to adapt to their environments and to develop new traits and characteristics based upon their interactions with their owners. A virtual pet may further explore the online world, participate in events arranged by its owner, and so on. In such scenarios, however, the interaction between a virtual pet and its owner is limited to visual and/or auditory interaction. That is, the user misses the sense of touch with his/her pet that he/she would experience in the real world.
- Embodiments of the invention relate to methods and systems for providing haptic feedback to a user interacting with a simulated (or “virtual”) pet, so as to enhance the realism of the user's relationship with the virtual pet. The term “virtual pet” as used herein is construed broadly to refer to any simulated creature or character, which may or may not have a “real-life” counterpart.
- In one embodiment, a method of providing haptic feedback to a user interacting with a virtual pet comprises: receiving a signal relating to a biological status of the virtual pet, and outputting, to the user, a haptic effect based on the received signal.
- A further understanding of the invention will become apparent by reference to the remaining portions of the specification and drawings.
- FIG. 1 illustrates a flowchart depicting an embodiment of a method of the invention;
- FIG. 2 shows a block diagram of an embodiment of a haptic system of the invention;
- FIG. 3 depicts a block diagram of an alternative embodiment of a haptic system of the invention;
- FIG. 4 shows a block diagram of an embodiment of a haptic feedback assembly of the invention;
- FIG. 5 illustrates an embodiment of a single purring waveform;
- FIG. 6 shows an embodiment of a continuous purring waveform;
- FIG. 7 depicts an embodiment of a “healthy” heartbeat waveform;
- FIG. 8 shows an embodiment of a “weakened-health” heartbeat waveform;
- FIG. 9 illustrates an embodiment of a “near-death” heartbeat waveform;
- FIG. 10 depicts an embodiment of an “excited” heartbeat waveform; and
- FIG. 11 shows an embodiment of a giggling sensation waveform.
- In one embodiment, a method of providing haptic feedback to a user interacting with a virtual pet comprises: receiving a signal relating to a biological status of the virtual pet, and outputting, to the user, a haptic effect based on the received signal.
- As used herein, the term “biological status” is construed broadly to refer to a “state of being” of a virtual pet, such as for example a health or emotional state. Examples of the biological status include, but are not limited to: heartbeat, vitality, purring, giggling, being affectionate, and other personal traits. Such states of being are conveyed to a user by way of haptic effects generated based on the biological status of the virtual pet. The user may also experience responses related to feeding and other interactions with the virtual pet by way of appropriate haptic effects.
- The software application for controlling a virtual pet may be located on a local device (e.g., a computer or a hand-held device), where the signal relating to the biological status and associated haptic effect are determined at the local device. Alternatively, the software application for controlling a virtual pet may reside remotely, e.g., on a network resource, where the signal relating to the biological status along with associated haptic effect may be generated within the network and sent to a local device for interaction with the user.
- In another embodiment, a haptic system that provides haptic feedback to a user interacting with a virtual pet comprises: a user-interface object; a haptic feedback assembly coupled to the user-interface object; a controller in communication with the user-interface object and the haptic feedback assembly; and a memory storing software. The controller executes the software so as to practice the above method, and the haptic feedback assembly is configured to output the haptic effect thus generated on the user-interface object. In another embodiment, the haptic system further comprises a display screen for displaying a visual image of the virtual pet. It may additionally include an audio element for providing an audio cue associated with the biological status of the virtual pet. Such visual and audio effects may be produced and coordinated in a manner that complements the haptic sensation experienced by the user.
- The haptic system described above may be embodied in a computer, a cell phone, a personal digital assistant (PDA), a pager, a game console, a stand-alone toy device (e.g., Tamagotchi), or other types of hand-held electronic devices known in the art, which may be further equipped with network capabilities.
- FIG. 1 shows a
flowchart 100 depicting a method of providing haptic feedback to a user interacting with a virtual pet, according to an embodiment of the invention. It will be appreciated that the embodiment of FIG. 1 is provided by way of example to illustrate the principles of the invention, and should not be construed as limiting the scope of the invention in any manner. One skilled in the art would also recognize that various changes and modifications can be made herein, without departing from the principles and scope of the invention. - The
flowchart 100 of FIG. 1 comprises receiving a signal relating to a biological status of a virtual pet, as recited in step 110; and outputting, to a user, a haptic effect based on the received signal, as recited in step 120. - In
step 110 of FIG. 1, the term “receiving” is defined broadly to refer to receiving a signal relating to a biological status of the virtual pet from (or within) a local device; or receiving a signal relating to a biological status of the virtual pet from an outside (or “remote”) source, such as a network resource. The former pertains to a situation where the software application for controlling the virtual pet is located on a local device (such as a computer or a hand-held device), as described in further detail in FIG. 2 below. In this situation, the signal relating to the biological signal can be received, for example, by an actuator from a controller located within the local device. The latter pertains to a situation where the software application for controlling the virtual pet is remotely located on a network resource, where information related to the virtual pet is transmitted to a local device in contact with the user, as further depicted in FIG. 3. In this situation, the signal relating to the biological signal can be received from the network, for example, by an actuator located within the local device. - The term “biological status” refers to a “state of being” (or behavior) of the virtual pet, such as a health or emotional state. Examples of the biological status include, but are not limited to: heartbeat, vitality, purring, giggling, being affectionate, and other personal traits.
- In
step 120 of FIG. 1, a haptic effect is generated based on the received signal relating to the biological status of the virtual pet, and output to the user. The determination of the haptic effect may likewise be performed with a local device (such as a computer or a hand-held device). The determination of the haptic effect may also be performed within a network resource coupled to the local device; a signal or indication based on the determination of the haptic effect can be transmitted to the local device, which can output it to the user. The haptic effect thus generated serves to convey to the user a tactile or kinesthetic feedback associated with the biological state, hence enhancing the realism of the user-pet interaction. The user may also experience responses related to feeding and other interactions with the pet by way of appropriate haptic effects thus generated. - Further, the term “haptic effect” should be construed broadly as encompassing any type of force feedback, such as tactile or kinesthetic feedback, that is deemed appropriate for conveying a particular biological status of the virtual pet and thereby enhancing the realism of the user-pet interaction. See FIG. 4 for further detail.
- The embodiment of FIG. 1 may further comprise displaying a virtual image of the virtual pet, as recited in
step 130. It may also include generating an audio cue associated with the biological status of the virtual pet, as recited instep 140. Such visual and audio effects may be coordinated such to complement the haptic sensation experienced by the user. - The embodiment of FIG. 1 may additionally include modifying/updating the biological status of the virtual pet, as recited in
step 150. As a way of example, upon experiencing the haptic sensation related to a biological status (e.g., feeling lonely or hungry) of the virtual pet, the user may take action (e.g., touching or feeding the pet), which alters the biological status of the pet (e.g., purring or giggling). - The ensuing description discloses several embodiments, illustrating by way of example how the embodiment of FIG. 1 may be implemented. It will be appreciated that there are many alternative ways of practicing the present invention. Accordingly, various changes and modifications may be made herein, without departing from the principles and scope of the invention.
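The flow of FIG. 1 (steps 110 through 150) can be summarized in a short sketch. The following Python fragment is illustrative only; the names (`BIO_TO_HAPTIC`, `handle_status`) and the mapping entries are hypothetical and are not part of the patent disclosure:

```python
# Illustrative sketch of the FIG. 1 flow: receive a biological-status
# signal, derive a haptic effect, and output it to the user. The user's
# reaction may then alter the pet's status (step 150). All names are
# hypothetical.

BIO_TO_HAPTIC = {
    "purring": "periodic_vibration",      # cf. FIGS. 5-6
    "healthy_heartbeat": "strong_pulse",  # cf. FIG. 7
    "weak_heartbeat": "soft_slow_pulse",  # cf. FIGS. 8-9
    "giggling": "modulated_vibration",    # cf. FIG. 11
}

def handle_status(status, output_effect):
    """Step 110: receive a status signal; step 120: output the mapped effect."""
    effect = BIO_TO_HAPTIC.get(status)
    if effect is not None:
        output_effect(effect)  # e.g., drive the haptic feedback assembly
    return effect
```

Whether the lookup runs on the local device or on a network resource is immaterial to the sketch; the same status-to-effect mapping could equally be evaluated remotely and only the resulting indicator transmitted.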
- FIG. 2 depicts a block diagram of a
haptic system 200, which may be utilized to provide haptic feedback to a user interacting with a virtual pet, according to an embodiment of the invention. As a way of example, thehaptic system 200 may include a user-interface object 210, ahaptic feedback assembly 220, alocal controller 230, andmemory 240 storing computer-executable software to be executed by thecontroller 230. Thehaptic feedback assembly 220 is configured to provide haptic feedback to the user-interface object 210. For instance, thehaptic feedback assembly 220 may be mechanically integrated with the user-interface object 210 to form a “haptic-enabled”unitary device 250, as described in further detail with respect to FIG. 4. Alternatively, thehaptic feedback assembly 220 can be mechanically engaged with the user-interface object 210 in a manner that effectively transmits the force feedback. Thehaptic feedback assembly 220 and the user-interface object 210 are further in communication with thecontroller 230, via for example a wired or wireless communication means known in the art. - In the embodiment of FIG. 2, the computer-executable software stored in the
memory 240 causes thelocal controller 230 to perform tasks when executing the software. More specifically, the computer-executable software causes thelocal controller 230 to receive an indicator or signal associated with a biological status of the virtual pet, which may be prompted by an input signal from the user-interface object 210. The computer-executable software further causes thelocal controller 230 to generate an indicator or signal associated with a haptic effect based on the received indicator or signal associated with the biological status. The generated indicator or signal associated with the haptic effect causes thehaptic feedback assembly 220 to output the haptic effect to the user. The biological status and/or the corresponding haptic effect may be selected, for example, from a database (e.g., stored in the memory 240), or generated in a dynamic manner. - The
haptic system 200 of FIG. 2 can optionally include adisplay screen 260, in communication with thecontroller 230, for displaying a visual image of the virtual pet. Thehaptic system 200 can optionally include anaudio element 270, in communication with thecontroller 230, for providing an audio cue associated with the biological status of the virtue pet. The software for generating such visual and/or audio signals may be stored in thememory 240 and executable by thecontroller 230. As will be appreciated by those skilled in the art, the visual, audio, and haptic effects as described may be produced and coordinated by thecontroller 230 in a manner that best enhances the realism of the user's interaction with the virtual pet. - Situations may exist where software application controlling the virtual pet is located on a remote source such as for example a network resource, and an indicator or signal associated with the biological status and an indicator or signal associated with the corresponding haptic effect are sent (or downloaded) from the network resource to the haptic feedback assembly in a local device configured to be in contact with the user. FIG. 3 depicts a
haptic system 300 pertaining to this scenario, according to an embodiment of the invention. By way of example, the embodiment of FIG. 3 may be based upon the embodiment of FIG. 2, hence has like elements labeled with similar numerals. In this case, anetwork resource 330 may be utilized, in lieu (or in conjunction with) thelocal controller 230 and thememory 240 in the embodiment of FIG. 2. - In FIG. 3, the network resource330 (e.g., a network server) may include for example a suitable controller and software executable by the controller (along with an appropriate database).
Network resource 330 can operate in a manner similar to those described above with respect to FIG. 2. Thenetwork resource 330 may determine a biological status of the virtual pet, which may be prompted for example by an input signal from the user-interface object 210. Thenetwork resource 330 can generate an indicator or signal associated with the determined biological status. Thenetwork resource 330 may further generate an indicator or signal associated with a haptic effect based on the biological status thus determined. Thenetwork resource 330 can send the indicator or signal associated with the haptic effect,to thehaptic feedback assembly 220 to output the haptic effect to the user-interface object 210. Thenetwork resource 330 may also provide an indicator or signal associated with a visual image of the virtual pet. Such an indicator or signal associated with a visual image of the virtual pet can be transmitted and displayed on thedisplay screen 260. In addition, thenetwork resource 330 may generate an audio signal associated with the biological status of the virtual pet and transmit the audio signal to theaudio element 270. As described above with respect to FIG. 2, such visual and audio cues may be coordinated in a manner that best complements the haptic sensation experienced by the user. - The
haptic system 200 of FIG. 2 (or 3) may be embodied for example in a personal computer (such as desktop or laptop), a work station, a kiosk, or one of a variety of home video game console systems commonly connected to a television set or other display screen. The user-interface object 210 may be for example a mouse, joystick, keyboard, touchpad, direction pad, gamdpad, trackball, remote control, or other types of user-interface input devices known in the art. The user may interact with the virtual pet (e.g., touching) by way of manipulating a cursor on the display screen 260 (e.g., a monitor), for instance. Thememory 240 includes, but is not limited to: random access memory (RAM), read-only memory (ROM), hard drives, DVD drives, CD-R/RW drive, floppy diskettes, photomagnetoelectric disks, magnetic tapes, or other data storage medium known in the art. The controller 230 (e.g., one or more processors in a computer) may be further equipped with a networking capability (e.g., being able to be connected to the Internet), so as to enable the user and virtual pet to explore the World Wide Web, for instance. - The
haptic system 200 of FIG. 2 (or 3) may also be embodied in a hand-held device, such as a cell phone, PDA, pager, a self-contained electronic toy such as “Tomagotchi”, a handheld video game unit (e.g., Nintendo Gameboy), and the like. The user-interface object may be provided by one or more physical (or soft) keys, scrollwheels, switches, or other types of user-interface input devices. One skilled in the art will recognize in some of these devices, a touch screen may be employed to serve as both a user-interface input device and a display means. - FIG. 4 shows a block diagram of an
exemplary embodiment 420 of a haptic feedback assembly, which may be used to configure thehaptic feedback assembly 220 of FIG. 2 (or 3). Thehaptic feedback assembly 420 may generally include aprocessor 421, one ormore sensors 422 along with associatedsensor interface 423 for detecting the motion of a user-interface object 410, and one ormore actuators 424 along with associatedactuator interface 425 for outputting forces on the a user-interface object 410. Theprocessor 421 may use the output from the sensors to control the actuators, so as to exert appropriate forces on the user-interface object 410 in accordance with its motion. The configuration and operation of such a haptic feedback assembly are described in greater detail in U.S. Pat. Nos. 5,734,373, 6,285,351, and 6,300,936, which are incorporated herein by reference. - As described above, the
haptic feedback assembly 420 and the user-interface object 410 may be mechanically integrated to form a “haptic-enabled” unitary device, such as the iFeel mouse manufactured by Logitech, Inc., and enabled by the TouchSense™ technology of Immersion Corporation. In one embodiment, such a mouse may be interfaced to a computer running a virtual pet software (e.g., an Internet-based virtual pet software from Neopets.com). Such software enables users to create their own pets, which may be selected from many different types and with a wide variety of characteristics. U.S. Pat. Nos. 6,211,861 and 6,429,846, for instance, disclose embodiments on “haptic-enabled” user-interface input devices, which are incorporated herein by reference. - Further, the
haptic feedback assembly 420 may be configured to output any form of force feedback as deemed suitable. In some applications, for instance, it may be desirable to effect tactile sensations, such as vibrations, pulses, and textures, on a user. Whereas in other applications, kinesthetic sensations may be produced in the degrees of freedom of motion of the user-manipulatable object (e.g., a joystick handle, mouse, steering wheel, etc.), so as to provide more dynamic interactions between the user and virtual pet. U.S. Pat. No. 5,734,373 discloses embodiments on generating tactile and kinesthetic feedback, which is incorporated herein by reference. - Optionally, embodiments of the invention may further allow the user to select or customize the haptic feedback that corresponds to a particular status of the virtual pet.
- The ensuing description discloses embodiments on producing haptic sensations associated with various biological states of a virtual pet.
- When a user, interacting with a virtual pet, takes an action that makes the pet happy, a haptic effect that simulates a purring sensation may be output to the user by a haptic feedback assembly (e.g., the
haptic feedback assembly 220 described above). The purring sensation may be triggered in response to the user “petting” the virtual pet with a cursor on the display screen (such as thedisplay screen 260 of FIG. 2 or 3), e.g., by moving the cursor back and forth over the image of the displayed pet, or simply by actuating a petting button (e.g., an icon on the display screen, or a button on the user-interface object device 210 of FIG. 2 or 3). The purring sensation may be delivered to the user, when the user engages or contacts a pet with a cursor during a predetermined period of time, thereby simulating the physical experience of a cat that purrs when being pet and happy, for instance. Such purring sensation may be delivered in the form of a periodic vibration by the haptic feedback assembly. The magnitude and/or frequency of the purring vibration may vary with time, e.g., depending upon the user interaction. As a way of example, FIG. 5 shows an example of a waveform that may be used for generating a purring vibration, where the magnitude ramps and declines over a period of time. Such a purring waveform may be further repeated over time, so as to provide a sequence of purring vibrations as depicted in FIG. 6. - In some embodiments, the vibration cycles in FIG. 6 may also have different characteristics (e.g., magnitudes and/or frequencies). For instance, when a user is petting a virtual pet over an extended period of time, the magnitude (and optionally the frequency) may progressively increase.
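A purring vibration of this kind can be prototyped numerically. The sketch below is a hedged illustration, not the patent's implementation: the sample counts, the sinusoidal ramp shape, and the growth factor are all assumptions. Each cycle ramps up and declines (as in FIG. 5), and cycles repeat to form a sequence (as in FIG. 6):

```python
import math

def purr_cycle(n_samples=101, peak=1.0):
    """One purring cycle: magnitude ramps up to `peak`, then declines (cf. FIG. 5)."""
    return [peak * math.sin(math.pi * i / (n_samples - 1)) for i in range(n_samples)]

def purr_sequence(n_cycles=4, growth=0.2):
    """Repeat the cycle to form a purring sequence (cf. FIG. 6); the peak
    may grow from cycle to cycle, e.g., during sustained petting."""
    wave = []
    for c in range(n_cycles):
        wave.extend(purr_cycle(peak=1.0 + growth * c))
    return wave
```

The returned lists stand in for whatever sample format the haptic feedback assembly actually consumes; only the envelope shape is the point of the sketch.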
- In some embodiments, a user may check the heartbeat of his/her virtual pet as a way of check the health condition of the pet. The user may enter an input signal to prompt the heartbeat “measure” via a user-interface input device (e.g., the user-
interface object 210 described above). Consequently, a data signal or indicator may be transmitted to the haptic feedback assembly that outputs a pulsing sensation to the user. The rate or magnitude of the pulsing sensation may be used to indicate the health state of the virtual pet: for instance, a slow (low frequency) and/or weak (low magnitude) pulse may signal an unhealthy pet that needs care. - FIGS. 7, 8 and9 depict several waveforms that may be used for generating pulsing sensations related to different health conditions of a virtual pet. By way of example, FIG. 7 shows an example of a waveform for a relatively “healthy” heartbeat, indicating that the virtual pet is in good health. This waveform causes the output of a relatively high frequency and high magnitude pulsing sensation. FIG. 8 shows an alternative waveform for a weakened health heartbeat, which is lower in magnitude and frequency than that shown in FIG. 7. This sensation informs the user the deterioration in the pet's health. FIG. 9 shows another waveform for a “near-death” heartbeat, signaling the grave condition of the virtual pet. It will be appreciated that this waveform is in notable contrast with that illustrated in FIG. 7.
- In addition to health, the heartbeat may be used to indicate a state of “exertion” or “excitement” of the virtual pet, e.g., a rapid heartbeat may convey such a state to the user. By way of example, FIG. 10 depicts an example of a waveform for an “excited” heartbeat. Such a waveform may also be output, for example, when the user visits the virtual pet after a long period of absence, or when the user rewards the virtual pet, etc. In alternative embodiments, a heartbeat of exertion (e.g., as a result of battling other virtual pets, or walking in a virtual world, etc.) may be indicated by a waveform with gradually reduced magnitude and/or frequency.
- When a user interacts with a virtual pet in a manner that “tickles” the pet, a giggling sensation may be delivered to the user by way of the haptic feedback assembly. For example, the user may move a cursor back and forth over the image of the virtual pet to mimic the action of tickling. As a result, a giggling sensation may be delivered to the user as a vibration sensation with varying magnitude and frequency. By way of example, FIG. 11 displays an example of a waveform that may be used to effect a giggling sensation. In this example, the amplitude of the high-frequency sinusoid wave exhibits a modulated “envelope,” which is shown to start at a high level, ramp down to a lower level, and then ramp back up, and so on. This provides a high-frequency pulsing that varies in magnitude over time. As described above, appropriate visual and audio effects corresponding to the tickling action may also be produced, to complement the giggling sensation.
- In caring for a virtual (or real) pet, a routine activity is “feeding” the pet. When a virtual pet is eating, a tactile feedback may be output to the user to effect a “feeding sensation.” Such a feeding sensation may be in the form of a series of jolts, indicating that the pet is gulping down food, for instance. Alternatively, the feeding sensation may be delivered to the user as a continuous vibration, indicating that the pet is drinking liquid, or chewing vigorously. The feeding sensation may be also be delivered in coordination with visual images of the pet moving its mouth in chewing or gulping motion, along with corresponding sound effects.
- It will be appreciated that haptic effects may be further devised to convey other characteristics and abilities of a virtual pet. For example, a tactile sensation may be delivered to the user to signal a virtual pet wagging its tail, where the magnitude and frequency of the vibration may be correlated with the graphical image of wagging. Appropriate haptic sensations may also be generated, corresponding to a virtual pet wagging its ears, panting, scratching fur or flea bites, stretching, or sleeping. In addition, a virtual pet may be equipped with an extraordinary power, such as the ability to shoot lightening bolts or breath fire. An appropriate haptic sensation may be devised to convey such power, as well.
- In a virtual pet environment, a pet is often given a set of statistics that document the strength and vitality of the creature. Such statistics may be used when two pets “do battle.” For instance, when one pet owner is trying to decide if his/her pet should battle another pet, he/she may check the strength statistics related to both pets. An effective way of getting a sense of the “strength” of a potential opponent is by way of haptic sensation. As a way of example, a user may put a cursor over the image of a particular pet and feel a haptic sensation that conveys the strength of the pet. The haptic sensation in this case may be delivered in the form of a vibration, characterized by a magnitude that is scaled in accordance with the pet's strength statistics, for instance.
- Likewise, virtual pets may be characterized by “popularity” statistics. As in the case of the strength (or vitality) statistics, a haptic sensation may be associated with a popularity statistic. For example, an “unpopular” pet may be assigned a soft, low-frequency tactile sensation, whereas a popular pet may dictate a strong, high-frequency tactile sensation. Those skilled in the art will appreciate that haptic sensations may likewise be associated with other statistics of virtual pets.
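The soft/low-frequency to strong/high-frequency progression could be implemented as linear interpolation over the popularity score. In this sketch, the endpoint values (0.2-1.0 magnitude, 20-200 Hz) and the normalized popularity scale are assumptions chosen for illustration.

```python
def popularity_effect(popularity: float) -> dict:
    """Map a normalized popularity score in [0.0, 1.0] to vibration parameters.

    An unpopular pet (popularity near 0) gets a soft, low-frequency
    sensation; a popular pet (near 1) gets a strong, high-frequency one.
    """
    p = max(0.0, min(1.0, popularity))
    return {
        "magnitude": 0.2 + 0.8 * p,        # soft -> strong
        "frequency_hz": 20.0 + 180.0 * p,  # low -> high
    }
```

Other statistics (health, vitality, emotional state) could be mapped through analogous interpolation tables.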
- Those skilled in the art will recognize that the embodiments described above are provided by way of example, to elucidate the general principles of the invention. Various means and methods can be devised to perform the designated functions in an equivalent manner. Moreover, various changes, substitutions, and alterations can be made herein without departing from the principles and the scope of the invention.
Claims (40)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/283,258 US8788253B2 (en) | 2001-10-30 | 2002-10-30 | Methods and apparatus for providing haptic feedback in interacting with virtual pets |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US33641101P | 2001-10-30 | 2001-10-30 | |
US10/283,258 US8788253B2 (en) | 2001-10-30 | 2002-10-30 | Methods and apparatus for providing haptic feedback in interacting with virtual pets |
Publications (2)
Publication Number | Publication Date |
---|---|
US20030080987A1 true US20030080987A1 (en) | 2003-05-01 |
US8788253B2 US8788253B2 (en) | 2014-07-22 |
Family
ID=23315968
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/283,258 Expired - Fee Related US8788253B2 (en) | 2001-10-30 | 2002-10-30 | Methods and apparatus for providing haptic feedback in interacting with virtual pets |
Country Status (7)
Country | Link |
---|---|
US (1) | US8788253B2 (en) |
EP (1) | EP1440414B1 (en) |
JP (4) | JP2005511226A (en) |
KR (1) | KR20040062601A (en) |
CN (1) | CN100474216C (en) |
AU (1) | AU2002364690A1 (en) |
WO (1) | WO2003051062A2 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050124412A1 (en) * | 2003-12-05 | 2005-06-09 | Wookho Son | Haptic simulation system and method for providing real-time haptic interaction in virtual simulation |
EP2096520A1 (en) * | 2008-02-26 | 2009-09-02 | Deutsche Telekom AG | Mobile device and operation method |
US20100041312A1 (en) * | 2008-08-15 | 2010-02-18 | Paul King | Electronic toy and methods of interacting therewith |
US20100217883A1 (en) * | 2009-02-20 | 2010-08-26 | Drew Goya | Intelligent software agents for multiple platforms |
US8353767B1 (en) * | 2007-07-13 | 2013-01-15 | Ganz | System and method for a virtual character in a virtual world to interact with a user |
US20140078102A1 (en) * | 2012-02-03 | 2014-03-20 | Panasonic Corporation | Haptic feedback device, method for driving haptic feedback device, and drive program |
US8939840B2 (en) | 2009-07-29 | 2015-01-27 | Disney Enterprises, Inc. | System and method for playsets using tracked objects and corresponding virtual worlds |
WO2020124118A1 (en) * | 2018-12-21 | 2020-06-25 | Rasinger Pascale | Method and device for imitating a cat's purr |
CN112738537A (en) * | 2020-12-24 | 2021-04-30 | 珠海格力电器股份有限公司 | Virtual pet interaction method and device, electronic equipment and storage medium |
US11775134B2 (en) | 2017-11-13 | 2023-10-03 | Snap Inc. | Interface to display animated icon |
US11841989B2 (en) * | 2020-12-28 | 2023-12-12 | Beijing Xiaomi Mobile Software Co., Ltd. | Method, device, and storage medium for generating a haptic feedback |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2416962B (en) * | 2004-08-05 | 2009-04-01 | Vodafone Plc | New communication type for mobile telecommunications networks |
KR100715451B1 (en) * | 2004-12-28 | 2007-05-09 | 학교법인 성균관대학 | The five senses fusion and representation system using association function |
WO2007030603A2 (en) | 2005-09-08 | 2007-03-15 | Wms Gaming Inc. | Gaming machine having display with sensory feedback |
WO2007117418A2 (en) | 2006-03-31 | 2007-10-18 | Wms Gaming Inc. | Portable wagering game with vibrational cues and feedback mechanism |
US20090015563A1 (en) * | 2007-07-11 | 2009-01-15 | John Thomas Sadler | Stylized interactive icon for portable mobile communications device |
US8004391B2 (en) * | 2008-11-19 | 2011-08-23 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
EP2457142B1 (en) | 2009-07-22 | 2019-12-25 | Immersion Corporation | Interactive touch screen gaming metaphors with haptic feedback across platforms |
US20120302323A1 (en) | 2011-05-23 | 2012-11-29 | Wms Gaming Inc. | Haptic gaming chairs and wagering game systems and machines with a haptic gaming chair |
US9142083B2 (en) | 2011-06-13 | 2015-09-22 | Bally Gaming, Inc. | Convertible gaming chairs and wagering game systems and machines with a convertible gaming chair |
CN102843334A (en) * | 2011-06-20 | 2012-12-26 | 华为技术有限公司 | Interactive method of online application, server, client device and system |
US10976819B2 (en) | 2015-12-28 | 2021-04-13 | Microsoft Technology Licensing, Llc | Haptic feedback for non-touch surface interaction |
CN109045690B (en) * | 2018-07-27 | 2020-09-18 | 腾讯科技(深圳)有限公司 | Virtual pet obtaining method and device, electronic device and storage medium |
CN111784805B (en) * | 2020-07-03 | 2024-02-09 | 珠海金山数字网络科技有限公司 | Virtual character interaction feedback method and device |
KR20230155093A (en) | 2022-05-03 | 2023-11-10 | 주식회사 아이하랑 | Apparatus of pet growth learning applying the deep learning in the virtual reality and method of the same |
Citations (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US611577A (en) * | 1898-09-27 | griffin | ||
US2972140A (en) * | 1958-09-23 | 1961-02-14 | Hirsch Joseph | Apparatus and method for communication through the sense of touch |
US3157853A (en) * | 1957-12-06 | 1964-11-17 | Hirsch Joseph | Tactile communication system |
US3220121A (en) * | 1962-07-08 | 1965-11-30 | Communications Patents Ltd | Ground-based flight training or simulating apparatus |
US3497668A (en) * | 1966-08-25 | 1970-02-24 | Joseph Hirsch | Tactile control system |
US3517446A (en) * | 1967-04-19 | 1970-06-30 | Singer General Precision | Vehicle trainer controls and control loading |
US3623064A (en) * | 1968-10-11 | 1971-11-23 | Bell & Howell Co | Paging receiver having cycling eccentric mass |
US3902687A (en) * | 1973-06-25 | 1975-09-02 | Robert E Hightower | Aircraft indicator system |
US3903614A (en) * | 1970-03-27 | 1975-09-09 | Singer Co | Apparatus for simulating aircraft control loading |
US3911416A (en) * | 1974-08-05 | 1975-10-07 | Motorola Inc | Silent call pager |
US4127752A (en) * | 1977-10-13 | 1978-11-28 | Sheldahl, Inc. | Tactile touch switch panel |
US4160508A (en) * | 1977-08-19 | 1979-07-10 | Nasa | Controller arm for a remotely related slave arm |
US4262549A (en) * | 1978-05-10 | 1981-04-21 | Schwellenbach Donald D | Variable mechanical vibrator |
US4320268A (en) * | 1980-02-19 | 1982-03-16 | General Electric Company | Illuminated keyboard for electronic devices and the like |
US4321441A (en) * | 1980-02-04 | 1982-03-23 | Xerox Corporation | Keyswitch arrangement |
US4333070A (en) * | 1981-02-06 | 1982-06-01 | Barnes Robert W | Motor vehicle fuel-waste indicator |
US4464117A (en) * | 1980-08-27 | 1984-08-07 | Dr. Ing. Reiner Foerst Gmbh | Driving simulator apparatus |
US4484191A (en) * | 1982-06-14 | 1984-11-20 | Vavra George S | Tactile signaling systems for aircraft |
US4513235A (en) * | 1982-01-22 | 1985-04-23 | British Aerospace Public Limited Company | Control apparatus |
US4581491A (en) * | 1984-05-04 | 1986-04-08 | Research Corporation | Wearable tactile sensory aid providing information on voice pitch and intonation patterns |
US4599070A (en) * | 1981-07-29 | 1986-07-08 | Control Interface Company Limited | Aircraft simulator and simulated control system therefor |
US4708656A (en) * | 1985-11-11 | 1987-11-24 | Fokker B.V. | Simulator of mechanical properties of a steering system |
US4795296A (en) * | 1986-11-17 | 1989-01-03 | California Institute Of Technology | Hand-held robot end effector controller having movement and force control |
US4798919A (en) * | 1987-04-28 | 1989-01-17 | International Business Machines Corporation | Graphics input tablet with three-dimensional data |
US4821030A (en) * | 1986-12-19 | 1989-04-11 | Tektronix, Inc. | Touchscreen feedback system |
US4891764A (en) * | 1985-12-06 | 1990-01-02 | Tensor Development Inc. | Program controlled force measurement and control system |
US4930770A (en) * | 1988-12-01 | 1990-06-05 | Baker Norman A | Eccentrically loaded computerized positive/negative exercise machine |
US4934694A (en) * | 1985-12-06 | 1990-06-19 | Mcintosh James L | Computer controlled exercise system |
US5019761A (en) * | 1989-02-21 | 1991-05-28 | Kraft Brett W | Force feedback control for backhoe |
US5022407A (en) * | 1990-01-24 | 1991-06-11 | Topical Testing, Inc. | Apparatus for automated tactile testing |
US5022384A (en) * | 1990-05-14 | 1991-06-11 | Capitol Systems | Vibrating/massage chair |
US5035242A (en) * | 1990-04-16 | 1991-07-30 | David Franklin | Method and apparatus for sound responsive tactile stimulation of deaf individuals |
US5038089A (en) * | 1988-03-23 | 1991-08-06 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Synchronized computational architecture for generalized bilateral control of robot arms |
US5078152A (en) * | 1985-06-23 | 1992-01-07 | Loredan Biomedical, Inc. | Method for diagnosis and/or training of proprioceptor feedback capabilities in a muscle and joint system of a human patient |
US5165897A (en) * | 1990-08-10 | 1992-11-24 | Tini Alloy Company | Programmable tactile stimulator array system and method of operation |
US5182557A (en) * | 1989-09-20 | 1993-01-26 | Semborg Recrob, Corp. | Motorized joystick |
US5186695A (en) * | 1989-02-03 | 1993-02-16 | Loredan Biomedical, Inc. | Apparatus for controlled exercise and diagnosis of human performance |
US5212473A (en) * | 1991-02-21 | 1993-05-18 | Typeright Keyboard Corp. | Membrane keyboard and method of using same |
US5223658A (en) * | 1989-01-25 | 1993-06-29 | Yamaha Corporation | Electronic keyboard instrument with pad |
US5237327A (en) * | 1990-11-19 | 1993-08-17 | Sony Corporation | Remote commander |
US5240417A (en) * | 1991-03-14 | 1993-08-31 | Atari Games Corporation | System and method for bicycle riding simulation |
US5246316A (en) * | 1992-03-06 | 1993-09-21 | Excellon Automation | Work table orientation apparatus and method |
US5275174A (en) * | 1985-10-30 | 1994-01-04 | Cook Jonathan A | Repetitive strain injury assessment |
US5283970A (en) * | 1992-09-25 | 1994-02-08 | Strombecker Corporation | Toy guns |
US5299810A (en) * | 1991-03-21 | 1994-04-05 | Atari Games Corporation | Vehicle simulator including cross-network feedback |
US5309140A (en) * | 1991-11-26 | 1994-05-03 | The United States Of America As Represented By The Secretary Of The Navy | Feedback system for remotely operated vehicles |
US5334027A (en) * | 1991-02-25 | 1994-08-02 | Terry Wherlock | Big game fish training and exercise device and method |
US5355148A (en) * | 1993-01-14 | 1994-10-11 | Ast Research, Inc. | Fingerpoint mouse |
US5390128A (en) * | 1993-04-12 | 1995-02-14 | Cargill Detroit Corporation | Robotic processing and inspection system |
US5390296A (en) * | 1989-08-30 | 1995-02-14 | Comshare Incorporated | Method and apparatus for calculation with display data |
US5402499A (en) * | 1992-08-07 | 1995-03-28 | Lsi Logic Corporation | Multimedia controller |
US5436622A (en) * | 1993-07-06 | 1995-07-25 | Motorola, Inc. | Variable frequency vibratory alert method and structure |
US5437607A (en) * | 1992-06-02 | 1995-08-01 | Hwe, Inc. | Vibrating massage apparatus |
US5451924A (en) * | 1993-01-14 | 1995-09-19 | Massachusetts Institute Of Technology | Apparatus for providing sensory substitution of force feedback |
US5461711A (en) * | 1993-12-22 | 1995-10-24 | Interval Research Corporation | Method and system for spatial accessing of time-based information |
US5466213A (en) * | 1993-07-06 | 1995-11-14 | Massachusetts Institute Of Technology | Interactive robotic therapist |
US5524195A (en) * | 1993-05-24 | 1996-06-04 | Sun Microsystems, Inc. | Graphical user interface for interactive television with an animated agent |
US5547382A (en) * | 1990-06-28 | 1996-08-20 | Honda Giken Kogyo Kabushiki Kaisha | Riding simulation system for motorcycles |
US5565840A (en) * | 1994-09-21 | 1996-10-15 | Thorner; Craig | Tactile sensation generator |
US5575761A (en) * | 1994-07-27 | 1996-11-19 | Hajianpour; Mohammed-Ali | Massage device applying variable-frequency vibration in a variable pulse sequence |
US5631861A (en) * | 1990-02-02 | 1997-05-20 | Virtual Technologies, Inc. | Force feedback and texture simulating interface device |
US5669818A (en) * | 1995-03-23 | 1997-09-23 | Thorner; Craig | Seat-based tactile sensation generator |
US5684722A (en) * | 1994-09-21 | 1997-11-04 | Thorner; Craig | Apparatus and method for generating a control signal for a tactile sensation generator |
US5690582A (en) * | 1993-02-02 | 1997-11-25 | Tectrix Fitness Equipment, Inc. | Interactive exercise apparatus |
US5709219A (en) * | 1994-01-27 | 1998-01-20 | Microsoft Corporation | Method and apparatus to create a complex tactile sensation |
US5729249A (en) * | 1991-11-26 | 1998-03-17 | Itu Research, Inc. | Touch sensitive input control device |
US5766016A (en) * | 1994-11-14 | 1998-06-16 | Georgia Tech Research Corporation | Surgical simulator and method for simulating surgical procedure |
US5767457A (en) * | 1995-11-13 | 1998-06-16 | Cirque Corporation | Apparatus and method for audible feedback from input device |
US5785630A (en) * | 1993-02-02 | 1998-07-28 | Tectrix Fitness Equipment, Inc. | Interactive exercise apparatus |
US5791992A (en) * | 1996-07-31 | 1998-08-11 | International Business Machines Corporation | Video game system with internet cartridge |
US5857986A (en) * | 1996-05-24 | 1999-01-12 | Moriyasu; Hiro | Interactive vibrator for multimedia |
US5889670A (en) * | 1991-10-24 | 1999-03-30 | Immersion Corporation | Method and apparatus for tactilely responsive user interface |
US5887995A (en) * | 1997-09-23 | 1999-03-30 | Compaq Computer Corporation | Touchpad overlay with tactile response |
US5945772A (en) * | 1998-05-29 | 1999-08-31 | Motorla, Inc. | Damped resonant piezoelectric alerting device |
US5956484A (en) * | 1995-12-13 | 1999-09-21 | Immersion Corporation | Method and apparatus for providing force feedback over a computer network |
US6078126A (en) * | 1998-05-29 | 2000-06-20 | Motorola, Inc. | Resonant piezoelectric alerting device |
US6097964A (en) * | 1997-09-04 | 2000-08-01 | Nokia Mobile Phones Limited | Navigation key for a handset |
US6131097A (en) * | 1992-12-02 | 2000-10-10 | Immersion Corporation | Haptic authoring |
US6198206B1 (en) * | 1998-03-20 | 2001-03-06 | Active Control Experts, Inc. | Inertial/audio unit and construction |
US6218966B1 (en) * | 1998-11-05 | 2001-04-17 | International Business Machines Corporation | Tactile feedback keyboard |
US6219034B1 (en) * | 1998-02-23 | 2001-04-17 | Kristofer E. Elbing | Tactile computer interface |
US6225976B1 (en) * | 1998-10-30 | 2001-05-01 | Interlink Electronics, Inc. | Remote computer input peripheral |
US6273815B1 (en) * | 1999-06-08 | 2001-08-14 | Katherine C. Stuckman | Virtual electronic pet and method for use therewith |
US6273019B1 (en) * | 1999-04-28 | 2001-08-14 | Eli Shmid | Regulated pressurized system and pressure regulator for use in an ambient fluid environment, and method of pressure regulation |
US6287193B1 (en) * | 1999-02-02 | 2001-09-11 | Steven F. Rehkemper | Hand-held game with visual display and feedback |
US6290566B1 (en) * | 1997-08-27 | 2001-09-18 | Creator, Ltd. | Interactive talking toy |
US6297193B1 (en) * | 1997-12-18 | 2001-10-02 | Kyowa Chemical Industry Co., Ltd. | Algal growth or microbial proliferation inhibitors and use thereof |
US20020033795A1 (en) * | 2000-01-19 | 2002-03-21 | Shahoian Erik J. | Haptic interface for laptop computers and other portable devices |
US6374255B1 (en) * | 1996-05-21 | 2002-04-16 | Immersion Corporation | Haptic authoring |
US6422941B1 (en) * | 1994-09-21 | 2002-07-23 | Craig Thorner | Universal tactile feedback system for computer video games and simulations |
US6429846B2 (en) * | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
US6438457B1 (en) * | 1997-08-22 | 2002-08-20 | Sony Corporation | Storage medium, robot, information processing device and electronic pet system |
US20020128048A1 (en) * | 2001-03-12 | 2002-09-12 | Nokia Mobile Phones Ltd. | Mobile phone featuring audio-modulated vibrotactile module |
US6543487B2 (en) * | 2001-02-08 | 2003-04-08 | Air Logistics Corporation | Method of securing a curing composite substrate wrap to a structure |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4236325A (en) | 1978-12-26 | 1980-12-02 | The Singer Company | Simulator control loading inertia compensator |
US4791416A (en) | 1985-02-05 | 1988-12-13 | Zenith Electronics Corporation | Touch control system for controllable apparatus |
US4713007A (en) | 1985-10-11 | 1987-12-15 | Alban Eugene P | Aircraft controls simulator |
US4794392A (en) | 1987-02-20 | 1988-12-27 | Motorola, Inc. | Vibrator alert device for a communication receiver |
US4885565A (en) | 1988-06-01 | 1989-12-05 | General Motors Corporation | Touchscreen CRT with tactile feedback |
US5175459A (en) | 1991-08-19 | 1992-12-29 | Motorola, Inc. | Low profile vibratory alerting device |
US5271290A (en) | 1991-10-29 | 1993-12-21 | United Kingdom Atomic Energy Authority | Actuator assembly |
US5956501A (en) | 1997-01-10 | 1999-09-21 | Health Hero Network, Inc. | Disease simulation system and method |
US5629594A (en) | 1992-12-02 | 1997-05-13 | Cybernet Systems Corporation | Force feedback system |
EP0607580A1 (en) | 1993-01-21 | 1994-07-27 | International Business Machines Corporation | Tactile feedback mechanism for cursor control |
US5734373A (en) * | 1993-07-16 | 1998-03-31 | Immersion Human Interface Corporation | Method and apparatus for controlling force feedback interface systems utilizing a host computer |
EP0660258B1 (en) | 1993-12-20 | 2000-03-08 | Seiko Epson Corporation | Electronic pointing device |
US6160489A (en) | 1994-06-23 | 2000-12-12 | Motorola, Inc. | Wireless communication device adapted to generate a plurality of distinctive tactile alert patterns |
US6111577A (en) | 1996-04-04 | 2000-08-29 | Massachusetts Institute Of Technology | Method and apparatus for determining forces to be applied to a user through a haptic interface |
JPH1165417A (en) * | 1997-08-27 | 1999-03-05 | Omron Corp | Device and method for virtual pet raising and program record medium |
CN2308421Y (en) * | 1997-08-27 | 1999-02-24 | 仲伟实业股份有限公司 | Electronic pet machine with ligature function |
US6211861B1 (en) * | 1998-06-23 | 2001-04-03 | Immersion Corporation | Tactile mouse device |
US6650338B1 (en) * | 1998-11-24 | 2003-11-18 | Interval Research Corporation | Haptic interaction with video and image data |
US6149490A (en) * | 1998-12-15 | 2000-11-21 | Tiger Electronics, Ltd. | Interactive toy |
JP3648559B2 (en) * | 1999-05-31 | 2005-05-18 | カシオ計算機株式会社 | Electronic equipment with communication function |
JP2001038658A (en) * | 1999-08-04 | 2001-02-13 | Yamaha Motor Co Ltd | Tactile sense expressing system in robot |
US6963762B2 (en) | 2001-05-23 | 2005-11-08 | Nokia Corporation | Mobile phone using tactile icons |
2002
- 2002-10-30 KR KR10-2004-7006466A patent/KR20040062601A/en not_active Application Discontinuation
- 2002-10-30 CN CNB028217411A patent/CN100474216C/en not_active Expired - Fee Related
- 2002-10-30 WO PCT/US2002/034694 patent/WO2003051062A2/en active Application Filing
- 2002-10-30 EP EP02804680.3A patent/EP1440414B1/en not_active Expired - Lifetime
- 2002-10-30 US US10/283,258 patent/US8788253B2/en not_active Expired - Fee Related
- 2002-10-30 JP JP2003552003A patent/JP2005511226A/en not_active Withdrawn
- 2002-10-30 AU AU2002364690A patent/AU2002364690A1/en not_active Abandoned

2008
- 2008-08-08 JP JP2008206015A patent/JP2008259921A/en not_active Withdrawn

2012
- 2012-05-24 JP JP2012118164A patent/JP2012152627A/en not_active Withdrawn

2015
- 2015-07-17 JP JP2015142684A patent/JP2015226806A/en not_active Ceased
Patent Citations (100)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US611577A (en) * | 1898-09-27 | griffin | ||
US3157853A (en) * | 1957-12-06 | 1964-11-17 | Hirsch Joseph | Tactile communication system |
US2972140A (en) * | 1958-09-23 | 1961-02-14 | Hirsch Joseph | Apparatus and method for communication through the sense of touch |
US3220121A (en) * | 1962-07-08 | 1965-11-30 | Communications Patents Ltd | Ground-based flight training or simulating apparatus |
US3497668A (en) * | 1966-08-25 | 1970-02-24 | Joseph Hirsch | Tactile control system |
US3517446A (en) * | 1967-04-19 | 1970-06-30 | Singer General Precision | Vehicle trainer controls and control loading |
US3623064A (en) * | 1968-10-11 | 1971-11-23 | Bell & Howell Co | Paging receiver having cycling eccentric mass |
US3903614A (en) * | 1970-03-27 | 1975-09-09 | Singer Co | Apparatus for simulating aircraft control loading |
US3902687A (en) * | 1973-06-25 | 1975-09-02 | Robert E Hightower | Aircraft indicator system |
US3911416A (en) * | 1974-08-05 | 1975-10-07 | Motorola Inc | Silent call pager |
US4160508A (en) * | 1977-08-19 | 1979-07-10 | Nasa | Controller arm for a remotely related slave arm |
US4127752A (en) * | 1977-10-13 | 1978-11-28 | Sheldahl, Inc. | Tactile touch switch panel |
US4262549A (en) * | 1978-05-10 | 1981-04-21 | Schwellenbach Donald D | Variable mechanical vibrator |
US4321441A (en) * | 1980-02-04 | 1982-03-23 | Xerox Corporation | Keyswitch arrangement |
US4320268A (en) * | 1980-02-19 | 1982-03-16 | General Electric Company | Illuminated keyboard for electronic devices and the like |
US4464117A (en) * | 1980-08-27 | 1984-08-07 | Dr. Ing. Reiner Foerst Gmbh | Driving simulator apparatus |
US4333070A (en) * | 1981-02-06 | 1982-06-01 | Barnes Robert W | Motor vehicle fuel-waste indicator |
US4599070A (en) * | 1981-07-29 | 1986-07-08 | Control Interface Company Limited | Aircraft simulator and simulated control system therefor |
US4513235A (en) * | 1982-01-22 | 1985-04-23 | British Aerospace Public Limited Company | Control apparatus |
US4484191A (en) * | 1982-06-14 | 1984-11-20 | Vavra George S | Tactile signaling systems for aircraft |
US4581491A (en) * | 1984-05-04 | 1986-04-08 | Research Corporation | Wearable tactile sensory aid providing information on voice pitch and intonation patterns |
US5078152A (en) * | 1985-06-23 | 1992-01-07 | Loredan Biomedical, Inc. | Method for diagnosis and/or training of proprioceptor feedback capabilities in a muscle and joint system of a human patient |
US5275174B1 (en) * | 1985-10-30 | 1998-08-04 | Jonathan A Cook | Repetitive strain injury assessment |
US5275174A (en) * | 1985-10-30 | 1994-01-04 | Cook Jonathan A | Repetitive strain injury assessment |
US4708656A (en) * | 1985-11-11 | 1987-11-24 | Fokker B.V. | Simulator of mechanical properties of a steering system |
US4891764A (en) * | 1985-12-06 | 1990-01-02 | Tensor Development Inc. | Program controlled force measurement and control system |
US4934694A (en) * | 1985-12-06 | 1990-06-19 | Mcintosh James L | Computer controlled exercise system |
US4795296A (en) * | 1986-11-17 | 1989-01-03 | California Institute Of Technology | Hand-held robot end effector controller having movement and force control |
US4821030A (en) * | 1986-12-19 | 1989-04-11 | Tektronix, Inc. | Touchscreen feedback system |
US4798919A (en) * | 1987-04-28 | 1989-01-17 | International Business Machines Corporation | Graphics input tablet with three-dimensional data |
US5038089A (en) * | 1988-03-23 | 1991-08-06 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Synchronized computational architecture for generalized bilateral control of robot arms |
US4930770A (en) * | 1988-12-01 | 1990-06-05 | Baker Norman A | Eccentrically loaded computerized positive/negative exercise machine |
US5223658A (en) * | 1989-01-25 | 1993-06-29 | Yamaha Corporation | Electronic keyboard instrument with pad |
US5186695A (en) * | 1989-02-03 | 1993-02-16 | Loredan Biomedical, Inc. | Apparatus for controlled exercise and diagnosis of human performance |
US5019761A (en) * | 1989-02-21 | 1991-05-28 | Kraft Brett W | Force feedback control for backhoe |
US5390296A (en) * | 1989-08-30 | 1995-02-14 | Comshare Incorporated | Method and apparatus for calculation with display data |
US5182557A (en) * | 1989-09-20 | 1993-01-26 | Semborg Recrob, Corp. | Motorized joystick |
US5289273A (en) * | 1989-09-20 | 1994-02-22 | Semborg-Recrob, Corp. | Animated character system with real-time control |
US5022407A (en) * | 1990-01-24 | 1991-06-11 | Topical Testing, Inc. | Apparatus for automated tactile testing |
US5631861A (en) * | 1990-02-02 | 1997-05-20 | Virtual Technologies, Inc. | Force feedback and texture simulating interface device |
US6059506A (en) * | 1990-02-02 | 2000-05-09 | Virtual Technologies, Inc. | Force feedback and texture simulating interface device |
US5035242A (en) * | 1990-04-16 | 1991-07-30 | David Franklin | Method and apparatus for sound responsive tactile stimulation of deaf individuals |
US5022384A (en) * | 1990-05-14 | 1991-06-11 | Capitol Systems | Vibrating/massage chair |
US5547382A (en) * | 1990-06-28 | 1996-08-20 | Honda Giken Kogyo Kabushiki Kaisha | Riding simulation system for motorcycles |
US5165897A (en) * | 1990-08-10 | 1992-11-24 | Tini Alloy Company | Programmable tactile stimulator array system and method of operation |
US5237327A (en) * | 1990-11-19 | 1993-08-17 | Sony Corporation | Remote commander |
US5212473A (en) * | 1991-02-21 | 1993-05-18 | Typeright Keyboard Corp. | Membrane keyboard and method of using same |
US5334027A (en) * | 1991-02-25 | 1994-08-02 | Terry Wherlock | Big game fish training and exercise device and method |
US5240417A (en) * | 1991-03-14 | 1993-08-31 | Atari Games Corporation | System and method for bicycle riding simulation |
US5299810A (en) * | 1991-03-21 | 1994-04-05 | Atari Games Corporation | Vehicle simulator including cross-network feedback |
US5889672A (en) * | 1991-10-24 | 1999-03-30 | Immersion Corporation | Tactiley responsive user interface device and method therefor |
US5889670A (en) * | 1991-10-24 | 1999-03-30 | Immersion Corporation | Method and apparatus for tactilely responsive user interface |
US6195592B1 (en) * | 1991-10-24 | 2001-02-27 | Immersion Corporation | Method and apparatus for providing tactile sensations using an interface device |
US5309140A (en) * | 1991-11-26 | 1994-05-03 | The United States Of America As Represented By The Secretary Of The Navy | Feedback system for remotely operated vehicles |
US5729249A (en) * | 1991-11-26 | 1998-03-17 | Itu Research, Inc. | Touch sensitive input control device |
US5246316A (en) * | 1992-03-06 | 1993-09-21 | Excellon Automation | Work table orientation apparatus and method |
US5437607A (en) * | 1992-06-02 | 1995-08-01 | Hwe, Inc. | Vibrating massage apparatus |
US5402499A (en) * | 1992-08-07 | 1995-03-28 | Lsi Logic Corporation | Multimedia controller |
US5283970A (en) * | 1992-09-25 | 1994-02-08 | Strombecker Corporation | Toy guns |
US6131097A (en) * | 1992-12-02 | 2000-10-10 | Immersion Corporation | Haptic authoring |
US5451924A (en) * | 1993-01-14 | 1995-09-19 | Massachusetts Institute Of Technology | Apparatus for providing sensory substitution of force feedback |
US5355148A (en) * | 1993-01-14 | 1994-10-11 | Ast Research, Inc. | Fingerpoint mouse |
US5690582A (en) * | 1993-02-02 | 1997-11-25 | Tectrix Fitness Equipment, Inc. | Interactive exercise apparatus |
US5785630A (en) * | 1993-02-02 | 1998-07-28 | Tectrix Fitness Equipment, Inc. | Interactive exercise apparatus |
US5390128A (en) * | 1993-04-12 | 1995-02-14 | Cargill Detroit Corporation | Robotic processing and inspection system |
US5524195A (en) * | 1993-05-24 | 1996-06-04 | Sun Microsystems, Inc. | Graphical user interface for interactive television with an animated agent |
US5466213A (en) * | 1993-07-06 | 1995-11-14 | Massachusetts Institute Of Technology | Interactive robotic therapist |
US5436622A (en) * | 1993-07-06 | 1995-07-25 | Motorola, Inc. | Variable frequency vibratory alert method and structure |
US5461711A (en) * | 1993-12-22 | 1995-10-24 | Interval Research Corporation | Method and system for spatial accessing of time-based information |
US5709219A (en) * | 1994-01-27 | 1998-01-20 | Microsoft Corporation | Method and apparatus to create a complex tactile sensation |
US5575761A (en) * | 1994-07-27 | 1996-11-19 | Hajianpour; Mohammed-Ali | Massage device applying variable-frequency vibration in a variable pulse sequence |
US6422941B1 (en) * | 1994-09-21 | 2002-07-23 | Craig Thorner | Universal tactile feedback system for computer video games and simulations |
US5684722A (en) * | 1994-09-21 | 1997-11-04 | Thorner; Craig | Apparatus and method for generating a control signal for a tactile sensation generator |
US5565840A (en) * | 1994-09-21 | 1996-10-15 | Thorner; Craig | Tactile sensation generator |
US5766016A (en) * | 1994-11-14 | 1998-06-16 | Georgia Tech Research Corporation | Surgical simulator and method for simulating surgical procedure |
US5669818A (en) * | 1995-03-23 | 1997-09-23 | Thorner; Craig | Seat-based tactile sensation generator |
US5767457A (en) * | 1995-11-13 | 1998-06-16 | Cirque Corporation | Apparatus and method for audible feedback from input device |
US6101530A (en) * | 1995-12-13 | 2000-08-08 | Immersion Corporation | Force feedback provided over a computer network |
US5956484A (en) * | 1995-12-13 | 1999-09-21 | Immersion Corporation | Method and apparatus for providing force feedback over a computer network |
US6374255B1 (en) * | 1996-05-21 | 2002-04-16 | Immersion Corporation | Haptic authoring |
US5857986A (en) * | 1996-05-24 | 1999-01-12 | Moriyasu; Hiro | Interactive vibrator for multimedia |
US5791992A (en) * | 1996-07-31 | 1998-08-11 | International Business Machines Corporation | Video game system with internet cartridge |
US6438457B1 (en) * | 1997-08-22 | 2002-08-20 | Sony Corporation | Storage medium, robot, information processing device and electronic pet system |
US6290566B1 (en) * | 1997-08-27 | 2001-09-18 | Creator, Ltd. | Interactive talking toy |
US6097964A (en) * | 1997-09-04 | 2000-08-01 | Nokia Mobile Phones Limited | Navigation key for a handset |
US5887995A (en) * | 1997-09-23 | 1999-03-30 | Compaq Computer Corporation | Touchpad overlay with tactile response |
US6297193B1 (en) * | 1997-12-18 | 2001-10-02 | Kyowa Chemical Industry Co., Ltd. | Algal growth or microbial proliferation inhibitors and use thereof |
US6219034B1 (en) * | 1998-02-23 | 2001-04-17 | Kristofer E. Elbing | Tactile computer interface |
US6198206B1 (en) * | 1998-03-20 | 2001-03-06 | Active Control Experts, Inc. | Inertial/audio unit and construction |
US5945772A (en) * | 1998-05-29 | 1999-08-31 | Motorla, Inc. | Damped resonant piezoelectric alerting device |
US6078126A (en) * | 1998-05-29 | 2000-06-20 | Motorola, Inc. | Resonant piezoelectric alerting device |
US6429846B2 (en) * | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
US6225976B1 (en) * | 1998-10-30 | 2001-05-01 | Interlink Electronics, Inc. | Remote computer input peripheral |
US6218966B1 (en) * | 1998-11-05 | 2001-04-17 | International Business Machines Corporation | Tactile feedback keyboard |
US6287193B1 (en) * | 1999-02-02 | 2001-09-11 | Steven F. Rehkemper | Hand-held game with visual display and feedback |
US6273019B1 (en) * | 1999-04-28 | 2001-08-14 | Eli Shmid | Regulated pressurized system and pressure regulator for use in an ambient fluid environment, and method of pressure regulation |
US6273815B1 (en) * | 1999-06-08 | 2001-08-14 | Katherine C. Stuckman | Virtual electronic pet and method for use therewith |
US20020033795A1 (en) * | 2000-01-19 | 2002-03-21 | Shahoian Erik J. | Haptic interface for laptop computers and other portable devices |
US6543487B2 (en) * | 2001-02-08 | 2003-04-08 | Air Logistics Corporation | Method of securing a curing composite substrate wrap to a structure |
US20020128048A1 (en) * | 2001-03-12 | 2002-09-12 | Nokia Mobile Phones Ltd. | Mobile phone featuring audio-modulated vibrotactile module |
Non-Patent Citations (1)
Title |
---|
L.B. Rosenberg, A Force Feedback Programming Primer, Immersion Corporation, San Jose, California, 1997; 98 pages (2-sided). * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050124412A1 (en) * | 2003-12-05 | 2005-06-09 | Wookho Son | Haptic simulation system and method for providing real-time haptic interaction in virtual simulation |
US8353767B1 (en) * | 2007-07-13 | 2013-01-15 | Ganz | System and method for a virtual character in a virtual world to interact with a user |
EP2096520A1 (en) * | 2008-02-26 | 2009-09-02 | Deutsche Telekom AG | Mobile device and operation method |
US20100041312A1 (en) * | 2008-08-15 | 2010-02-18 | Paul King | Electronic toy and methods of interacting therewith |
US20100217883A1 (en) * | 2009-02-20 | 2010-08-26 | Drew Goya | Intelligent software agents for multiple platforms |
US8939840B2 (en) | 2009-07-29 | 2015-01-27 | Disney Enterprises, Inc. | System and method for playsets using tracked objects and corresponding virtual worlds |
US20140078102A1 (en) * | 2012-02-03 | 2014-03-20 | Panasonic Corporation | Haptic feedback device, method for driving haptic feedback device, and drive program |
US9007323B2 (en) * | 2012-02-03 | 2015-04-14 | Panasonic Intellectual Property Management Co., Ltd. | Haptic feedback device, method for driving haptic feedback device, and drive program |
US11775134B2 (en) | 2017-11-13 | 2023-10-03 | Snap Inc. | Interface to display animated icon |
WO2020124118A1 (en) * | 2018-12-21 | 2020-06-25 | Rasinger Pascale | Method and device for imitating a cat's purr |
CN112738537A (en) * | 2020-12-24 | 2021-04-30 | 珠海格力电器股份有限公司 | Virtual pet interaction method and device, electronic equipment and storage medium |
US11841989B2 (en) * | 2020-12-28 | 2023-12-12 | Beijing Xiaomi Mobile Software Co., Ltd. | Method, device, and storage medium for generating a haptic feedback |
Also Published As
Publication number | Publication date |
---|---|
EP1440414B1 (en) | 2016-08-17 |
JP2008259921A (en) | 2008-10-30 |
EP1440414A4 (en) | 2008-05-28 |
CN1578964A (en) | 2005-02-09 |
AU2002364690A8 (en) | 2003-06-23 |
AU2002364690A1 (en) | 2003-06-23 |
CN100474216C (en) | 2009-04-01 |
JP2015226806A (en) | 2015-12-17 |
EP1440414A2 (en) | 2004-07-28 |
KR20040062601A (en) | 2004-07-07 |
WO2003051062A2 (en) | 2003-06-19 |
JP2012152627A (en) | 2012-08-16 |
WO2003051062A3 (en) | 2003-11-20 |
US8788253B2 (en) | 2014-07-22 |
JP2005511226A (en) | 2005-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1440414B1 (en) | Methods and apparatus for providing haptic feedback in interacting with virtual pets | |
JP4044114B2 (en) | Grasping computer interactive device | |
Brauner et al. | Increase physical fitness and create health awareness through exergames and gamification: The role of individual factors, motivation and acceptance | |
US20120190453A1 (en) | System and method for online-offline interactive experience | |
Zeagler et al. | Canine computer interaction: towards designing a touchscreen interface for working dogs | |
Sreedharan et al. | 3D input for 3D worlds | |
Moshayedi et al. | Design and development of cost-effective exergames for activity incrementation | |
Gomes et al. | ViPleo and PhyPleo: Artificial pet with two embodiments | |
TW581701B (en) | Recording medium, method of using a computer and computer for executing role-playing games | |
Li et al. | AI in the shell: Towards an understanding of integrated embodiment | |
Ritvo et al. | Designing for the exceptional user: Nonhuman animal-computer interaction (ACI) | |
KR20010113718A (en) | Diagnosis system, diagnosis apparatus, and diagnosis method | |
JP2005319191A (en) | Game system, program, information storage medium, and image generating method | |
KR100701237B1 (en) | Sensitive robot based on internet | |
JP2023111780A (en) | Program and information processing system | |
JP2022142113A (en) | Robot, robot control method, and program | |
Breyer et al. | Multi-device classification model for game interaction techniques | |
Oakley | Haptic augmentation of the cursor: Transforming virtual actions into physical actions | |
JP7163987B2 (en) | Equipment control device, equipment control method and program | |
Lenders | Physical Perception of a VR Handshake | |
van Ryn | Gestural economy and cooking mama: Playing with the politics of natural user interfaces | |
De Goetzen et al. | Multimodal design for enactive toys | |
Kaisar et al. | Ball Game Controller: A Tangible User Interface | |
US20200070046A1 (en) | System and method for controlling a virtual world character | |
Zhao et al. | Grow With Me: Exploring the Integration of Augmented Reality and Health Tracking Technologies to Promote Physical Activity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: IMMERSION CORPORATION, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: ROSENBERG, LOUIS B.; REEL/FRAME: 017304/0092; Effective date: 20021030 |
| FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); Year of fee payment: 4 |
| FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| FP | Lapsed due to failure to pay maintenance fee | Effective date: 20220722 |