WO1999039789A1 - Interaction between software and an input device - Google Patents

Interaction between software and an input device

Info

Publication number
WO1999039789A1
Authority
WO
WIPO (PCT)
Prior art keywords
providing
user
actuator
input device
physical object
Prior art date
Application number
PCT/US1999/002420
Other languages
French (fr)
Inventor
Dan Klitsner
Brian Clemens
Gary Levenberg
Original Assignee
Klitsner Industrial Design, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/018,691 external-priority patent/US6322449B1/en
Priority claimed from US09/019,489 external-priority patent/US5992817A/en
Application filed by Klitsner Industrial Design, Llc filed Critical Klitsner Industrial Design, Llc
Priority to NZ506607A priority Critical patent/NZ506607A/en
Priority to EP99904579A priority patent/EP1154824A4/en
Priority to AU24950/99A priority patent/AU746932B2/en
Priority to CA002320078A priority patent/CA2320078A1/en
Publication of WO1999039789A1 publication Critical patent/WO1999039789A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202Constructional details or processes of manufacture of the input device
    • G06F3/0205Lever arrangements for operating keyboard cursor control keys in a joystick-like manner
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/22Setup operations, e.g. calibration, key configuration or button assignment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/24Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/245Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/54Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202Constructional details or processes of manufacture of the input device
    • G06F3/0219Special purpose keyboards
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/06Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1018Calibration; Key and button assignment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1062Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to a type of game, e.g. steering wheel
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6063Methods for processing data by generating or executing the game program for sound processing
    • A63F2300/6081Methods for processing data by generating or executing the game program for sound processing generating an output signal, e.g. under timing constraints, for spatialization

Definitions

  • the invention relates to interactions between software, particularly software games, and an input device such as a mechanical keyboard overlay.
  • Computer keyboards are well known. Typically, a standard computer keyboard is connected to a computer and used as an input device to efficiently enter data. For those people who possess the requisite dexterity and typing skills, standard computer keyboards are well suited for quickly entering text. However, the standard computer keyboard is not well suited for young children with limited dexterity or those not skilled at typing. Nor is a standard keyboard adept at engaging the imagination of young children or others.
  • as an alternative to the standard keyboard, the use of a peripheral input device, such as a joystick, mouse or trackball, is well known.
  • although these peripheral input devices offer alternatives to the standard computer keyboard, they may be costly to purchase and maintain because of their inherent electronic complexity.
  • Mechanical joystick devices which mechanically couple to an underlying keyboard are also well known. These typically are simpler than their electronic counterparts.
  • the invention provides techniques for encouraging interaction between a user and a computer having an input device, such as a keyboard overlay, that includes actuators formed as three-dimensional representations of physical objects.
  • the actuators may represent a set of tools or objects in a play environment.
  • the techniques extend the fantasy and imaginative play normally associated with multi-media software to the input devices used to control the software. This promises to substantially increase the entertainment value of the software, particularly for younger children who are not adept at using a keyboard or other traditional input device.
  • interaction between a user and a computer is encouraged through use of an input device, such as a keyboard overlay, including actuators formed as three-dimensional representations of physical objects.
  • Software running on the computer causes the computer to receive an input signal generated by actuation of a particular actuator by the user, and to provide feedback to the user. The feedback is consistent with the particular physical object represented by the particular actuator. Implementations may include one or more of the following features.
  • providing feedback to the user may include displaying a video sequence representative of activity by the particular physical object.
  • the video sequence may include an image of the particular physical object and may represent a change in a state of the particular physical object.
  • Providing feedback to the user may include generating sound representative of the particular physical object.
  • the sound may be representative of the change in the state of the particular physical object, and may be synchronized with the change in state in the video sequence.
  • the sound may represent sound generated by the particular physical object, such as sound generated during use of the particular physical object. For example, when the particular physical object is a saw, the sound may represent the saw cutting wood.
  • the particular physical object represents a character
  • the sound may represent speech by the character.
  • the software may cause the computer to display an image of the particular physical object prior to actuation of the particular actuator.
  • Feedback provided to the user may include modifying the image of the particular physical object to represent a change caused by actuation of the particular actuator.
  • the software may cause the computer to prompt the user to actuate the particular actuator by, for example, generating a spoken prompt.
  • the spoken prompt does not necessarily specifically instruct the user to actuate the particular actuator.
  • when the particular actuator represents a character, the spoken prompt may tell the character to perform an action.
  • the software may cause the computer to display an animated character, and to generate the spoken prompts in the voice of the animated character.
  • the software may further cause the computer to animate a face of the animated character to simulate speech by the character.
  • the software may generate spoken prompts by the animated character to guide the user through use of different actuators of the input device. For example, when the actuators represent a set of tools, the spoken prompts by the animated character may guide the user through a project using the tools.
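The guided sequence of spoken prompts described above can be sketched as an ordered script of (prompt, expected actuator) steps, where the program advances only when the matching tool is used. The prompts, tool order, and function names below are invented for illustration; the document does not specify this structure.

```python
# Hypothetical project script: the animated character speaks each prompt
# in order, and the program waits for the matching actuator before
# moving to the next step. Prompts and tool order are illustrative only.
PROJECT_STEPS = [
    ("Let's cut the board!", "saw"),
    ("Now drill a hole.", "drill"),
    ("Smooth it out.", "sander"),
]

def advance(step_index, actuator_used):
    """Move to the next step only when the expected actuator is used."""
    prompt, expected = PROJECT_STEPS[step_index]
    if actuator_used == expected:
        return step_index + 1
    return step_index
```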
  • the software may permit the user to modify the image of the particular physical object. For example, the software may permit the user to select an image for the particular physical object from a set of images related to the particular physical object. Thus, when a particular actuator represents a saw, the software may permit the user to select an image from a set of images of different kinds of saws.
  • the input device may be a keyboard overlay including actuators formed as three-dimensional representations of physical objects.
  • the keyboard overlay is configured such that actuation of an actuator presses at least one key of the keyboard.
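Because each actuator mechanically presses an ordinary key, the software only needs a table mapping incoming key presses to the objects the actuators represent. The specific key assignments below are hypothetical; the document does not state which keys each actuator presses.

```python
# Hypothetical mapping from keyboard keys (pressed mechanically by the
# overlay's actuators) to the physical objects those actuators
# represent. The key assignments are illustrative only.
OVERLAY_KEYMAP = {
    "q": "saw",
    "w": "drill",
    "e": "sander",
    "r": "sprayer",
    "t": "screwdriver and screw",
    "y": "hammer and nail",
}

def object_for_key(key):
    """Return the physical object represented by the actuator that
    pressed this key, or None if the key is not part of the overlay."""
    return OVERLAY_KEYMAP.get(key)
```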
  • the input device may have actuators formed as three-dimensional representations of a set of related physical objects, such as a set of tools.
  • the software may cause the computer to display imagery indicating progress in completion of a project using the set of tools.
  • Providing feedback to the user may include generating sounds associated with use of a tool represented by the particular actuator.
  • the tools may be, for example, a set of woodworking tools.
  • the set of woodworking tools may include, for example, representations of a saw, a drill, a sander, a sprayer, a screwdriver and screw, and a hammer and nail.
  • the actuators also may represent objects in a play environment. Feedback provided to the user may include imagery indicating progress in completion of an activity in the play environment. Feedback also may include generating sounds associated with an object represented by the actuator actuated by the user to produce the input signal. For example, when the objects are characters in the play environment, the feedback to the user may include speech in a voice associated with a character represented by the actuator.
  • the input device may have one or more unsecured components that are readily removable from the input device.
  • an unsecured component may be in the form of a character in the play environment.
  • the unsecured component may be for use in actuating a certain actuator, and may be representative of an object normally used to manipulate an object represented by the actuator.
  • the unsecured component may be representative of a hammer or screwdriver and the associated actuator may be representative of a nail or screw.
  • An actuator may move in a way that mimics motion normally associated with a corresponding physical object.
  • actuating the actuator may include turning the actuator.
  • the techniques may be implemented in computer hardware or software, or a combination of the two. However, the techniques are not limited to any particular hardware or software configuration; they may find applicability in any computing or processing environment that may be used for interactive games or other interactive activities.
  • the techniques are implemented in computer programs executing on programmable computers that each include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices.
  • Program code is applied to data entered using the input device to perform the functions described and to generate output information.
  • the output information is applied to the one or more output devices.
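The structure just described — program code applied to data entered using the input device to generate output information — can be sketched as a minimal dispatch loop. The handler names and event strings are assumptions made for the sketch, not part of the document.

```python
# Minimal sketch of the described structure: input events from the
# device are fed through program code (handlers) to produce output
# information for the output devices. All names are illustrative.
def run(input_events, handlers):
    """Apply a handler to each input event and collect the outputs.

    input_events -- iterable of key presses from the input device
    handlers     -- dict mapping a key press to a function producing output
    """
    outputs = []
    for event in input_events:
        handler = handlers.get(event)
        if handler is not None:  # ignore keys with no assigned behavior
            outputs.append(handler())
    return outputs

# Example: two keys mapped to feedback strings; "x" is unmapped.
feedback = run(
    ["a", "x", "b"],
    {"a": lambda: "video:saw", "b": lambda: "sound:drill"},
)
```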
  • Each program is preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system.
  • the programs can be implemented in assembly or machine language, if desired.
  • the language may be a compiled or interpreted language.
  • Each such computer program is preferably stored on a storage medium or device (e.g., CD-ROM, hard disk or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform the procedures described in this document.
  • the system may also be considered to be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner.
  • Fig. 1 is a block diagram of a keyboard overlay.
  • Figs. 2 and 3 are, respectively, left and right perspective views of a keyboard overlay that provides a set of tools.
  • Fig. 4 is a perspective view of a keyboard overlay that provides a play environment.
  • a computer system 100 includes a computer 105 having a processor, and a display 110 coupled to the computer.
  • Software running on the processor causes the processor to transmit video images to the display 110 for display by the display 110.
  • the software also causes the processor to transmit electrical signals to a set of speakers 115 attached to the computer.
  • the electrical signals cause the speakers 115 to generate sounds in conjunction with the video images.
  • the software may be loaded into the computer using a computer-readable medium inserted into a disk drive 120 of the computer.
  • a keyboard 125 is also coupled to the computer.
  • the keyboard provides the processor with input signals in response to actuation of keys 130 of the keyboard.
  • the particular signal provided to the processor corresponds to the particular key pressed by a user of the computer.
  • the software causes the computer to generate the video images and associated sounds in response to signals from the keyboard 125.
  • a keyboard overlay 135 attaches to the keyboard 125.
  • the overlay 135 includes actuators 140 that provide three-dimensional representations of physical objects. Mechanisms within the overlay are coupled to the actuators 140 such that actuation of an actuator causes one or more keys 130 on the keyboard to be pressed.
  • an actuator 140 may constitute a toy saw that is manipulated like a real saw.
  • a keyboard overlay 200 provides a set of actuators shaped to represent a set of tools.
  • the overlay 200 is used in conjunction with software that permits a user (typically a young child) to manipulate the tools to build projects or take part in other activities that simulate use of the tools through audio-visual sequences produced by the computer and output by the display and the speakers.
  • the overlay 200 includes a housing 205 and a base 210.
  • the overlay 200 is configured to fit on top of an underlying keyboard such as the keyboard 125 shown in Fig. 1. More particularly, the base 210 engages the keyboard 125 to couple the overlay 200 to the keyboard 125 in a known orientation.
  • a strap (not shown) or other mechanism may be used to secure the overlay to the keyboard.
  • the housing 205 includes several actuators in the form of three-dimensional, representational objects.
  • the actuators include a saw 215, a screw 220, a nail 225, a sander 230, a sprayer 235, and a drill 240.
  • the housing 205 also includes a screwdriver 245 and a hammer 250 that are removable from the housing.
  • a button 255 is also provided for use in controlling the computer.
  • the button 255 may be used to activate a help function of the software running on the computer.
  • the actuators are all three-dimensional representational toy objects that appear realistic to the user and function in a manner similar to the objects they represent.
  • the screwdriver 245 may be used to turn the screw 220 and the hammer 250 may be used to pound in the nail 225.
  • the saw 215 is actuated by a horizontal forwards-and-backwards sawing motion
  • the sander 230 is actuated by a horizontal side-to-side sanding motion
  • the sprayer 235 is actuated by pressing a trigger 260 of the sprayer
  • the drill 240 is actuated by turning a handle 265 of the drill.
  • Actuation of an actuator causes a corresponding key on the keyboard to be pressed.
  • the software responds by causing the computer to display an image of the object represented by the actuator being used and to produce an associated sound.
  • pressing the trigger 260 of the sprayer 235 causes the computer to display movement of a displayed trigger of a displayed image of the sprayer, to display paint spraying from the sprayer and onto a surface of a project being undertaken, and to produce the sound of paint spraying.
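The feedback pattern above — a video sequence paired with a matching, synchronized sound — can be sketched as a single lookup-and-dispatch step. The table entries and the `respond` function are hypothetical placeholders for the animations and sounds the document describes.

```python
# Hypothetical feedback table: each actuator maps to a video sequence
# and a sound consistent with the represented object.
FEEDBACK = {
    "saw": ("saw cutting wood (animation)", "sawing sound"),
    "sprayer": ("paint spraying onto project (animation)", "spraying sound"),
    "drill": ("drill turning (animation)", "drilling sound"),
}

def respond(actuator):
    """Return the (video, sound) pair played when an actuator is used."""
    video, sound = FEEDBACK[actuator]
    # In a real implementation the two would be started together so the
    # sound stays synchronized with the state change in the video.
    return {"video": video, "sound": sound}
```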
  • the software running on the computer 105, in conjunction with the keyboard overlay 200, provides "hands on" fun for children or other users of the computer.
  • An animated character displayed by the computer guides the user through a variety of projects and games using the tools of the keyboard overlay. The animated character does so by prompting the user as to what tool should be used to perform a particular task.
  • the software and associated overlay permit the user to physically interact with the virtual world provided on the computer display. Because of the interaction with the keyboard overlay, the tools and the physical action required to use them are the focus of the game, rather than the objects being built or the tasks being performed. To accentuate this point, the animated character guide may be configured to play up the importance of the tool set. For example, an introductory dialog by the animated character guide may say something like, "Hi kids. I'm Joe, welcome to my workshop ... Wow! ... That's a mighty fine work bench you have there!", with the animated character guide appearing to look out of the display at the keyboard overlay as the guide discovers the work bench represented by the overlay.
  • the software also may permit the user to select the tools to be displayed in correspondence with the actuators of the overlay 200.
  • the user may be able to indicate that a sledgehammer should be displayed instead of a standard-sized hammer for use in hammering the nail 225.
  • the simulated use of the tool may vary based on the image selected.
  • the software also may provide humorous results when the user uses the wrong tool in response to a prompt or uses a particular tool for an excessive amount of time. For example, overuse of the sander 230 may cause wood being sanded to burst into flames.
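The wrong-tool and overuse behaviors can be sketched as a small state check on each actuation. The threshold, return strings, and function name are invented for illustration; the document describes the behavior but not its implementation.

```python
# Illustrative sketch of the "humorous result" logic: the program
# tracks which tool the prompt requested and how many times in a row
# the current tool has been actuated.
OVERUSE_LIMIT = 10  # consecutive actuations; this threshold is invented

def feedback_for(tool, expected_tool, consecutive_uses):
    """Choose a response based on whether the right tool is used, and
    for how long. Return values are illustrative placeholders."""
    if tool != expected_tool:
        return "humorous wrong-tool response"
    if consecutive_uses > OVERUSE_LIMIT:
        return "wood bursts into flames"  # e.g. overusing the sander
    return "normal progress on the project"
```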
  • a space-based playset device 300 is also in the form of a keyboard overlay.
  • the space-based playset device 300 includes a base 305 which is coupled to the three-dimensional representational objects.
  • the three-dimensional representational objects of the space-based playset device 300 include a stationary platform 310, a toy gun 320, a first seat 330, a second seat 340, a third seat 350, a fire button 360, a pair of throttle levers 370, a navigational button 380, a first movable platform 390, and a second movable platform 395.
  • the first seat 330, second seat 340, and third seat 350 are configured to hold an action figure. Each seat causes a corresponding key 130 to be pressed when the seat is pushed downward toward the base 305. Typically, a seat is pushed downward to press a key when the user places an action figure in the seat.
  • the first seat 330 generally is the gunner's seat and is occupied by a gunner action figure. Pressing this seat produces comments by the gunner or fires the phasers if the commander has asked the gunner to fire them.
  • the first seat 330 is configured to rotate and is coupled to the toy gun 320 such that the toy gun 320 moves in response to rotation of the first seat 330.
  • the second seat 340 generally is occupied by a commander action figure. Pressing this seat produces comments or commands by the commander.
  • the third seat 350 generally is occupied by a first mate action figure. Pressing this seat produces the first mate's voice, arms the shields, or causes other actions depending on the situation.
  • Pressing the fire button 360 causes display of cannons being fired.
  • Movement of the pair of throttle levers 370 causes corresponding keys to be pressed.
  • the software responds by causing the processor to "light speed” transition the display to a selected destination.
  • the navigational button 380 is used to select the destination.
  • the first moveable platform 390 and the second moveable platform 395 are capable of being moved by the user downward toward the base 305.
  • the first movable platform 390 preferably provides a first station for a first robot action figure.
  • the second movable platform 395 preferably provides a second station for a second robot action figure. Downward movement of the platforms causes corresponding keys to be pressed. Movement of the second station produces comments by the second robot and a translation on the display. Movement of the first station produces comments or advice by the first robot.
  • the software for use in conjunction with the space-based playset device 300 combines "hands-on" action figure fantasy play with the richness and control of multimedia software.
  • the software permits the user to pilot a detailed model of a space ship to explore a fantasy universe. The user can seek guidance from favorite characters, and can battle enemies.
  • the software provides video clips associated with the characters represented by the action figures, permits the user to explore new areas of the fantasy universe, and permits the user to interact with favorite characters.
  • Pressing on an actuator associated with a particular character causes the character to say lines in response to the action on the computer's display.
  • these lines are non-repeating so that successively pressing down on one character will cause the character to say a sequence of related lines.
  • Characters may respond to statements by previous characters or to situations on the screen. For example, if the commander says "First mate, put up the shields!", pressing the first mate may cause him to verbally reply and activate the shields. If the first robot is pressed before the first mate, he might say "Oh please hurry, first mate."
  • the user can operate the navigation and other controls to simulate travel to dozens of destinations in the fantasy universe. Commands from the commander guide the user by instructing the user when to fire weapons or take other actions.
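The non-repeating character lines and context-dependent responses described above can be sketched as follows: successive presses on one character's actuator step through a sequence of related lines, while a pending command from another character changes the response. The class, dialogue, and cycling behavior are all assumptions made for the sketch.

```python
# Sketch of non-repeating character lines: each press returns the next
# line in a sequence, and a pending command from another character
# takes priority. All dialogue and structure are illustrative only.
class Character:
    def __init__(self, name, lines):
        self.name = name
        self._lines = lines
        self._next = 0

    def press(self, pending_command=None):
        """Return the next line; respond to a pending command first."""
        if pending_command is not None:
            return f"{self.name} responds to: {pending_command}"
        line = self._lines[self._next % len(self._lines)]
        self._next += 1
        return line

first_mate = Character("First mate", ["Aye!", "Shields ready.", "All set."])
```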

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Interaction between a user and a computer (105) is encouraged by using an input device (135) including actuators (140) formed as three-dimensional representations of physical objects. Software running on the computer (105) receives an input signal generated by actuation of a particular actuator (140) by the user, and provides feedback to the user. The feedback is consistent with the particular physical object represented by the particular actuator (140). For example, if the particular actuator (140) represents a saw (215), the software causes the computer (105) to display an audio-visual sequence representative of movement of the saw (215).

Description

INTERACTION BETWEEN SOFTWARE AND AN INPUT DEVICE
TECHNICAL FIELD
The invention relates to interactions between software, particularly software games, and an input device such as a mechanical keyboard overlay.
BACKGROUND
Computer keyboards are well known. Typically, a standard computer keyboard is connected to a computer and used as an input device to efficiently enter data. For those people who possess the requisite dexterity and typing skills, standard computer keyboards are well suited for quickly entering text. However, the standard computer keyboard is not well suited for young children with limited dexterity or those not skilled at typing. Nor is a standard keyboard adept at engaging the imagination of young children or others.
As an alternative to the standard keyboard, the use of a peripheral input device, such as a joystick, mouse or trackball, is well known. Although these peripheral input devices offer alternatives to the standard computer keyboard, they may be costly to purchase and maintain because of their inherent electronic complexity. Mechanical joystick devices which mechanically couple to an underlying keyboard are also well known. These typically are simpler than their electronic counterparts.
SUMMARY
The invention provides techniques for encouraging interaction between a user and a computer having an input device, such as a keyboard overlay, that includes actuators formed as three-dimensional representations of physical objects. For example, the actuators may represent a set of tools or objects in a play environment. The techniques extend the fantasy and imaginative play normally associated with multi-media software to the input devices used to control the software. This promises to substantially increase the entertainment value of the software, particularly for younger children who are not adept at using a keyboard or other traditional input device.
In one general aspect, interaction between a user and a computer is encouraged through use of an input device, such as a keyboard overlay, including actuators formed as three-dimensional representations of physical objects. Software running on the computer causes the computer to receive an input signal generated by actuation of a particular actuator by the user, and to provide feedback to the user. The feedback is consistent with the particular physical object represented by the particular actuator. Implementations may include one or more of the following features.
For example, providing feedback to the user may include displaying a video sequence representative of activity by the particular physical object. The video sequence may include an image of the particular physical object and may represent a change in a state of the particular physical object. Providing feedback to the user may include generating sound representative of the particular physical object. The sound may be representative of the change in the state of the particular physical object, and may be synchronized with the change in state in the video sequence. The sound may represent sound generated by the particular physical object, such as sound generated during use of the particular physical object. For example, when the particular physical object is a saw, the sound may represent the saw cutting wood. When the particular physical object represents a character, the sound may represent speech by the character.
The software may cause the computer to display an image of the particular physical object prior to actuation of the particular actuator. Feedback provided to the user may include modifying the image of the particular physical object to represent a change caused by actuation of the particular actuator.
The software may cause the computer to prompt the user to actuate the particular actuator by, for example, generating a spoken prompt. The spoken prompt does not necessarily specifically instruct the user to actuate the particular actuator. For example, when the particular actuator represents a character, the spoken prompt may tell the character to perform an action.
The software may cause the computer to display an animated character, and to generate the spoken prompts in the voice of the animated character. The software may further cause the computer to animate a face of the animated character to simulate speech by the character. The software may generate spoken prompts by the animated character to guide the user through use of different actuators of the input device. For example, when the actuators represent a set of tools, the spoken prompts by the animated character may guide the user through a project using the tools.
The software may permit the user to modify the image of the particular physical object. For example, the software may permit the user to select an image for the particular physical object from a set of images related to the particular physical object. Thus, when a particular actuator represents a saw, the software may permit the user to select an image from a set of images of different kinds of saws.
The input device may be a keyboard overlay including actuators formed as three-dimensional representations of physical objects. The keyboard overlay is configured such that actuation of an actuator presses at least one key of the keyboard.
The input device may have actuators formed as three-dimensional representations of a set of related physical objects, such as a set of tools. The software may cause the computer to display imagery indicating progress in completion of a project using the set of tools. Providing feedback to the user may include generating sounds associated with use of a tool represented by the particular actuator.
The tools may be, for example, a set of woodworking tools. The set of woodworking tools may include, for example, representations of a saw, a drill, a sander, a sprayer, a screwdriver and screw, and a hammer and nail. The actuators also may represent objects in a play environment. Feedback provided to the user may include imagery indicating progress in completion of an activity in the play environment. Feedback also may include generating sounds associated with an object represented by the actuator actuated by the user to produce the input signal. For example, when the objects are characters in the play environment, the feedback to the user may include speech in a voice associated with a character represented by the actuator.
The input device may have one or more unsecured components that are readily removable from the input device. For example, when the actuators represent objects in a play environment, an unsecured component may be in the form of a character in the play environment.
The unsecured component may be for use in actuating a certain actuator, and may be representative of an object normally used to manipulate an object represented by the actuator. For example, the unsecured component may be representative of a hammer or screwdriver and the associated actuator may be representative of a nail or screw.
An actuator may move in a way that mimics motion normally associated with a corresponding physical object. For example, when the corresponding object is a screw, and the actuator is formed as a three-dimensional representation of a screw, actuating the actuator may include turning the actuator.
The techniques may be implemented in computer hardware or software, or a combination of the two. However, the techniques are not limited to any particular hardware or software configuration; they may find applicability in any computing or processing environment that may be used for interactive games or other interactive activities. Preferably, the techniques are implemented in computer programs executing on programmable computers that each include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code is applied to data entered using the input device to perform the functions described and to generate output information. The output information is applied to the one or more output devices.
Each program is preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language.
Each such computer program is preferably stored on a storage medium or device (e.g., CD-ROM, hard disk or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform the procedures described in this document. The system may also be considered to be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner. Other features and advantages will be apparent from the following description, including the drawings, and from the claims.
DESCRIPTION OF DRAWINGS
Fig. 1 is a block diagram of a computer system that includes a keyboard overlay. Figs. 2 and 3 are, respectively, left and right perspective views of a keyboard overlay that provides a set of tools.
Fig. 4 is a perspective view of a keyboard overlay that provides a play environment.
DETAILED DESCRIPTION
Referring to Fig. 1, a computer system 100 includes a computer 105 having a processor, and a display 110 coupled to the computer. Software running on the processor causes the processor to transmit video images to the display 110 for display by the display 110. The software also causes the processor to transmit electrical signals to a set of speakers 115 attached to the computer. The electrical signals cause the speakers 115 to generate sounds in conjunction with the video images. The software may be loaded into the computer using a computer-readable medium inserted into a disk drive 120 of the computer.
A keyboard 125 is also coupled to the computer. The keyboard provides the processor with input signals in response to actuation of keys 130 of the keyboard. The particular signal provided to the processor corresponds to the particular key pressed by a user of the computer. The software causes the computer to generate the video images and associated sounds in response to signals from the keyboard 125. A keyboard overlay 135 attaches to the keyboard 125. The overlay 135 includes actuators 140 that provide three-dimensional representations of physical objects. Mechanisms within the overlay are coupled to the actuators 140 such that actuation of an actuator causes one or more keys 130 on the keyboard to be pressed. For example, an actuator 140 may constitute a toy saw that is manipulated like a real saw.
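The signal path just described — actuator motion, key press, then an audio-visual response matching the represented object — can be pictured as a small dispatch table. This is only an illustrative sketch, not the patent's implementation; the key assignments, object names, and clip names are invented for the example.

```python
class OverlayDispatcher:
    """Sketch of routing key presses (generated by overlay actuators 140)
    to feedback consistent with the represented physical object."""

    def __init__(self):
        # Maps a key code to (object name, video clip, sound clip).
        self._handlers = {}
        # Records the feedback "played"; stands in for the display and speakers.
        self.played = []

    def register(self, key, object_name, video, sound):
        self._handlers[key] = (object_name, video, sound)

    def on_key_press(self, key):
        """Called when an actuator presses a key; returns the object name."""
        if key not in self._handlers:
            return None
        name, video, sound = self._handlers[key]
        # In the real system these would drive the display 110 and speakers 115.
        self.played.append((video, sound))
        return name

# Hypothetical key assignments for two of the actuators.
dispatcher = OverlayDispatcher()
dispatcher.register("q", "saw", "saw_cutting_wood", "sawing_sound")
dispatcher.register("w", "sprayer", "paint_spraying", "spray_sound")
```

A key pressed by an unregistered actuator simply produces no feedback, mirroring the way the software responds only to the overlay's mapped keys.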
Referring to Figs. 2 and 3, a keyboard overlay 200 provides a set of actuators shaped to represent a set of tools. The overlay 200 is used in conjunction with software that permits a user (typically a young child) to manipulate the tools to build projects or take part in other activities which simulate use of the tools through audio-visual sequences produced by the computer for display and output by the display and the speakers.
The overlay 200 includes a housing 205 and a base 210. The overlay 200 is configured to fit on top of an underlying keyboard such as the keyboard 125 shown in Fig. 1. More particularly, the base 210 engages the keyboard 125 to couple the overlay 200 to the keyboard 125 in a known orientation. A strap (not shown) or other mechanism may be used to secure the overlay to the keyboard.
The housing 205 includes several actuators in the form of three-dimensional, representational objects. The actuators include a saw 215, a screw 220, a nail 225, a sander 230, a sprayer 235, and a drill 240. The housing 205 also includes a screwdriver 245 and a hammer 250 that are removable from the housing for use in manipulating, respectively, the screw 220 and the nail 225. A button 255 is also provided for use in controlling the computer. For example, the button 255 may be used to activate a help function of the software running on the computer. The actuators are all three-dimensional representational toy objects that appear realistic to the user and function in a manner similar to the objects they represent. For example, the screwdriver 245 may be used to turn the screw 220 and the hammer 250 may be used to pound in the nail 225. Similarly, the saw 215 is actuated by a horizontal forwards-and-backwards sawing motion, the sander 230 is actuated by a horizontal side-to-side sanding motion, the sprayer 235 is actuated by pressing a trigger 260 of the sprayer, and the drill 240 is actuated by turning a handle 265 of the drill.
Actuation of an actuator causes a corresponding key on the keyboard to be pressed. Typically, the software responds by causing the computer to display an image of the object represented by the actuator being used and to produce an associated sound. For example, pressing the trigger 260 of the sprayer 235 causes the computer to display movement of a displayed trigger of a displayed image of the sprayer, to display paint spraying from the sprayer onto a surface of a project being undertaken, and to produce the sound of paint spraying. The software running on the computer 105, in conjunction with the keyboard overlay 200, provides "hands on" fun for children or other users of the computer. An animated character displayed by the computer guides the user through a variety of projects and games using the tools of the keyboard overlay. The animated character does so by prompting the user as to what tool should be used to perform a particular task.
The software and associated overlay permit the user to physically interact with the virtual world provided on the computer display. Because of the interaction with the keyboard overlay, the tools and the physical action required to use them are the focus of the game, rather than the objects being built or the tasks being performed. To accentuate this point, the animated character guide may be configured to play up the importance of the tool set. For example, an introductory dialog by the animated character guide may say something like, "Hi kids. I'm Joe, welcome to my workshop ... Wow! ... That's a mighty fine workbench you have there!", with the animated character guide appearing to look out of the display at the keyboard overlay as the guide discovers the workbench represented by the overlay.
The software also may permit the user to select the tools to be displayed in correspondence with the actuators of the overlay 200. For example, the user may be able to indicate that a sledgehammer should be displayed instead of a standard-sized hammer for use in hammering the nail 225. The simulated use of the tool may vary based on the image selected.
The software also may provide humorous results when the user uses the wrong tool in response to a prompt or uses a particular tool for an excessive amount of time. For example, overuse of the sander 230 may cause wood being sanded to burst into flames.
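The overuse behavior above implies the software tracks how long a tool has been used continuously. A minimal sketch of such tracking follows; the 5-second threshold is an invented value, and timestamps are passed in explicitly (rather than read from a clock) purely to keep the sketch testable.

```python
class OveruseMonitor:
    """Sketch of detecting excessive continuous use of a tool, as in the
    sander example above (overuse makes the sanded wood burst into flames).
    The threshold value is an assumption, not taken from the patent."""

    def __init__(self, threshold=5.0):
        self.threshold = threshold
        self._started_at = None

    def use(self, now):
        """Register use at time `now`; return True once use becomes excessive."""
        if self._started_at is None:
            self._started_at = now  # start of a continuous stretch of use
        return (now - self._started_at) >= self.threshold

    def stop(self):
        """User released the tool; reset the continuous-use timer."""
        self._started_at = None

sander = OveruseMonitor(threshold=5.0)
```

When `use` first returns `True`, the software would trigger the humorous result (e.g., the flames animation) instead of the normal sanding feedback.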
Referring to Fig. 4, a space-based playset device 300 is also in the form of a keyboard overlay. When a user actuates specific three-dimensional representational objects of the space-based playset device 300, one or more corresponding keys 130 of the keyboard 125 are pressed. The space-based playset device 300 includes a base 305 which is coupled to the three-dimensional representational objects. The three-dimensional representational objects of the space-based playset device 300 include a stationary platform 310, a toy gun 320, a first seat 330, a second seat 340, a third seat 350, a fire button 360, a pair of throttle levers 370, a navigational button 380, a first movable platform 390, and a second movable platform 395.
The first seat 330, second seat 340, and third seat 350 are configured to hold an action figure. Each seat causes a corresponding key 130 to be pressed when the seat is pushed downward toward the base 305. Typically, a seat is pushed downward to press a key when the user places an action figure in the seat. The first seat 330 generally is the gunner's seat and is occupied by a gunner action figure. Pressing this seat produces comments by the gunner or fires the phasers if the commander has asked the gunner to fire them. The first seat 330 is configured to rotate and is coupled to the toy gun 320 such that the toy gun 320 moves in response to rotation of the first seat 330.
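The gunner's seat illustrates conditional feedback: the same key press produces either a comment or phaser fire, depending on whether the commander has issued an order. The sketch below is an illustrative reading of that behavior; the strings, method names, and the assumption that an order is consumed once executed are all invented.

```python
class GunnerSeat:
    """Sketch of the gunner's-seat behaviour described above: pressing the
    seat fires the phasers only if the commander has ordered it, and
    otherwise just produces a comment by the gunner."""

    def __init__(self):
        self._fire_ordered = False

    def commander_orders_fire(self):
        self._fire_ordered = True

    def press(self):
        if self._fire_ordered:
            self._fire_ordered = False  # assumption: the order is consumed once executed
            return "fire phasers"
        return "gunner comment"

gunner_seat = GunnerSeat()
```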
The second seat 340 generally is occupied by a commander action figure. Pressing this seat produces comments or commands by the commander.
The third seat 350 generally is occupied by a first mate action figure. Pressing this seat produces the first mate's voice, arms the shields, or causes other actions depending on the situation.
Pressing the fire button 360 causes display of cannons being fired.
Movement of the pair of throttle levers 370 causes corresponding keys to be pressed. The software responds by causing the display to make a "light speed" transition to a selected destination. The navigational button 380 is used to select the destination.
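The select-then-travel flow — choose a destination with the navigational button, then pull the throttle to trigger the transition — can be sketched as below. The destination names and the assumption that the button cycles through destinations in order are invented for illustration.

```python
class Navigator:
    """Sketch of the navigate-then-travel flow: the navigational button 380
    selects a destination, and the throttle levers 370 trigger the
    'light speed' transition to whichever destination is selected."""

    def __init__(self, destinations):
        self._destinations = destinations
        self._selected = 0  # start at the first destination

    def press_navigation_button(self):
        """Cycle to the next destination and return its name."""
        self._selected = (self._selected + 1) % len(self._destinations)
        return self._destinations[self._selected]

    def pull_throttle(self):
        """Trigger the transition to the currently selected destination."""
        return "light-speed transition to " + self._destinations[self._selected]

nav = Navigator(["home base", "asteroid field", "ice planet"])
```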
The first movable platform 390 and the second movable platform 395 are capable of being moved by the user downward toward the base 305. The first movable platform 390 preferably provides a first station for a first robot action figure. Similarly, the second movable platform 395 preferably provides a second station for a second robot action figure. Downward movement of the platforms causes corresponding keys to be pressed. Movement of the second station produces comments by the second robot and a translation on the display. Movement of the first station produces comments or advice by the first robot.
The software for use in conjunction with the space-based playset device 300 combines "hands-on" action figure fantasy play with the richness and control of multimedia software. The software permits the user to pilot a detailed model of a space ship to explore a fantasy universe. The user can seek guidance from favorite characters, and can battle enemies. The software provides video clips associated with the characters represented by the action figures, permits the user to explore new areas of the fantasy universe, and permits the user to interact with favorite characters.
Pressing on an actuator associated with a particular character causes the character to say lines in response to the action on the computer's display. Generally, these lines are non-repeating so that successively pressing down on one character will cause the character to say a sequence of related lines. Characters may respond to statements by previous characters or to situations on the screen. For example, if the commander says "First mate, put up the shields!", pressing the first mate may cause him to verbally reply and activate the shields. If the first robot is pressed before the first mate, he might say "Oh please hurry first mate."
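The non-repeating dialogue behavior — successive presses of one character's actuator stepping through a sequence of related lines — can be sketched as follows. The lines themselves are invented examples, and wrapping back to the first line after the sequence is exhausted is an assumption, not something the patent specifies.

```python
class CharacterLines:
    """Sketch of the non-repeating line behaviour: each press of a
    character's actuator returns the next line in a related sequence."""

    def __init__(self, lines):
        self._lines = lines
        self._next = 0

    def press(self):
        line = self._lines[self._next]
        # Assumption: wrap around after the last line in the sequence.
        self._next = (self._next + 1) % len(self._lines)
        return line

first_mate = CharacterLines([
    "Raising the shields!",
    "Shields are holding.",
    "Shields at half power!",
])
```

A fuller version could also consult shared game state (what the commander last said, what is on screen) before choosing a line, matching the context-dependent replies described above.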
The user can operate the navigation and other controls to simulate travel to dozens of destinations in the fantasy universe. Commands from the commander guide the user by instructing the user when to fire weapons or take other actions.
Other embodiments are within the scope of the following claims.

Claims

What is claimed is:
1. A computer-implemented method of encouraging interaction between a user and a computer, the method comprising: providing an input device including actuators formed as three-dimensional representations of physical objects, receiving an input signal generated by actuation of a particular actuator by the user, and providing feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
2. The method of claim 1, wherein providing feedback to the user comprises displaying a video sequence representative of activity by the particular physical object.
3. The method of claim 2, wherein the video sequence includes an image of the particular physical object and represents a change in a state of the particular physical object.
4. The method of claim 3, wherein providing feedback to the user further comprises generating sound representative of the change in the state of the particular physical object, the sound being synchronized with the change in state in the video sequence.
5. The method of claim 1, wherein providing feedback to the user further comprises generating sound representative of the particular physical object.
6. The method of claim 5, wherein the sound represents sound generated by the particular physical object.
7. The method of claim 6, wherein the sound represents sound generated during use of the particular physical object.
8. The method of claim 6, wherein the particular physical object represents a character and the sound represents speech by the character.
9. The method of claim 1, further comprising displaying an image of the particular physical object prior to actuation of the actuator, wherein providing feedback to the user further comprises modifying the image of the particular physical object to represent a change caused by actuation of the particular actuator.
10. The method of claim 9, further comprising prompting the user to actuate the particular actuator.
11. The method of claim 10, further comprising prompting the user to actuate the particular actuator by generating a spoken prompt.
12. The method of claim 11, wherein the spoken prompt does not specifically instruct the user to actuate the particular actuator.
13. The method of claim 12, wherein the particular actuator represents a character and the spoken prompt tells the character to perform an action.
14. The method of claim 11, further comprising displaying an animated character, wherein generating a spoken prompt further comprises generating the spoken prompt in a voice of the animated character and animating a face of the animated character to simulate speech by the character.
15. The method of claim 14, further comprising generating further spoken prompts by the animated character to guide the user through use of different actuators of the input device.
16. The method of claim 9, further comprising permitting the user to modify the image of the particular physical object.
17. The method of claim 16, wherein permitting the user to modify the image of the particular physical object comprises permitting the user to select an image for the particular physical object from a set of images related to the particular physical object.
18. The method of claim 1, wherein: the computer includes a keyboard having keys, and providing the input device comprises providing a keyboard overlay including actuators formed as three-dimensional representations of physical objects, the keyboard overlay being configured such that actuation of an actuator presses at least one key of the keyboard.
19. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of a set of related physical objects.
20. The method of claim 19, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of a set of tools.
21. The method of claim 20, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of a project using the set of tools.
22. The method of claim 21, wherein providing feedback to the user comprises generating sounds associated with use of a tool represented by the particular actuator.
23. The method of claim 20, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of a set of woodworking tools.
24. The method of claim 23, wherein providing a keyboard overlay includes providing a keyboard overlay having an actuator formed as a three-dimensional representation of a saw.
25. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of objects in a play environment.
26. The method of claim 25, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of an activity in the play environment.
27. The method of claim 26, wherein providing feedback to the user comprises generating sounds associated with the particular physical object.
28. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of characters in a play environment.
29. The method of claim 28, wherein providing feedback to the user comprises generating speech in a voice associated with a character represented by the particular actuator.
30. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having an unsecured component that is readily removable from the keyboard overlay.
31. The method of claim 30, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of objects in a play environment and an unsecured component in the form of a character in the play environment.
32. The method of claim 30, wherein providing a keyboard overlay includes providing a keyboard overlay having an unsecured component for use in actuating a certain actuator and being representative of an object normally used to manipulate an object represented by the certain actuator.
33. The method of claim 32, wherein the unsecured component is representative of a hammer and the certain actuator is representative of a nail.
34. The method of claim 32, wherein the unsecured component is representative of a screwdriver and the certain actuator is representative of a screw.
35. The method of claim 1, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of a set of related physical objects.
36. The method of claim 35, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of a set of tools.
37. The method of claim 36, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of a project using the set of tools.
38. The method of claim 37, wherein providing feedback to the user comprises generating sounds associated with use of a tool represented by the particular actuator.
39. The method of claim 36, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of a set of woodworking tools.
40. The method of claim 39, wherein providing an input device includes providing an input device having an actuator formed as a three-dimensional representation of a saw.
41. The method of claim 1, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of objects in a play environment.
42. The method of claim 41, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of an activity in the play environment.
43. The method of claim 42, wherein providing feedback to the user comprises generating sounds associated with the particular object.
44. The method of claim 1, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of characters in a play environment. - 17 -
45. The method of claim 44, wherein providing feedback to the user comprises generating speech in a voice associated with a character represented by the particular actuator.
46. The method of claim 1, wherein providing an input device includes providing an input device having an unsecured component that is readily removable from the input device.
47. The method of claim 46, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of objects in a play environment and an unsecured component in the form of a character in the play environment.
48. The method of claim 46, wherein providing an input device includes providing an input device having an unsecured component for use in actuating a certain actuator and being representative of an object normally used to manipulate an object represented by the certain actuator.
49. The method of claim 48, wherein the unsecured component is representative of a hammer and the certain actuator is representative of a nail.
50. The method of claim 48, wherein the unsecured component is representative of a screwdriver and the certain actuator is representative of a screw.
51. The method of claim 1, wherein the actuation of the particular actuator causes the particular actuator to move in a way that mimics motion normally associated with the particular physical object.
52. The method of claim 51, wherein the particular physical object comprises a screw, the particular actuator is formed as a three-dimensional representation of a screw, and actuating the actuator comprises turning the actuator.
53. A computer system designed to encourage interaction between a user and the computer system, the computer system comprising: a processor, a display connected to receive signals from the processor and to display video images in response to the signals, an input device including actuators formed as three-dimensional representations of physical objects, software running on the processor and operable to: cause the processor to receive an input signal generated by actuation of a particular actuator by the user, and provide feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
54. A computer program, residing on a computer readable medium, for a computer system comprising a processor, an input device including actuators formed as three-dimensional representations of physical objects, and a display, the computer program comprising instructions for encouraging interaction between a user and the computer by causing the processor to perform the following operations: receive an input signal generated by actuation of a particular actuator by the user, and provide feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
PCT/US1999/002420 1998-02-04 1999-02-04 Interaction between software and an input device WO1999039789A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
NZ506607A NZ506607A (en) 1998-02-04 1999-02-04 Interaction between software and input device
EP99904579A EP1154824A4 (en) 1998-02-04 1999-02-04 Interaction between software and an input device
AU24950/99A AU746932B2 (en) 1998-02-04 1999-02-04 Interaction between software and an input device
CA002320078A CA2320078A1 (en) 1998-02-04 1999-02-04 Interaction between software and an input device

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US7362298P 1998-02-04 1998-02-04
US60/073,622 1998-02-04
US09/018,691 US6322449B1 (en) 1998-02-04 1998-02-04 Mechanical interface device
US09/018,691 1998-02-04
US09/019,489 US5992817A (en) 1998-02-04 1998-02-04 Keyboard interface device
US09/019,489 1998-02-04

Publications (1)

Publication Number Publication Date
WO1999039789A1 true WO1999039789A1 (en) 1999-08-12

Family

ID=27361071

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/002420 WO1999039789A1 (en) 1998-02-04 1999-02-04 Interaction between software and an input device

Country Status (5)

Country Link
EP (1) EP1154824A4 (en)
AU (1) AU746932B2 (en)
CA (1) CA2320078A1 (en)
NZ (1) NZ506607A (en)
WO (1) WO1999039789A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5717423A (en) * 1994-12-30 1998-02-10 Merltec Innovative Research Three-dimensional display
US5805138A (en) * 1995-06-07 1998-09-08 International Business Machines Corporation Gross motion input controller for a computer system
US5818420A (en) * 1996-07-31 1998-10-06 Nippon Hoso Kyokai 3D object graphics display device, 3D object graphics display method, and manipulator for 3D object graphics display

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4704940A (en) * 1984-09-05 1987-11-10 Cummings Darold B Computer keyboard adaptor
DE3838362C1 (en) * 1988-11-11 1990-01-11 Nixdorf Computer Ag, 4790 Paderborn, De
US5667319A (en) * 1995-03-17 1997-09-16 Satloff; James Simplified computer keyboard
DE19606467A1 (en) * 1996-02-21 1997-08-28 Norbert Lorenz Keyboard operating method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1154824A4 *

Also Published As

Publication number Publication date
NZ506607A (en) 2002-04-26
EP1154824A1 (en) 2001-11-21
CA2320078A1 (en) 1999-08-12
AU746932B2 (en) 2002-05-09
EP1154824A4 (en) 2007-06-20
AU2495099A (en) 1999-08-23

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA DE GB JP NZ

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 24950/99

Country of ref document: AU

ENP Entry into the national phase

Ref document number: 2320078

Country of ref document: CA

Ref country code: CA

Ref document number: 2320078

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 506607

Country of ref document: NZ

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWE Wipo information: entry into national phase

Ref document number: 1999904579

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1999904579

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 24950/99

Country of ref document: AU