AU746932B2 - Interaction between software and an input device - Google Patents
- Publication number
- AU746932B2 (Application AU24950/99A)
- Authority
- AU
- Australia
- Prior art keywords
- providing
- user
- actuator
- input device
- physical object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
- G06F3/0205—Lever arrangements for operating keyboard cursor control keys in a joystick-like manner
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/22—Setup operations, e.g. calibration, key configuration or button assignment
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
- A63F13/245—Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/54—Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
- G06F3/0219—Special purpose keyboards
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1018—Calibration; Key and button assignment
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1062—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to a type of game, e.g. steering wheel
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6063—Methods for processing data by generating or executing the game program for sound processing
- A63F2300/6081—Methods for processing data by generating or executing the game program for sound processing generating an output signal, e.g. under timing constraints, for spatialization
Description
INTERACTION BETWEEN SOFTWARE AND AN INPUT DEVICE
TECHNICAL FIELD
The invention relates to interactions between software, particularly software games, and an input device such as a mechanical keyboard overlay.
BACKGROUND
Computer keyboards are well known. Typically, a standard computer keyboard is connected to a computer and used as an input device to efficiently enter data. For those people who possess the requisite dexterity and typing skills, standard computer keyboards are well suited for quickly entering text. However, the standard computer keyboard is not well suited for young children with limited dexterity or those not skilled at typing. Nor is a standard keyboard adept at engaging the imagination of young children or others.
As an alternative to the standard keyboard, the use of a peripheral input device, such as a joystick, mouse or trackball, is well known. Although these peripheral input devices offer alternatives to the standard computer keyboard, they may be costly to purchase and maintain because of their inherent electronic complexity. Mechanical joystick devices which mechanically couple to an underlying keyboard are also well known. These typically are simpler than their electronic counterparts.
SUMMARY
According to the present invention there is provided a computer-implemented method of encouraging interaction between a user and a computer, the method comprising: providing an input device including actuators formed as three-dimensional representations of physical objects, receiving an input signal generated by actuation of a particular actuator by the user, and providing feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
The invention also provides a computer system designed to encourage interaction between a user and the computer system, the computer system comprising: a processor, a display connected to receive signals from the processor and to display video images in response to the signals, an input device including actuators formed as three-dimensional representations of physical objects, software running on the processor and operable to: cause the processor to receive an input signal generated by actuation of a particular actuator by the user, and provide feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
The invention also provides a computer program, residing on a computer readable medium, for a computer system comprising a processor, an input device including actuators formed as three-dimensional representations of physical objects, and a display, the computer program comprising instructions for encouraging interaction between a user and the computer by causing the processor to perform the following operations: receive an input signal generated by actuation of a particular actuator by the user, and provide feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
WO 99/39789 PCT/US99/02420

The computer may have an input device, such as a keyboard overlay, that includes actuators formed as three-dimensional representations of physical objects. For example, the actuators may represent a set of tools or objects in a play environment. The techniques extend the fantasy and imaginative play normally associated with multi-media software to the input devices used to control the software. This promises to substantially increase the entertainment value of the software, particularly for younger children who are not adept at using a keyboard or other traditional input device.
In one general aspect, interaction between a user and a computer is encouraged through use of an input device, such as a keyboard overlay, including actuators formed as three-dimensional representations of physical objects. Software running on the computer causes the computer to receive an input signal generated by actuation of a particular actuator by the user, and to provide feedback to the user. The feedback is consistent with the particular physical object represented by the particular actuator.
Implementations may include one or more of the following features.
For example, providing feedback to the user may include displaying a video sequence representative of activity by the particular physical object. The video sequence may include an image of the particular physical object and may represent a change in a state of the particular physical object.
Providing feedback to the user may include generating sound representative of the particular physical object. The sound may be representative of the change in the state of the particular physical object, and may be synchronized with the change in state in the video sequence. The sound may represent sound generated by the particular physical object, such as sound generated during use of the particular physical object. For example, when the particular physical object is a saw, the sound may represent the saw cutting wood. When the particular physical object represents a character, the sound may represent speech by the character.
The software may cause the computer to display an image of the particular physical object prior to actuation of the particular actuator. Feedback provided to the user may include modifying the image of the particular physical object to represent a change caused by actuation of the particular actuator.
The software may cause the computer to prompt the user to actuate the particular actuator by, for example, generating a spoken prompt. The spoken prompt does not necessarily specifically instruct the user to actuate the particular actuator. For example, when the particular actuator represents a character, the spoken prompt may tell the character to perform an action.
The software may cause the computer to display an animated character, and to generate the spoken prompts in the voice of the animated character. The software may further cause the computer to animate a face of the animated character to simulate speech by the character. The software may generate spoken prompts by the animated character to guide the user through use of different actuators of the input device. For example, when the actuators represent a set of tools, the spoken prompts by the animated character may guide the user through a project using the tools.
The software may permit the user to modify the image of the particular physical object. For example, the software may permit the user to select an image for the particular physical object from a set of images related to the particular physical object. Thus, when a particular actuator represents a saw, the software may permit the user to select an image from a set of images of different kinds of saws.
The input device may be a keyboard overlay including actuators formed as three-dimensional representations of physical objects. The keyboard overlay is configured such that actuation of an actuator presses at least one key of the keyboard.
The input device may have actuators formed as three-dimensional representations of a set of related physical objects, such as a set of tools. The software may cause the computer to display imagery indicating progress in completion of a project using the set of tools. Providing feedback to the user may include generating sounds associated with use of a tool represented by the particular actuator.
The tools may be, for example, a set of woodworking tools. The set of woodworking tools may include, for example, representations of a saw, a drill, a sander, a sprayer, a screwdriver and screw, and a hammer and nail.
The actuators also may represent objects in a play environment.
Feedback provided to the user may include imagery indicating progress in completion of an activity in the play environment. Feedback also may include generating sounds associated with an object represented by the actuator actuated by the user to produce the input signal. For example, when the objects are characters in the play environment, the feedback to the user may include speech in a voice associated with a character represented by the actuator.
The input device may have one or more unsecured components that are readily removable from the input device. For example, when the actuators represent objects in a play environment, an unsecured component may be in the form of a character in the play environment.
The unsecured component may be for use in actuating a certain actuator, and may be representative of an object normally used to manipulate an object represented by the actuator. For example, the unsecured component may be representative of a hammer or screwdriver and the associated actuator may be representative of a nail or screw.
An actuator may move in a way that mimics motion normally associated with a corresponding physical object. For example, when the corresponding object is a screw, and the actuator is formed as a three-dimensional representation of a screw, actuating the actuator may include turning the actuator.
The techniques may be implemented in computer hardware or software, or a combination of the two. However, the techniques are not limited to any particular hardware or software configuration; they may find applicability in any computing or processing environment that may be used for interactive games or other interactive activities. Preferably, the techniques are implemented in computer programs executing on programmable computers that each include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code is applied to data entered using the input device to perform the functions described and to generate output information. The output information is applied to the one or more output devices.
Each program is preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system.
However, the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language.
Each such computer program is preferably stored on a storage medium or device (e.g., CD-ROM, hard disk or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform the procedures described in this document. The system may also be considered to be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner.
Other features and advantages will be apparent from the following description, including the drawings, and from the claims.
DESCRIPTION OF DRAWINGS
Fig. 1 is a block diagram of a keyboard overlay.
Figs. 2 and 3 are, respectively, left and right perspective views of a keyboard overlay that provides a set of tools.
Fig. 4 is a perspective view of a keyboard overlay that provides a play environment.
DETAILED DESCRIPTION
Referring to Fig. 1, a computer system 100 includes a computer 105 having a processor, and a display 110 coupled to the computer. Software running on the processor causes the processor to transmit video images to the display 110 for display by the display 110. The software also causes the processor to transmit electrical signals to a set of speakers 115 attached to the computer. The electrical signals cause the speakers 115 to generate sounds in conjunction with the video images. The software may be loaded into the computer using a computer-readable medium inserted into a disk drive 120 of the computer.
A keyboard 125 is also coupled to the computer. The keyboard provides the processor with input signals in response to actuation of keys 130 of the keyboard. The particular signal provided to the processor corresponds to the particular key pressed by a user of the computer. The software causes the computer to generate the video images and associated sounds in response to signals from the keyboard 125.
A keyboard overlay 135 attaches to the keyboard 125. The overlay 135 includes actuators 140 that provide three-dimensional representations of physical objects. Mechanisms within the overlay are coupled to the actuators 140 such that actuation of an actuator causes one or more keys 130 on the keyboard to be pressed. For example, an actuator 140 may constitute a toy saw that is manipulated like a real saw.
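The patent does not specify an implementation, but the mapping it describes — a physical actuation pressing an ordinary key, which the software then interprets — might be sketched as follows. All key bindings and names here are hypothetical, for illustration only:

```python
# Hypothetical sketch: the overlay turns physical actuations into ordinary
# key presses, so the software only needs a key-to-actuator lookup table.
ACTUATOR_KEYS = {
    "s": "saw",    # assumed binding: sawing motion presses the 's' key
    "d": "drill",  # assumed binding: turning the drill handle presses 'd'
    "n": "nail",   # assumed binding: hammering the nail presses 'n'
}

def handle_key(key: str) -> str:
    """Return the feedback action for a key press arriving from the overlay."""
    actuator = ACTUATOR_KEYS.get(key)
    if actuator is None:
        return "ignore"           # key not covered by the overlay
    return f"play:{actuator}"     # trigger the video/sound for that object
```

A press on a key outside the overlay's mechanism is simply ignored, so the same software could in principle run with or without the overlay attached.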
Referring to Figs. 2 and 3, a keyboard overlay 200 provides a set of actuators shaped to represent a set of tools. The overlay 200 is used in conjunction with software that permits a user (typically a young child) to manipulate the tools to build projects or take part in other activities which simulate use of the tools through audio-visual sequences produced by the computer for display and output by the display and the speakers.
The overlay 200 includes a housing 205 and a base 210. The overlay 200 is configured to fit on top of an underlying keyboard such as the keyboard 125 shown in Fig. 1. More particularly, the base 210 engages the keyboard 125 to couple the overlay 200 to the keyboard 125 in a known orientation. A strap (not shown) or other mechanism may be used to secure the overlay to the keyboard.
The housing 205 includes several actuators in the form of three-dimensional, representational objects. The actuators include a saw 215, a screw 220, a nail 225, a sander 230, a sprayer 235, and a drill 240. The housing 205 also includes a screwdriver 245 and a hammer 250 that are removable from the housing for use in manipulating, respectively, the screw 220 and the nail 225. A button 255 is also provided for use in controlling the computer. For example, the button 255 may be used to activate a help function of the software running on the computer.
The actuators are all three-dimensional representational toy objects that appear realistic to the user and function in a manner similar to the objects they represent. For example, the screwdriver 245 may be used to turn the screw 220 and the hammer 250 may be used to pound in the nail 225. Similarly, the saw 215 is actuated by a horizontal forwards-and-backwards sawing motion, the sander 230 is actuated by a horizontal side-to-side sanding motion, the sprayer 235 is actuated by pressing a trigger 260 of the sprayer, and the drill 240 is actuated by turning a handle 265 of the drill.
Actuation of an actuator causes a corresponding key on the keyboard to be pressed. Typically, the software responds by causing the computer to display an image of the object represented by the actuator being used and to produce an associated sound. For example, pressing the trigger 260 of the sprayer 235 causes the computer to display movement of a displayed trigger of a displayed image of the sprayer, to display paint spraying from the sprayer and onto a surface of a project being undertaken, and to produce the sound of paint spraying.
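The synchronized feedback described above — a change of state in the displayed object paired with its matching sound — could be modeled as a single function that mutates project state and returns both cues together. This is a minimal sketch under assumed names, not the patent's actual implementation:

```python
# Hypothetical sketch: one actuation updates the project's state and yields
# a (video cue, sound cue) pair so the two can be emitted in sync.
def feedback_for(actuator: str, state: dict) -> tuple:
    if actuator == "sprayer":
        state["painted"] = True                       # project is now painted
        return ("animate:spray_paint", "sound:spraying")
    if actuator == "saw":
        state["cuts"] = state.get("cuts", 0) + 1      # one more saw cut made
        return ("animate:saw_cut", "sound:sawing")
    return ("animate:idle", "sound:none")             # unrecognized actuator
```

Returning both cues from the same state transition is one simple way to guarantee the sound stays synchronized with the change of state in the video sequence, as the summary requires.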
The software running on the computer 105, in conjunction with the keyboard overlay 200, provides "hands on" fun for children or other users of the computer. An animated character displayed by the computer guides the user through a variety of projects and games using the tools of the keyboard overlay.
The animated character does so by prompting the user as to what tool should be used to perform a particular task.
The software and associated overlay permits the user to physically interact with the virtual world provided on the computer display. Because of the interaction with the keyboard overlay, the tools and the physical action required to use them are the focus of the game, rather than the objects being built or the tasks being performed. To accentuate this point, the animated character guide may be configured to play up the importance of the tool set. For example, an introductory dialog by the animated character guide may say something like, "Hi kids! I'm Joe. Welcome to my workshop. Wow! That's a mighty fine work bench you have there!", with the animated character guide appearing to look out of the display at the keyboard overlay as the guide discovers the work bench represented by the overlay.
The software also may permit the user to select the tools to be displayed in correspondence with the actuators of the overlay 200. For example, the user may be able to indicate that a sledgehammer should be displayed instead of a standard-sized hammer for use in hammering the nail 225. The simulated use of the tool may vary based on the image selected.
The software also may provide humorous results when the user uses the wrong tool in response to a prompt or uses a particular tool for an excessive amount of time. For example, overuse of the sander 230 may cause wood being sanded to burst into flames.
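The wrong-tool and overuse behaviors just described could be tracked with a simple per-tool counter. The threshold value and all names below are hypothetical illustrations, not taken from the patent:

```python
# Hypothetical sketch: count consecutive actuations of the expected tool;
# a wrong tool or an excessive run triggers a humorous response instead.
OVERUSE_THRESHOLD = 10  # assumed limit before the comic "overuse" result

class ToolTracker:
    def __init__(self):
        self.counts = {}  # consecutive-use count per tool

    def actuate(self, tool: str, expected: str) -> str:
        if tool != expected:
            self.counts.clear()
            return "humorous_wrong_tool"      # e.g., a comic mishap animation
        self.counts[tool] = self.counts.get(tool, 0) + 1
        if self.counts[tool] > OVERUSE_THRESHOLD:
            return "humorous_overuse"         # e.g., sanded wood catches fire
        return "normal_feedback"
```

Clearing the counter on a wrong tool keeps the overuse gag tied to a single uninterrupted run, which matches the "excessive amount of time" framing in the text.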
Referring to Fig. 4, a space-based playset device 300 is also in the form of a keyboard overlay. When a user actuates specific three-dimensional representational objects of the space-based playset device 300, one or more corresponding keys 130 of the keyboard 125 are pressed. The space-based playset device 300 includes a base 305 which is coupled to the three-dimensional representational objects. The three-dimensional representational objects of the space-based playset device 300 include a stationary platform 310, a toy gun 320, a first seat 330, a second seat 340, a third seat 350, a fire button 360, a pair of throttle levers 370, a navigation orb 380, a first movable platform 390, and a second movable platform 395.
The first seat 330, second seat 340, and third seat 350 are configured to hold an action figure. Each seat causes a corresponding key 130 to be pressed when the seat is pushed downward toward the base 305. Typically, a seat is pushed downward to press a key when the user places an action figure in the seat.
The first seat 330 generally is the gunner's seat and is occupied by a gunner action figure. Pressing this seat produces comments by the gunner or fires the phasers if the commander has asked the gunner to fire them. The first seat 330 is configured to rotate and is coupled to the toy gun 320 such that the toy gun 320 moves in response to rotation of the first seat 330.
The second seat 340 generally is occupied by a commander action figure. Pressing this seat produces comments or commands by the commander.
The third seat 350 generally is occupied by a first mate action figure.
Pressing this seat produces the first mate's voice, arms the shields, or causes other actions depending on the situation.
Pressing the fire button 360 causes display of cannons being fired.
Movement of the pair of throttle levers 370 causes corresponding keys to be pressed. The software responds by causing the processor to perform a "light speed" transition of the display to a selected destination. The navigation orb 380 is used to select the destination.
The first movable platform 390 and the second movable platform 395 are capable of being moved by the user downward toward the base 305. The first movable platform 390 preferably provides a first station for a first robot action figure. Similarly, the second movable platform 395 preferably provides a second station for a second robot action figure. Downward movement of the platforms causes corresponding keys to be pressed. Movement of the second station produces comments by the second robot and a translation on the display. Movement of the first station produces comments or advice by the first robot.
The software for use in conjunction with the space-based playset device 300 combines "hands-on" action figure fantasy play with the richness and control of multimedia software. The software permits the user to pilot a detailed model of a space ship to explore a fantasy universe. The user can seek guidance from favorite characters, and can battle enemies. The software provides video clips associated with the characters represented by the action figures, permits the user to explore new areas of the fantasy universe, and permits the user to interact with favorite characters.
Pressing on an actuator associated with a particular character causes the character to say lines in response to the action on the computer's display. Generally, these lines are non-repeating so that successively pressing down on one character will cause the character to say a sequence of related lines. Characters may respond to statements by previous characters or to situations on the screen. For example, if the commander says "First mate, put up the shields!", pressing the first mate may cause him to verbally reply and activate the shields. If the first robot is pressed before the first mate, he might say "Oh please hurry first mate".
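The non-repeating line behavior — successive presses yielding successive related lines — can be captured by giving each character a rotating queue of dialogue. This is an illustrative sketch only; the character names and lines are invented:

```python
from collections import deque

# Hypothetical sketch of the non-repeating dialogue behavior: each press of
# a character's actuator yields the next line in that character's queue.
class Character:
    def __init__(self, name: str, lines: list):
        self.name = name
        self.lines = deque(lines)

    def press(self) -> str:
        line = self.lines[0]
        self.lines.rotate(-1)  # advance so successive presses give new lines
        return line
```

Rotating rather than popping means the sequence eventually cycles, so the character never falls silent; responses conditioned on other characters or on-screen events would layer on top of this basic mechanism.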
The user can operate the navigation and other controls to simulate travel to dozens of destinations in the fantasy universe. Commands from the commander guide the user by instructing the user when to fire weapons or take other actions.
Other embodiments are within the scope of the following claims.
Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
Claims (45)
1. A computer-implemented method of encouraging interaction between a user and a computer, the method comprising: providing an input device including actuators formed as three-dimensional representations of physical objects, receiving an input signal generated by actuation of a particular actuator by the user, and providing feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
2. The method of claim 1, wherein providing feedback to the user comprises displaying a video sequence representative of activity by the particular physical object.
3. The method of claim 2, wherein the video sequence includes an image of the particular physical object and represents a change in a state of the particular physical object.
4. The method of claim 3, wherein providing feedback to the user further comprises generating sound representative of the change in the state of the particular physical object, the sound being synchronized with the change in state in the video sequence.
5. The method of claim 1, wherein providing feedback to the user further comprises generating sound representative of the particular physical object.
6. The method of claim 5, wherein the sound represents sound generated by the particular physical object.
7. The method of claim 6, wherein the sound represents sound generated during use of the particular physical object.

8. The method of claim 6, wherein the particular physical object represents a character and the sound represents speech by the character.

9. The method of claim 1, further comprising displaying an image of the particular physical object prior to actuation of the actuator, wherein providing feedback to the user further comprises modifying the image of the particular physical object to represent a change caused by actuation of the particular actuator.

10. The method of claim 9, further comprising prompting the user to actuate the particular actuator.

11. The method of claim 10, further comprising prompting the user to actuate the particular actuator by generating a spoken prompt.

12. The method of claim 11, wherein the spoken prompt does not specifically instruct the user to actuate the particular actuator.
13. The method of claim 12, wherein the particular actuator represents a character and the spoken prompt tells the character to perform an action.
14. The method of claim 11, further comprising displaying an animated character, wherein generating a spoken prompt further comprises generating the spoken prompt in a voice of the animated character and animating a face of the animated character to simulate speech by the character.
15. The method of claim 14, further comprising generating further spoken prompts by the animated character to guide the user through use of different actuators of the input device.
16. The method of claim 9, further comprising permitting the user to modify the image of the particular physical object.
17. The method of claim 16, wherein permitting the user to modify the image of the particular physical object comprises permitting the user to select an image for the particular physical object from a set of images related to the particular physical object.
18. The method of claim 1, wherein: the computer includes a keyboard having keys, and providing the input device comprises providing a keyboard overlay including actuators formed as three-dimensional representations of physical objects, the keyboard overlay being configured such that actuation of an actuator presses at least one key of the keyboard.
19. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of a set of related physical objects.
20. The method of claim 19, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of a set of tools.
21. The method of claim 20, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of a project using the set of tools.
22. The method of claim 21, wherein providing feedback to the user comprises generating sounds associated with use of a tool represented by the particular actuator.
23. The method of claim 20, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of a set of woodworking tools.
24. The method of claim 23, wherein providing a keyboard overlay includes providing a keyboard overlay having an actuator formed as a three-dimensional representation of a saw.
25. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of objects in a play environment.
26. The method of claim 25, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of an activity in the play environment.
27. The method of claim 26, wherein providing feedback to the user comprises generating sounds associated with the particular physical object.
28. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of characters in a play environment.
29. The method of claim 28, wherein providing feedback to the user comprises generating speech in a voice associated with a character represented by the particular actuator.
30. The method of claim 18, wherein providing a keyboard overlay includes providing a keyboard overlay having an unsecured component that is readily removable from the keyboard overlay.
31. The method of claim 30, wherein providing a keyboard overlay includes providing a keyboard overlay having actuators formed as three-dimensional representations of objects in a play environment and an unsecured component in the form of a character in the play environment.
32. The method of claim 30, wherein providing a keyboard overlay includes providing a keyboard overlay having an unsecured component for use in actuating a certain actuator and being representative of an object normally used to manipulate an object represented by the certain actuator.
33. The method of claim 32, wherein the unsecured component is representative of a hammer and the certain actuator is representative of a nail.
34. The method of claim 32, wherein the unsecured component is representative of a screwdriver and the certain actuator is representative of a screw.
35. The method of claim 1, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of a set of related physical objects.
36. The method of claim 35, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of a set of tools.
37. The method of claim 36, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of a project using the set of tools.
38. The method of claim 37, wherein providing feedback to the user comprises generating sounds associated with use of a tool represented by the particular actuator.
39. The method of claim 36, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of a set of woodworking tools.
40. The method of claim 39, wherein providing an input device includes providing an input device having an actuator formed as a three-dimensional representation of a saw.
41. The method of claim 1, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of objects in a play environment.
42. The method of claim 41, wherein providing feedback to the user comprises displaying imagery indicating progress in completion of an activity in the play environment.
43. The method of claim 42, wherein providing feedback to the user comprises generating sounds associated with the particular object.
44. The method of claim 1, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of characters in a play environment.
45. The method of claim 44, wherein providing feedback to the user comprises generating speech in a voice associated with a character represented by the particular actuator.
46. The method of claim 1, wherein providing an input device includes providing an input device having an unsecured component that is readily removable from the input device.
47. The method of claim 46, wherein providing an input device includes providing an input device having actuators formed as three-dimensional representations of objects in a play environment and an unsecured component in the form of a character in the play environment.
48. The method of claim 46, wherein providing an input device includes providing an input device having an unsecured component for use in actuating a certain actuator and being representative of an object normally used to manipulate an object represented by the certain actuator.
49. The method of claim 48, wherein the unsecured component is representative of a hammer and the certain actuator is representative of a nail.
50. The method of claim 48, wherein the unsecured component is representative of a screwdriver and the certain actuator is representative of a screw.
51. The method of claim 1, wherein the actuation of the particular actuator causes the particular actuator to move in a way that mimics motion normally associated with the particular physical object.
52. The method of claim 51, wherein the particular physical object comprises a screw, the particular actuator is formed as a three-dimensional representation of a screw, and actuating the actuator comprises turning the actuator.
53. A computer system designed to encourage interaction between a user and the computer system, the computer system comprising: a processor, a display connected to receive signals from the processor and to display video images in response to the signals, an input device including actuators formed as three-dimensional representations of physical objects, software running on the processor and operable to: cause the processor to receive an input signal generated by actuation of a particular actuator by the user, and provide feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
54. A computer program, residing on a computer readable medium, for a computer system comprising a processor, an input device including actuators formed as three-dimensional representations of physical objects, and a display, the computer program comprising instructions for encouraging interaction between a user and the computer by causing the processor to perform the following operations: receive an input signal generated by actuation of a particular actuator by the user, and provide feedback to the user, the feedback being consistent with the particular physical object represented by the particular actuator.
55. A computer-implemented method of encouraging interaction between a user and a computer substantially as hereinbefore described with reference to the accompanying drawings.
56. A computer system designed to encourage interaction between a user and the computer system substantially as hereinbefore described with reference to the accompanying drawings.
57. A computer program, residing on a computer readable medium, for a computer system substantially as hereinbefore described with reference to the accompanying drawings.

DATED this 28th day of February, 2002
KLITSNER INDUSTRIAL DESIGN, LLC
By its Patent Attorneys
DAVIES COLLISON CAVE
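The interaction recited in claim 1 (receive an input signal produced by actuating an object-shaped actuator, then provide feedback consistent with the represented physical object) can be illustrated with a minimal sketch. This is not part of the patent text; every identifier and file name below is invented for the example, and a real system would route the selected feedback to video and audio playback routines.

```python
# Hypothetical mapping from the signal emitted by each three-dimensional
# actuator (e.g. a key code pressed through a keyboard overlay) to feedback
# consistent with the physical object that actuator represents.
ACTUATOR_FEEDBACK = {
    "KEY_SAW": {"video": "saw_cutting_board.avi", "sound": "sawing.wav"},
    "KEY_HAMMER": {"video": "hammer_driving_nail.avi", "sound": "hammering.wav"},
}

def handle_input(scan_code):
    """Receive an input signal and select object-consistent feedback.

    Returns a (video, sound) pair for a mapped actuator, or None when the
    signal did not originate from one of the object-shaped actuators.
    """
    feedback = ACTUATOR_FEEDBACK.get(scan_code)
    if feedback is None:
        return None
    return feedback["video"], feedback["sound"]
```

For example, actuating the saw-shaped actuator would yield the sawing video and sound pair, mirroring claims 22 and 24.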
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US7362298P | 1998-02-04 | 1998-02-04 | |
US09/018691 | 1998-02-04 | ||
US09/018,691 US6322449B1 (en) | 1998-02-04 | 1998-02-04 | Mechanical interface device |
US09/019489 | 1998-02-04 | ||
US09/019,489 US5992817A (en) | 1998-02-04 | 1998-02-04 | Keyboard interface device |
US60/073622 | 1998-02-04 | ||
PCT/US1999/002420 WO1999039789A1 (en) | 1998-02-04 | 1999-02-04 | Interaction between software and an input device |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2495099A AU2495099A (en) | 1999-08-23 |
AU746932B2 true AU746932B2 (en) | 2002-05-09 |
Family
ID=27361071
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU24950/99A Ceased AU746932B2 (en) | 1998-02-04 | 1999-02-04 | Interaction between software and an input device |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1154824A4 (en) |
AU (1) | AU746932B2 (en) |
CA (1) | CA2320078A1 (en) |
NZ (1) | NZ506607A (en) |
WO (1) | WO1999039789A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5204511A (en) * | 1988-11-11 | 1993-04-20 | Siemens Nixdorf Informationssysteme AG | Alphanumeric keyboard |
US5667319A (en) * | 1995-03-17 | 1997-09-16 | Satloff; James | Simplified computer keyboard |
US5717423A (en) * | 1994-12-30 | 1998-02-10 | Merltec Innovative Research | Three-dimensional display |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4704940A (en) * | 1984-09-05 | 1987-11-10 | Cummings Darold B | Computer keyboard adaptor |
US5805138A (en) * | 1995-06-07 | 1998-09-08 | International Business Machines Corporation | Gross motion input controller for a computer system |
DE19606467A1 (en) * | 1996-02-21 | 1997-08-28 | Norbert Lorenz | Keyboard operating method |
US5818420A (en) * | 1996-07-31 | 1998-10-06 | Nippon Hoso Kyokai | 3D object graphics display device, 3D object graphics display method, and manipulator for 3D object graphics display |
-
1999
- 1999-02-04 EP EP99904579A patent/EP1154824A4/en not_active Withdrawn
- 1999-02-04 NZ NZ506607A patent/NZ506607A/en not_active IP Right Cessation
- 1999-02-04 WO PCT/US1999/002420 patent/WO1999039789A1/en active IP Right Grant
- 1999-02-04 CA CA002320078A patent/CA2320078A1/en not_active Abandoned
- 1999-02-04 AU AU24950/99A patent/AU746932B2/en not_active Ceased
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5204511A (en) * | 1988-11-11 | 1993-04-20 | Siemens Nixdorf Informationssysteme AG | Alphanumeric keyboard |
US5717423A (en) * | 1994-12-30 | 1998-02-10 | Merltec Innovative Research | Three-dimensional display |
US5667319A (en) * | 1995-03-17 | 1997-09-16 | Satloff; James | Simplified computer keyboard |
Also Published As
Publication number | Publication date |
---|---|
NZ506607A (en) | 2002-04-26 |
EP1154824A1 (en) | 2001-11-21 |
AU2495099A (en) | 1999-08-23 |
WO1999039789A1 (en) | 1999-08-12 |
EP1154824A4 (en) | 2007-06-20 |
CA2320078A1 (en) | 1999-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Bartneck et al. | The robot engine—Making the unity 3D game engine work for HRI | |
US7507158B2 (en) | Force feedback mechanism for gamepad device | |
Orozco et al. | The role of haptics in games | |
US20020068626A1 (en) | Method related to object control of video game | |
WO1999062605A1 (en) | Recorded medium and entertainment system | |
Sreedharan et al. | 3D input for 3D worlds | |
Chong | Basics animation 02: Digital animation | |
US20050079914A1 (en) | Information processing method | |
CN112121417B (en) | Event processing method, device, equipment and storage medium in virtual scene | |
Shim et al. | FS-Pad: Video game interactions using force feedback gamepad | |
Flemming et al. | How to take a brake from embodied locomotion–seamless status control methods for seated leaning interfaces | |
US6322449B1 (en) | Mechanical interface device | |
Thorpe et al. | History and alternative game input methods | |
Jochum et al. | Programming play: puppets, robots, and engineering | |
AU746932B2 (en) | Interaction between software and an input device | |
Röber et al. | Interactive audiobooks: combining narratives with game elements | |
Malaka et al. | Using Natural User Interfaces for Previsualization. | |
Ntokos | Techniques on multiplatform movement and interaction systems in a virtual reality context for games | |
Llobera et al. | Physics-based character animation for Virtual Reality | |
Hendricks et al. | EEG: the missing gap between controllers and gestures | |
KR102146375B1 (en) | Platform system for robot simulator based on mixed reality | |
Winter et al. | Creating an Immersive Augmented Reality Experience | |
Fenollosa i Ángeles | Development of a virtual reality puzzle video game | |
Tran | Magic Interaction Design: Training people to use non-natural and hyper-natural interaction with VR games | |
Maurer et al. | A Comparative Study on Optimizing Virtual Reality Experience through User Representations in Industry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) | ||
HB | Alteration of name in register |
Owner name: KLITSNER INDUSTRIAL DESIGN, LLC Free format text: FORMER NAME WAS: HASBRO, INC. |