WO2006065382A2 - Multi-user touch-responsive entertainment device - Google Patents
- Publication number
- WO2006065382A2 (PCT application PCT/US2005/039805)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- touch
- display surface
- user
- entertainment device
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F7/00—Indoor games using small moving playing bodies, e.g. balls, discs or blocks
- A63F7/06—Games simulating outdoor ball games, e.g. hockey or football
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/843—Special adaptations for executing a specific game genre or game mode involving concurrently two or more players on the same game device, e.g. requiring the use of a plurality of controllers or of a specific view of game data for each player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F7/00—Indoor games using small moving playing bodies, e.g. balls, discs or blocks
- A63F7/22—Accessories; Details
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/215—Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F7/00—Indoor games using small moving playing bodies, e.g. balls, discs or blocks
- A63F7/22—Accessories; Details
- A63F7/36—Constructional details not covered by groups A63F7/24 - A63F7/34, i.e. constructional details of rolling boards, rims or play tables, e.g. frame, game boards, guide tracks
- A63F2007/3655—Collapsible, foldable or rollable parts
- A63F2007/3659—Collapsible, foldable or rollable parts convertible into a suitcase
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/24—Electric games; Games using electronic circuits not otherwise provided for
- A63F2009/2448—Output devices
- A63F2009/245—Output devices visual
- A63F2009/2457—Display screens, e.g. monitors, video displays
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/24—Electric games; Games using electronic circuits not otherwise provided for
- A63F2009/2448—Output devices
- A63F2009/245—Output devices visual
- A63F2009/2457—Display screens, e.g. monitors, video displays
- A63F2009/2458—LCD's
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
- A63F2300/1075—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1081—Input via voice recognition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6063—Methods for processing data by generating or executing the game program for sound processing
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8088—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game involving concurrently several players in a non-networked game, e.g. on the same game console
Definitions
- the present invention relates to a multi-user entertainment device that displays visual images and generates audio in coordinated response to touch and other user interaction.
- Interactive electronic game devices have evolved into somewhat complex systems that provide audio and visual output in a variety of forms. These devices can provide entertainment to children as well as serve as a learning tool.
- an entertainment device comprising a housing or base unit that supports a display surface.
- a display generating device displays visual images on the display surface.
- the display generating device may be a projection-type display device or a display panel device.
- a touch/proximity sensing device detects positions of a user's appendage on the display surface in the course of a game or activity. Users may also interact with the entertainment device with at least one input controller device that comprises buttons, directional pad devices, etc.
- a control unit is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface and/or generate accompanying audio in the form of game sounds, music, etc.
- the control unit controls the display generating device to rotate the displayed visual images so that they are properly aligned with the other user position.
- the control unit adjusts how it interprets touch position detections made by the touch/proximity sensing device during such a transition from one user position to another.
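The adjustment described above implies that raw touch coordinates must be rotated inversely to the displayed image, so that game logic keeps operating in a single reference frame regardless of which user position is active. A minimal Python sketch (the patent does not specify an implementation; the function name and the row/column grid convention are assumptions):

```python
def rotate_touch(x, y, width, height, quarter_turns):
    """Map a raw touch coordinate into the active user's frame.

    The displayed image is rotated by quarter_turns * 90 degrees for
    the active user position, so raw sensor coordinates are rotated
    correspondingly before being handed to the game logic.
    """
    for _ in range(quarter_turns % 4):
        # one clockwise quarter turn of the coordinate frame
        x, y = y, width - 1 - x
        # the sensor grid's effective dimensions swap each turn
        width, height = height, width
    return x, y
```

Note the dimension swap inside the loop: for a rectangular display surface the grid is `width × height` in one orientation and `height × width` after a quarter turn, so four turns return any coordinate to its original value.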
- FIG. 1 illustrates a perspective view of an entertainment device in accordance with an embodiment of the present invention, with the entertainment device in a first, projection position.
- FIG. 2A illustrates a perspective view of the entertainment device of FIG. 1, with the entertainment device in an initial stage of transition to a second, storage position.
- FIG. 2B illustrates a perspective view of the entertainment device of FIG. 1, with the entertainment device completely transformed into the second, storage position.
- FIG. 2C illustrates a top perspective view of the entertainment device of FIG. 1, with the entertainment device in the second, storage position.
- FIG. 3 is a block diagram of the electrical components of the entertainment device shown in FIGs. 1 and 2.
- FIG. 4 is a cross-sectional view of an embodiment of a display generating device for use with an entertainment device in accordance with the present invention.
- FIGs. 5 and 6 are schematic representations of alternative forms of display generating devices useful with an entertainment device according to the invention.
- FIG. 7 is a fragmentary view of part of a corner of the touch sensing device underlying the display surface according to one embodiment of the device shown in FIGs. 1 and 2.
- FIG. 8 is a top plan view of a schematic of the cross-point sensor array for the touch sensing device shown in FIG. 7.
- FIG. 9 is a schematic view of part of the display surface overlying the touch sensing device shown in FIGs. 7 and 8.
- FIGs. 10-11 are diagrammatic sectional views of one sensor in the sensor array shown in FIG. 8, with and without contact by a user's finger/appendage.
- FIG. 12 is an electrical block diagram of the circuitry used in the touch-sensing device shown in FIGs. 7-12.
- FIGs. 13A and 13B depict a flow chart for touch position response algorithms that interpret the operative position of a user's touch.
- FIG. 14 is a schematic diagram of an integrated display monitor/touch panel that may be used in the entertainment device shown in FIGs. 1 and 2 according to an embodiment of the invention.
- FIG. 15 is a plan view of a display surface of the entertainment device and showing how it changes the orientation of the displayed visual image according to which of multiple users is active in a game or activity.
- FIG. 16 is a flow chart for the display re-orientation algorithm depicted in FIG. 15.
- The Entertainment Device Generally
- a game/entertainment/creativity device may include a display generating device that displays visual images on a display surface.
- the display generating device may be a projection-type display device or a display panel device.
- a touch/proximity sensing device detects positions of a user appendage on the display surface in the course of a game or activity. Users may also interact with the entertainment device with at least one input controller device that comprises buttons, directional pad devices, etc.
- a control unit is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface and/or generate accompanying audio.
- FIG. 1 shows a perspective view of an entertainment device 1000 in accordance with an embodiment of the present invention.
- the entertainment device 1000 of the present invention may include a housing or base unit 1100.
- the housing 1100 may include first, second, third, and fourth sides 1102, 1104, 1106, and 1108, respectively.
- the housing 1100 of the entertainment device 1000 of the present invention may also include a touch-sensitive display surface 1110 received thereon.
- the housing 1100 may include a control unit (e.g., a microprocessor - not shown) housed therein.
- the housing or base unit 1100 of the entertainment device 1000 of the present invention may also include one or more input controllers, two of which are shown at reference numerals 1120(1) and 1120(2).
- the input controllers 1120(1), 1120(2) may be removably or non-removably received in one or more recesses or bays 1150 included in the housing 1100 of the entertainment device 1000.
- Although only two input controllers 1120(1), 1120(2) are shown in FIG. 1, any number of input controllers may be utilized without departing from the scope of the present invention (see input controller 1120(3) in FIG. 2).
- the at least one input controller 1120(1), 1120(2), 1120(3) may be operably coupled to the control unit via any known method, including a wired coupling at the corresponding bay 1150 or a wireless coupling to electronics in the base unit 1100.
- the input controllers 1120(1), 1120(2), 1120(3) may comprise a conventional video game controller. Additionally, the input controllers 1120(1), 1120(2), 1120(3) may comprise any type of user-manipulable electronic input device (e.g., a directional pad, steering wheel, joystick, touchpad, dancepad, or motion-sensitive implement (an implement such as a bat, racquet, paddle, etc. housing a motion sensor)) without departing from the scope of the present invention. For example, the input controllers 1120(1), 1120(2), 1120(3) may include a directional pad 1120a(1), 1120a(2) and several buttons 1120b(1), 1120b(2), respectively.
- One or more of the bays 1150 may be "universal" in that they may also accept and operably connect to other accessory devices such as microphones, electric musical instruments (e.g., a keyboard), and other user-manipulable electronic input devices such as dancepads, joysticks, steering wheels, etc.
- the entertainment device 1000 in accordance with an embodiment of the present invention may also include a display portion 1130.
- the display portion 1130 of this embodiment includes a display housing 1180 with a display generating device 1175 housed therein. As shown in FIG. 1, the display generating device 1175 is supported above the touch-sensitive display surface 1110. Thus, the display generating device 1175 is positioned in a first, projection position. In this first, projection position, the display generating device 1175 is spaced apart from the touch-sensitive display surface 1110. In one embodiment, the display generating device 1175, in its first, projection position, may be positioned approximately 18 inches above the touch-sensitive display surface 1110.
- the display generating device 1175 of the entertainment device 1000 of the present invention is supported above the touch-sensitive display surface 1110 of the housing 1100.
- the display generating device 1175 is supported above the touch-sensitive display surface 1110 by support portions or arm members 1132, 1134, 1136, and 1138.
- the support portions 1132, 1134, 1136, and 1138 are configured to move from a projection position (illustrated in FIG. 1) to a storage/folded position (illustrated in FIG. 2).
- the support portions (arms) 1132, 1134, 1136, and 1138 may comprise a plurality of generally rigid, arcuate tubes capable of supporting the display housing 1180 above the touch-sensitive display surface 1110.
- the arm pair formed by arms 1134, 1136 and the arm pair formed by arms 1132, 1138 attach at their distal ends to display mount or clamp members 1160 and 1162, respectively.
- the clamp members 1160 and 1162 hold the display housing 1180 in position over the touch-sensitive display surface 1110.
- the display housing 1180 may be removed from the clamp members 1160 and 1162 when the arms 1132, 1134, 1136 and 1138 are rotated to their storage positions.
- the display housing 1180 has connectors (not shown) for data, control, and power that mate with complementary connectors (not shown) on one or both of the clamp members 1160, 1162, which are in turn electrically connected to the system controller electronics in the base unit 1100.
- the support portions 1132, 1134, 1136, and 1138 and the display generating device 1175 may also be rotated into a different position convenient for storage (see FIG. 2).
- the support arms 1132, 1134, 1136, and 1138 may be hollow metal tubes that act as heat dissipating "fins" to help conduct heat away from the display generating device 1175, e.g., a projector head.
- the program cartridge 1192 may contain a read only memory (ROM) storage device that stores one or more computer or microprocessor programs that provide game/entertainment content to the control unit and the display generating device 1175.
- the housing 1100 of the entertainment device 1000 of the present invention includes indentations or grooves 1140 configured to permit the support portions 1132, 1134, 1136, and 1138 to be moved/adjusted from the projection position to the storage/folded position. Furthermore, the housing 1100 of the entertainment device 1000 of the present invention may incorporate an audio output generating device (e.g., a speaker or speakers (for stereo sound)). Finally, the housing 1100 of the entertainment device 1000 of the present invention may incorporate a removable media storage/playback unit (e.g., a CD, DVD, ROM cartridge - not illustrated) operably coupled to the control unit. The removable media storage/playback unit may be configured to provide additional game/entertainment content to the control unit and the display generating device 1175.
- FIG. 2A illustrates the entertainment device 1000 in an initial stage of transition to the second, storage position.
- the display housing 1180 has a cylindrical portion 1182 that is recessed or has a smaller diameter as compared to the remainder of the housing 1180.
- the arms 1132, 1134, 1136 and 1138 are rotated outward.
- the display housing 1180 can be disconnected from the data, control, and power connectors in one or both of the clamp members 1160, 1162 and physically removed from the clamp members 1160 and 1162 as shown in FIG. 2A.
- the input controllers 1120(1), 1120(2)... may be disconnected from the connections at their respective bays 1150 when putting the entertainment device 1000 into its storage position.
- FIG. 2A also shows a storage cover 2000 that is used for covering the entertainment device 1000 in its storage position.
- the storage cover 2000 comprises a generally planar body portion 2010 having a raised section 2100 in which a storage recess 2110 is formed for storing the display housing 1180.
- the storage cover 2000 further includes lateral portions 2120 and 2122 on the sides of the body portion 2010. Each lateral portion 2120 and 2122 has two additional structural storage features. First, lateral portions 2120 and 2122 have curved bottom surfaces 2130 and 2132, respectively, which mate with and snap-fit to complementary surfaces on the base unit 1100.
- the lateral portions 2120 and 2122 have curved recesses 2140 and 2142 on their top surfaces that receive and mate with and snap-fit to portions of the clamp members 1160 and 1162, respectively.
- the storage cover 2000 also has a handle 2200 extended from the main body portion 2010.
- In FIGs. 2B and 2C, the entertainment device 1000 is shown fully folded into the second, storage position 1200, with the entertainment device 1000 and the storage cover 2000 locked together as a single integrated unit.
- the main body portion 2010 of the storage cover 2000 covers and protects the touch-sensitive display surface 1110 of the entertainment device 1000, as well as the controller bays 1150 at each user position.
- the clamp members 1160 and 1162 are shown snapped to the curved surfaces 2140 and 2142 of the storage cover 2000.
- a recess cover 2300 may be provided that removeably snaps over the storage recess 2110 to contain the display housing 1180 therein.
- In FIG. 3, an electrical system block diagram is shown for the entertainment device 1000.
- the electrical system of the device 1000 comprises a system controller 3000 that is connected to the various sub-systems.
- the system controller 3000 may be a commercially available microprocessor device, such as those sold by Sharp Electronics.
- In the touch-sensitive surface subsystem, touch surface sensing circuitry 3100 is connected to a touch surface controller 3200.
- the touch surface sensing circuitry 3100 is responsive to touch-related conditions on the display surface 1110 (FIGs. 1 and 2).
- the electrical system of the device 1000 may include an external memory subsystem including a flash memory device 3300 and a larger working memory, such as a DRAM device 3310.
- the external memory sub-system and its memory devices are useful for storing game parameters, such as user scores, user preferences, etc., for a game.
- These external memory devices 3300, 3310 may also be used by the controller 3000 when executing one or more processes associated with a particular game or activity stored in the ROM cartridge 1192.
- the electrical system of the device 1000 of the present invention may include a projection sub-system 3400 that includes the display generating device 1175 (FIGs. 1 and 2).
- the components of the display generating device 1175 may vary with the type of projection technology used.
- the projection sub-system 3400 comprises a light source 3410, an image display panel or portion 3420, and a serializer-deserializer (SERDES) 3430.
- the electrical system includes a stereo coder/decoder (CODEC) 3500 that is connected to the system controller 3000.
- the CODEC 3500 is responsive to commands and data received from the system controller 3000 to produce sound in the form of music, speech, or other sound that is synchronized to the data representing the displayed visual images produced by the display generating device 1175.
- Audio output may be produced by left and right speakers 3510 and 3520, as well as headphone ports 3530 and 3540 connected to the CODEC 3500.
- the electrical system of the device 1000 may also include a program cartridge interface 3600 that communicates data stored on a ROM cartridge 1192 to the system controller 3000.
- the electrical system of the device 1000 may utilize a memory card interface 3700 connected to the system controller 3000.
- the memory card interface 3700 may support one or more of a variety of memory card formats, including MultiMedia™ memory card, SmartMedia™, and CompactFlash™.
- the accessory block shown at reference numeral 3650 may include controllers/interfaces for devices such as an audio system, television, compact disk (CD) or digital video disk (DVD) player, or other accessory devices such as musical instruments (e.g., a keyboard), optical devices (cameras), and other user-manipulable electronic input devices such as dancepads, joysticks, steering wheels, etc.
- the electrical system of the device 1000 provides for communication between control buttons 1170 (e.g., power, volume, brightness, mode, reset, etc.) and the system controller 3000. Additionally, a power source (AC or DC, not shown) is utilized to power all of the components of the device 1000.
- the system controller 3000 coordinates displayed image data with the positions of a player's hand or finger on the touch-sensitive surface as gathered by the touch surface controller 3200. In so doing, the system controller 3000, based on instructions contained in a particular ROM cartridge 1192, will generate image display output and/or audio output, and change its interaction to another player using the device 1000.
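The coordination performed by the system controller might be structured as a simple polling loop that turns touch detections into display and audio updates and hands play to the next user. The following Python sketch is purely illustrative; every class, method, and attribute name is hypothetical, as the patent does not disclose software details:

```python
def run_game_loop(touch_controller, display, codec, game):
    """Hypothetical main loop: poll touch positions, let the game
    program (e.g., from the ROM cartridge) interpret them, then push
    image and audio updates through the display and CODEC."""
    while game.active:
        touch = touch_controller.poll()    # (x, y) position or None
        if touch is None:
            continue                       # no contact this cycle
        event = game.handle_touch(*touch)  # game-specific response
        display.update(event.image)        # alter the displayed image
        codec.play(event.sound)            # synchronized game sound
        if event.turn_over:
            game.next_player()             # hand off to the next user
```

In a real embodiment the touch controller would likely be interrupt-driven rather than polled, but the data flow (touch in, image and audio out, then player hand-off) matches the description above.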
- Display Generating Device
- FIG. 4 illustrates a cross-sectional view of an embodiment of a display generating device 1175 for use with an entertainment device 1000 in accordance with the present invention.
- the display generating device 1175 may be an image projection system that projects a visual image on the touch-sensitive display surface 1110.
- FIG. 4 illustrates an example of a projection type display generating device 1175. If such a projection system is utilized, the display generating device 1175 may include a first lens 310, an image display portion 3420, an illumination source 3410, and a second lens 340.
- a first lens 310 operable within the display generating device 1175 of the present invention is a projection- type lens.
- the image display portion 3420 operable within the display generating device 1175 of the present invention may comprise a transmissive LCD panel and the second lens 340 may comprise a condensing lens.
- the illumination source 3410 may comprise any known light source such as an incandescent source, a KPR bulb, a halogen source (e.g., Xenon), or a light-emitting diode (LED) such as an ultra-bright white LED.
- light from the illumination source 3410 passes through the second lens 340 (condensing lens) and into the rear surface of the image display portion 3420, such as a transmissive TFT LCD panel.
- the image on the image display portion 3420 is then transmitted to and through the first lens 310 (projection-type lens), where the image is projected from the first lens 310 and onto the touch-sensitive display surface 1110.
- the display generating device 1175 may also comprise a focus adjustment mechanism (not shown) to more clearly focus the image onto the touch-sensitive display surface 1110.
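As an illustrative focus calculation (assuming a simple thin-lens model; the patent gives no optical parameters beyond the roughly 18-inch projection height), the focus adjustment effectively tunes the panel-to-lens distance for a given lens-to-surface throw:

```python
def panel_distance_mm(focal_length_mm, throw_mm):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the
    image-panel distance d_o given the lens-to-surface throw d_i.
    Both the 40 mm focal length and the throw used below are
    hypothetical values chosen only for illustration."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / throw_mm)

# a hypothetical 40 mm projection lens at an ~18 in (457 mm) throw
# places the LCD panel a little under 44 mm behind the lens
d_o = panel_distance_mm(40, 457)
```

The magnification is then `throw / d_o` (roughly 10x under these assumed numbers), which is why a small transmissive panel can fill a tabletop display surface.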
- FIG. 5 illustrates a schematic representation of an alternative embodiment of a display generating device 1175 for use with an entertainment device 1000 in accordance with the present invention.
- a super-bright LED light source 3410 is condensed via condensing lens pair 340A and 340B onto the rear surface of image panel 3420, which in this example is a transmissive liquid crystal on silicon (LCOS) panel.
- the light and image then passes through a projection lens 310 onto the touch-sensitive display surface 1110.
- the touch-sensitive display surface 1110 utilized in the present invention may comprise a white, metallized surface.
- FIG. 6 illustrates a schematic representation of an additional alternative embodiment of a display generating device 1175 for use with an entertainment device 1000 in accordance with the present invention.
- the super-bright LED light source 3410 is condensed onto the front surface of a reflective LCOS panel 3425 via a mirror 360.
- the lighted image reflected off of the LCOS panel 3425 passes through a projection lens 310 onto the touch-sensitive display surface 1110, which again may be a white metallized surface.
- any type of display generating system may be used without departing from the spirit and scope of the present invention.
- a rear projection system could be utilized.
- display generating systems such as an LCD panel, plasma display panel, or a digital light processing (DLP) device could be utilized to perform the function of the display generating device 1175.
- Still other image generating technologies that are useful in connection with the device 1000 are a high temperature polysilicon panel (HTPS) and a MEMS reflective display device.
- the system controller 3000 (FIG. 3) calibrates the display generating device 1175 to ensure that the image projected onto the touch-sensitive display surface 1110 is sized and oriented to match the corresponding touch-sensitive surface underlying the display surface 1110, to achieve the desired interaction with a game or activity executed by the system controller 3000. This may involve mechanical adjustment of the projection system (focus, etc.) and/or the use of extra "border pixels" in the projected image to shift the image on the display surface 1110 into alignment with the underlying touch-sensitive surface.
- a projection type display generating device may be rotated from its normal projection position so as to project images onto a wall or other surface, rather than onto the display surface 1110. This feature may be useful in the event a user wishes to view images or watch a video presentation from a DVD, CD, etc.
- the touch-sensitive display surface 1110 may include a vellum projection screen. More specifically, the vellum projection screen may be a vacuum-metallized vellum screen with a mirrored back portion for improved reflectivity. As referenced above, the touch-sensitive display surface 1110 need not include a projection-type screen (a projector and a separate screen), and may comprise additional appropriate integrated touch-sensitive display surfaces such as an LCD or plasma touch panel display, as described hereinafter in connection with FIG. 14.

Touch-Sensitive Surface Sub-System
- Referring to FIGs. 7-13B, an example of a touch-sensitive screen technology useful in connection with the entertainment device 1000 according to the invention will be described.
- This touch-sensitive screen technology is described in co-pending U.S. Patent Publication No. 2004/0043371 A1, published March 4, 2004, corresponding to U.S. Patent Application No. 10/448,582, filed May 30, 2003, entitled "Interactive Multi-Sensory Reading System Electronic Teaching/Learning Device," the entirety of which is incorporated herein by reference.
- This touch screen technology also is used in the publicly available Fisher-Price PowerTouchTM Learning System.
- a sensor array 142 located directly beneath a plastic spacer 515 forming recess surface 130.
- the plastic spacer 515 or the recess surface 130 may serve as the display surface 1110 for the image generated by the display generating device 1175 if a projection system is used.
- Spaced beneath sensor array or matrix 142 is an electrically conductive metal plate 510.
- the sensor array or matrix 142 may include two sets of generally parallel, individual separate and separated conductive lines arranged as a plurality of spaced apart, column or vertical conductive lines (also referred to as vertical grid lines) 248 and a plurality of spaced apart, row or horizontal conductive lines or traces (also referred to as horizontal grid lines) 246 transverse and preferably perpendicular to the plurality of column conductive lines 248.
- the sets of lines 246, 248 are referred to as "rows" and "columns" for convenience: "rows" run east-west (left-right) while "columns" are perpendicular (or otherwise transverse) to such "rows," running north-south (up-down), but the nomenclature could be reversed.
- the set of column conductive lines 248 and the set of row conductive lines 246 are separated by an electrically-insulative spacer, for example a Mylar plastic sheet.
- the row and column conductive lines 246, 248 are printed in conductive inks on opposite sides of the Mylar sheet to provide electrical isolation between the sets and form the sensor matrix 142.
- the sensor matrix 142 includes sixteen rows 246 and sixteen columns 248 of the conductive lines or traces; however, different numbers of either or both could be utilized. Each point where a row 246 and a column 248 cross creates a single individual "cross-point" sensor. The sixteen-by-sixteen line array therefore creates two hundred and fifty-six individual cross-point sensors.
- FIG. 9 depicts schematically part of the display surface 1110 that overlies the sensor array 142 of the device 1000, with the word "BALL" projected and displayed on the display surface 1110. Also, shown in phantom is the outline of a user's hand, primarily the user's thumb, being placed on the touch-sensitive display surface 1110.
- the operation of the entertainment device 1000 allows a user to select any active area on the display surface 1110 by touching or simply placing a finger, thumb, etc., sufficiently closely to the selected area. Upon selection of this active area in this manner, the system controller 3000 of the entertainment device 1000 may generate and output a certain audible message or visual display responsive to this selection.
- the system controller 3000 of the entertainment device 1000 may produce a spoken audio output "BALL" and the displayed graphical representation of a ball 9000 may change color.
- the audible message and video output is generated in direct response to the user touching the displayed word "BALL" on the touch-sensitive display surface 1110. Different audible messages and video output would be generated if the user touched other areas of the touch-sensitive display surface 1110. Touching the ball graphic on the display surface could produce the sound of a bouncing ball (and/or the image of a bouncing ball).
- each word and/or image displayed on the touch-sensitive display surface 1110 may be mapped to one or more x and y coordinate pairs of the sensor array 142. For instance, the word "BALL" is located at Row 5, Column 4 and Row 5, Column 5 of the sensor array 142. This map location is stored in memory (e.g., the memory devices 3300, 3310 of FIG. 3) along with the associated audible message that is played when either cross-point sensor location is selected.
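The word-to-coordinate mapping described above can be sketched as a small lookup table keyed by cross-point locations of the sensor array 142. The data structure, function names and asset references below are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical hot-spot map: each displayed word or image is associated
# with one or more (row, column) cross-point pairs of the 16x16 sensor
# array, plus the audible/visual response played on selection.
hot_spots = {
    "BALL": {
        "cells": [(5, 4), (5, 5)],   # Row 5, Columns 4 and 5, per the text
        "audio": "ball.wav",         # assumed asset name
        "visual": "highlight_ball",  # assumed effect name
    },
}

def lookup_touch(row, col):
    """Return (name, hot-spot entry) mapped to a touched cross-point, or None."""
    for name, spot in hot_spots.items():
        if (row, col) in spot["cells"]:
            return name, spot
    return None
```

On a touch at Row 5, Column 4, the controller would find the "BALL" entry and play its stored audible message, matching the behavior described for FIG. 9.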
- FIGs. 10-11 show examples of three cross-sections of the sensor array 142 without and with an overlying display surface 1110.
- FIGs. 10-12 show a plastic spacer 515, a plurality of the spaced apart column (vertical) traces 248, the non-conductive (e.g. Mylar) sheet 525 and one of the spaced apart row (horizontal) traces 246 transverse to the plurality of column traces 248.
- the non-conductive sheet 525 supports and separates the column traces 248 from the row traces 246.
- the conductive plane 510, in the form of a metal plate, is connected to system ground and is parallel to and spaced away from the sensor array 142.
- the plastic spacer 515 which forms the recess surface 130 may be approximately 0.080" thick and is placed on top of the array 142 to act as an insulator so that a touch surface of a sensor is separated from the matrix 142 by at least this amount.
- the spacer 515 may be styrene or ABS with a dielectric constant between about 2 and 3, although the thickness and dielectric constant can be adjusted to achieve the desired sensitivity.
- the function of the spacer 515 is to provide a stable response from the matrix 142 (when touched by finger/appendage 505).
- the width and thickness of the column traces 248 (vertical columns) and row traces 246 (horizontal rows) should be kept to a minimum at the cross-points to reduce the capacitive effect at each of the cross-points but are preferably increased between the cross-points and around the cross-points, for example, by widening the individual row and column traces into four pointed stars or diagonal squares or the like around and between the cross-point locations.
- the conductive plane 510 is spaced approximately one-quarter inch (5 mm) below the matrix 142.
- the conductive plane 510 provides shielding for the matrix 142 and as a result, affects the area sensed around each cross-point in the matrix 142.
- the individual traces 246, 248 are extended to side and bottom edges of the sheet 525 supporting the traces.
- shorter traces 530 and 535 are extended from the side and bottom edges, respectively, of the sheet 525, one shorter trace 530 or 535 on either side of each sensor trace 246 or 248, respectively (see FlG. 8).
- the shorter traces 530 and 535 are all connected to system ground through or with the conductive plane 510.
- the horizontal traces 530 extend inwardly from the vertical edge to just beyond where the row traces 246 widen out to form terminals and, with a uniform length, provide some impedance control.
- the vertical traces 535 extend from the bottom edge up to a point where the vertical traces 248 begin to run parallel, just below where those traces are flared and to within about one-half inch (12 mm) of the lowest cross-points. Traces 535 prevent cross coupling between the column traces 248 when the columns are being driven by an oscillator.
- baseline or reference values of signals generated by the sensor matrix 142 are read and stored without human interaction with the arrays to obtain a reference value for each cross-point.
- the reference value of each cross-point sensor is individually determined and updated.
- each is a running "average" of successive scan values (e.g., approximately sixteen) for the cross-point. Successive scans are compared to the reference values to determine the proximity of a human finger or other extremity. Data may be accumulated starting at zero when the device 1000 is powered on.
- FIG. 12 is an electrical block diagram of the touch surface sensing circuitry 3100 and the touch surface controller 3200.
- the touch surface controller 3200 is a dedicated microprocessor controller such as the publicly available Sunplus SPL 130A microprocessor.
- the touch surface sensing circuitry 3100 comprises a column driver circuit 254, a row select circuit 258, and a synchronous detector, multiplexer and filter circuit 260 that processes the raw sensor signals and passes the processed signals to an analog-to-digital converter 262 for digitization.
- the functions of the touch surface controller 3200 might be performed by the device system controller 3000 (FIG. 3).
- the touch surface sensing circuitry 3100 further comprises the cross-point matrix or sensor array 142 and a signal oscillator 252, which powers the sensor array 142 and controls the detector 260.
- Operation of the touch surface sensing circuitry 3100 is as follows (and is illustrated in FIGs. 13A and 13B).
- Firmware associated with touch surface controller 3200 directs the column driver circuit 254 to pass an RF excitation signal, for example, a 250 kHz, 3300 millivolt square wave signal, from the signal oscillator 252 to column traces 248 of the sensor array 142.
- the firmware also directs the row select circuit 258 to generate appropriate control signals sent to the row sensor circuit (not shown) to connect a row trace 246 in the sensor array 142 to the synchronous detector, multiplexer and filter circuit 260.
- the touch surface controller 3200 further controls the transfer of data from the synchronous detector, multiplexer and filter circuit 260, which generates a DC level analog voltage signal through A/D converter 262.
- the row traces 246 may be scanned bottom to top while the column traces 248 are driven innermost to outermost.
- the sensor array 142 is cyclically and continually scanned, and the results for each cross-point sensor are compared with the stored reference values, which are themselves cyclically and continuously updated. If any individual cross-point sensor value has a differential from its reference value that is greater than a predetermined or threshold amount ("threshold”), the touch surface controller 3200 will mark the point as "touched” or "selected".
- the threshold is set at less than 200 millivolts, between about 190 and 200 millivolts, for each cross-point sensor. If the measured voltage value for the cross-point being sensed is less than the reference value in memory by an amount equal to or greater than the threshold amount, the point is considered touched and is "marked" as such by the touch surface controller 3200. If the difference is less than the threshold, the reference value is updated each 64 millisecond period (full scan time), resulting in a settling of the reference values after about one second. After the sensor array 142 is scanned, cross-points that have been "marked" as touched for two scan cycles are considered valid and selected for further processing by a "best candidate" algorithm, as will be described.
- each cross-point data value is initially compared to a "High Limit" value. If the data value exceeds this High Limit value, it is ignored as a candidate for that scan and ignored for updating the reference value for that sensor.
- the purpose of the High Limit value is to prevent abnormally high data values from causing a cross-point sensor to appear permanently pressed.
- each time the data value associated with a cross-point sensor is read, it is compared against the reference value, which may be thought of and is herein referred to as a "Running Average" associated with that cross-point sensor. If the data value is less than the Running Average minus the threshold, the cross-point sensor is considered "touched" for that scan.
- the threshold is the fixed data value mentioned above (i.e., 190 to 200 millivolts) that represents the minimum deflection expected to indicate that a cross-point sensor is touched.
- at power-on, the Running Average for each point is set to zero; thereafter, each data value read is used to update the Running Average for that point.
- the formula used to compute the new Running Average is as follows:
- New Running Average = Running Average + (data value − Running Average)/16.
- the preferred "running average” is not truly an average but rather a convergence algorithm.
- the reference value/running average algorithm can be fooled by situations where high levels of interference exist and the cross-point sensor readings climb significantly. Without the High Limit cut-off, abnormally high data values (due to a continuous noise source) could eventually result in an abnormally high Running Average for a given cross-point sensor. Then, when the scanned data values return to their nominal value range, if the data values being scanned are low enough such that the data values are greater than the abnormally high Running Average minus the threshold, the cross-point sensor will be considered touched.
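The threshold comparison, High Limit cut-off and convergence update described above can be sketched per cross-point as follows. The THRESHOLD value reflects the 190-200 millivolt range given in the text; HIGH_LIMIT is an assumed illustrative value, and the rule that touched points leave their reference value unchanged is inferred from the description:

```python
THRESHOLD = 195      # millivolts, within the 190-200 mV range stated above
HIGH_LIMIT = 3000    # millivolts; assumed cut-off for abnormal readings

def process_sample(running_avg, data_value):
    """Return (touched, new_running_avg) for one scan of one cross-point."""
    if data_value > HIGH_LIMIT:
        # Abnormally high reading: ignore it as a touch candidate and do
        # not let it inflate the reference value (the High Limit rule).
        return False, running_avg
    if data_value < running_avg - THRESHOLD:
        # Deflection exceeds the threshold: mark as touched; a touched
        # point does not update its reference value.
        return True, running_avg
    # Convergence update: New = Old + (data - Old) / 16
    return False, running_avg + (data_value - running_avg) / 16.0
```

Because each untouched sample moves the reference only one-sixteenth of the way toward the new reading, references settle over roughly a second of 64 millisecond scans, as the text describes.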
- the touched points are processed to identify a "best candidate".
- the best candidate is the cross-point sensor selected by the touch surface controller 3200 as being the point most likely to have been selected by the user in touching the sensor.
- it is the touched point which is highest (most northern/Top) or the highest and most left (i.e. most northwestern/Top Left) if two potential candidates of equal height are activated on the sensor array 142.
- the cross-point sensor preferably must be "touched” for two consecutive 64 millisecond scans to be considered as the new most northwestern point of the sensor.
- the touch surface controller 3200 first identifies a set of touched sensors. It next identifies those which have been touched for at least two consecutive 64 millisecond cycles. These are the new most northwestern candidate sensors. Once the best candidate has been chosen, its identification/location is communicated from the touch surface controller 3200 to the system controller 3000.
- a "Southern Lockout” algorithm takes effect for the sensor array 142.
- the Southern Lockout algorithm causes any point of the same array touched in subsequent scans below the new most northwestern point to be ignored until the earlier of one second expiration while the new most northwestern point remains selected, or the new most northwestern point is released. After the lockout, all cross-points of the array become candidates for new most northwestern point. This algorithm covers the situation where the user rests the heel of the pointing hand on the array after finger touching the array (as a young child may be prone to do).
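The Southern Lockout behavior above can be sketched as a filter on each scan's touched points. The function shape and parameter names are assumptions for illustration; only the one-second window and the north/south ordering come from the text:

```python
LOCKOUT_SECONDS = 1.0  # lockout expires after about one second, per the text

def filter_touches(touches, accepted_point, elapsed, released):
    """Drop touches south of the accepted point while the lockout holds.

    touches: list of (row, col) touched this scan; rows increase southward.
    accepted_point: the current most northwestern point, or None.
    elapsed: seconds since the point was accepted.
    released: True once the accepted point has been released.
    """
    lockout_active = (accepted_point is not None
                      and not released
                      and elapsed < LOCKOUT_SECONDS)
    if not lockout_active:
        # Lockout over: all cross-points are candidates again.
        return touches
    accepted_row, _ = accepted_point
    # Ignore anything south of (below) the accepted point, e.g. the heel
    # of a child's hand resting on the array.
    return [(r, c) for (r, c) in touches if r <= accepted_row]
```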
- a "Peak Search” algorithm may be employed after a new most northwestern point of the sensor array 142 is identified.
- the deflections of the cross-point sensors immediately East (right), South (below) and Southeast (below right) of the new most northwestern point sensor are examined for touch, and the relative deflections of any touched sensors among the four are compared to one another.
- the one sensor of those up to four sensors having the greatest deflection (i.e., change from reference value/Running Average) is selected as the "Best Candidate" and its identity/location/position is passed to the main (base unit) system controller 3000.
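The northwestern-point selection and Peak Search steps above can be sketched together. The data structures (a set of touched points and a per-point deflection map) are illustrative assumptions:

```python
def best_candidate(touched, deflection):
    """Pick the best candidate cross-point, per the described algorithm.

    touched: set of (row, col) validated touched points; rows increase
    southward and columns increase eastward, so the tuple minimum is the
    most northern, then most western, touched point.
    deflection: dict (row, col) -> deflection from the Running Average.
    """
    if not touched:
        return None
    r, c = min(touched)  # most northwestern touched point
    # Peak Search: compare the point with its East, South and Southeast
    # neighbours and keep whichever touched one deflects the most.
    neighbours = [(r, c), (r, c + 1), (r + 1, c), (r + 1, c + 1)]
    candidates = [p for p in neighbours if p in touched]
    return max(candidates, key=lambda p: deflection.get(p, 0))
```

For example, if the point just south of the most northwestern touch shows a larger deflection, Peak Search reports that southern point as the Best Candidate.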
- the device 1000 will determine if there are multiple hands placed on the touch surface. In the event that the system controller 3000 sees two hands placed on the sensor, it will look to see if either input is a clearly defined most northern point. If so, it will select that input as the best candidate. Instead of having to generate an audio output directing the user to use "one finger at a time" (or any other appropriate statement) when the device 1000 cannot determine the likely input with reasonable accuracy, this technique can select a "best candidate" based on the above-described algorithm.
- touch-sensitive surface or position detection technologies may be utilized without departing from the scope of the present invention.
- analog resistive or capacitive touch panels may be used, as may digital camera CCD technology, so-called gesture recognition technology, heat-sensitive, color-sensitive, pattern-sensing, object-sensing, or any other contextual sensing technology based on electro-physical material properties, photo-reflective properties or photo-absorption properties.
- Still another alternative is to use an LCD monitor with an integrated touch panel.
- This alternative embodiment is shown in FIG. 14, where the integrated LCD monitor/touch panel is shown at reference numeral 4000.
- This integrated LCD monitor/touch panel would replace both the display generating device 1175 and the display surface 1110 (illustrated in FIG. 1).
- the monitor/touch panel 4000 has its own touch surface sensing circuitry 3100 and touch surface controller 3200 that are in turn connected to the system controller 3000.
- the system controller 3000 responds to touch position information supplied to it by the touch surface controller 3200 and also generates display image data that is supplied (through the appropriate intervening display driver circuitry) to the monitor/touch panel 4000.
- Numerous models of integrated display monitor/touch panels are known in the art and may be used in accordance with the present invention.
- Examples of other types of touch or proximity sensing technologies that may be used with the entertainment device 1000 of the present invention include pressure-sensitive switch matrices such as a Mylar® switch matrix, proximity sensing antenna arrays and proximity sensing capacitive arrays. Some examples of additional appropriate touch-sensitive display surfaces are LCD or plasma touch panel displays.
- an image generated by the display generating device 1175 is displayed on the touch-sensitive display surface 1110.
- the touch-sensitive display surface 1110 and the at least one input controller 1120(1), 1120(2) are both operably coupled to the system controller 3000.
- the system controller 3000 is responsive to signals from the touch-sensitive display surface 1110 and the at least one input controller 1120(1), 1120(2) to alter the visual image displayed on the touch-sensitive display surface 1110 in response to a user's interaction with the touch-sensitive display surface 1110 and/or the at least one input controller 1120(1), 1120(2).
- a user of the entertainment device 1000 in accordance with the present invention can use either the touch-sensitive display surface 1110, the at least one input controller 1120(1), 1120(2), or both the touch-sensitive display surface 1110 and the at least one input controller 1120(1), 1120(2) to provide input to the system controller 3000 to alter the image (move portions of, change, re-orient, etc. - as opposed to merely a brightness control or an on/off control) displayed on the touch-sensitive display surface 1110.
- Referring to FIGs. 15 and 16, another feature of the entertainment device 1000 will be described.
- the device 1000 is intended for use by multiple players, and in the examples described herein there are four play positions at each of four sides 1102, 1104, 1106 and 1108 of the generally rectangular base housing 1100. Consequently, in the course of a game or activity, it is necessary to transition from one player to another player, but the players are positioned at different orientations with respect to the touch-sensitive display surface 1110. Therefore, the system controller 3000 needs to recognize this and re-orient the image displayed by the display generating device 1175 (projected or displayed on a display panel) and also reorient how it responds to touch commands on the display surface 1110 with respect to the reoriented images.
- the system controller 3000 controls the display generating device 1175 to rotate the displayed visual images so that they are properly aligned with the new currently active user position. In addition, the system controller 3000 adjusts how it interprets touch position detections made by the touch/proximity sensing device during such a transition from one user position to another user position.
- Step 5010 shows normal execution of a game or activity with the current player.
- the game or activity involves interaction with Player 1.
- a message is displayed for Player 1 involving a selection after which the game or activity moves to Player 2.
- a displayed message to Player 1 in this example is:
- the system controller 3000 detects that Player 1's turn is over. This corresponds to step 5020, in which the system controller 3000 detects a player transition event.
- the system controller 3000 determines the next player according to rules of the game or activity. For example, Player 2 may be the next player.
- in step 5040, the system controller 3000 reorients (e.g., rotates) the image data to be displayed so that it is aligned properly for the new player, for example by rotating the image data 90 degrees to the right on the touch-sensitive display surface 1110 so that it appears oriented for Player 2.
- the system controller 3000 also, in step 5050, re-orients how it responds to touch surface commands according to the reoriented displayed image. For example, to respond to touch or proximity detected commands from Player 2, the system controller 3000 adjusts (rotates) by 90 degrees to the right those positions that are "hot” and cause a certain action in response to the touch position signals it receives from the touch surface controller 3200, representing touch positions of Player 2.
- A partial view of the conductive wires associated with the (M x N) sensor array 142 is shown in FIG. 15 to depict how the system controller 3000 is programmed to re-orient how it responds to touches from a user, since the sensor array 142 is fixed but the relative positions on the sensor array where a user may touch to cause a particular action (visual and/or audible) with respect to a displayed visual image differ for each user position 1102, 1104, 1106, and 1108.
- This re-orientation process is repeated when transitioning from Player 2 to any other player. It should be understood that if only 2 or 3 players are active in a particular game or activity, the system controller would know to re-orient the image display data and touch position responsiveness accordingly. Moreover, this re-orientation process can be applied to an entertainment device that has fewer than 4 or more than 4 player positions, such that the re-orientation is not a simple 90 degree adjustment.
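The coordinate re-orientation for the fixed sensor array can be sketched as a quarter-turn mapping. The array size matches the sixteen-by-sixteen matrix described earlier; the rotation direction is an assumed convention, and the function name is illustrative:

```python
N = 16  # cross-points per side of the square sensor array 142

def rotate_touch(row, col, quarter_turns):
    """Map a raw sensor (row, col) into the active player's frame.

    quarter_turns counts 90-degree steps between player positions, so
    four turns (a full revolution) returns the original coordinates.
    """
    for _ in range(quarter_turns % 4):
        row, col = col, N - 1 - row  # one 90-degree rotation
    return row, col
```

The same quarter-turn count would be applied both to the displayed image data and to the "hot" touch positions, keeping the two aligned for the new player.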
- This re-orientation process 5000 applies for a display generating device that is a display panel or monitor 4000 as well as an image projection system 1175. In the case of an image projection system 1175 (such as in FIG. 1), it is possible that the image can be re-orientated by adjusting one or more optical devices in the projected image path. However, it may be more desirable to re-orient the raw image data prior to its projection.
- the following are generic examples of games or activities that may be played on the entertainment device 1000 of the present invention.
- the instructions, scripts, programs for these games or activities may be embodied in a removable memory cartridge device 1192, as described above.
- the games or activities are software programs containing digital data for animated characters accompanied by voice, music, and other graphical elements.
- the games/activities may involve sequential, interactive, narrative stories containing puzzles, activities or games interwoven as challenges to provide a progressive, rewarding experience for the players.
- One type of game is an animated adventure game where one or more animated characters are displayed and the character(s) negotiate a variety of activities, such as an underwater amusement park.
- Each player may select a particular character and negotiate a simulated displayed game board, for example, collecting certain items in order to win the game.
- a player's character may progress on the displayed game board using an electronic or virtual roll of the dice, for example.
- the player may be prompted, through visual and audio stimulus, to engage in a particular activity in order to earn a particular item or "ticket" award that counts towards winning the game.
- a player may accumulate tickets in order to redeem them for certain animated or displayed items.
- a player may engage in these so-called mini-games or activities (including educational or learning activities) using the input controllers or the touch-sensitive display screen. These mini-games may be distributed randomly throughout the game board each time a new game is started.
- Portions of the visual display proximate each player's position at the entertainment device may be dedicated to tracking each player's digital scorecard concerning their progress in the game.
- the scorecard may show a player's character and which items the player has collected.
- the items that can be purchased may be used by a player during play of the game (e.g., a rolling bonus, a time bonus, etc.), while others may be used against opponents (lose a turn, etc.).
- the game may also include sudden appearance of certain animated characters that give bonus tickets to certain players, for example, or play special side games or activities.
- Another type of game may involve a digital book consisting of a combination of a traditional storybook, a children's activity book and web-type flash games.
- a player or user becomes part of the adventure, helping the animated characters complete certain challenges and reach their goals.
- Each so-called "page" of the storybook includes a full-screen combination of artwork, a story line, object identification and animated "hot spots".
- the story is read to the user, or as an animated character speaks, the accompanying text will appear on-screen and highlight.
- Each phrase or sentence will highlight individually as those words are also heard as voiceover.
- several objects are tagged as "identifiable hotspots.” When a child touches one of these objects, that word or phrase is said aloud.
- Certain areas and objects on the pages are tagged for special animations, so that if a child touches that area, the name of that object is said aloud, and an animation or other reward will be revealed.
- certain pages of the storybook may contain mini-games, activities or challenges (including educational or learning activities) related to the storyline.
- the game and activity examples described above highlight the necessity for reorienting the displayed visual image according to which player is active in a game.
- the animated characters may be intended for a particular player. Consequently, the device needs to keep track of players' turns in the game, and re-orient certain displayed visual images to that player whose turn it currently is.
- if the game calls for detecting a touch or proximity command from a player, the device also re-orients which positions on the sensor array it needs to respond to for the currently active player.
- the housing 1100 of the entertainment device 1000 of the present invention may include headphone jacks for a user's convenience.
- the entertainment device 1000 of the present invention may include a gesture recognition system, including a camera and/or sensors to sense, model, and react to a user's hand motions, adding an extra dimension of interactivity to the entertainment device 1000.
- the entertainment device 1000 of the present invention may include a rechargeable power source.
- the entertainment device 1000 of the present invention may also include night vision goggles, a magnifying glass, or a special optical device that would allow a user to reveal secret codes, cards, letters, or other information displayed in certain wavelengths of light on the touch-sensitive display surface 1110.
- the entertainment device 1000 of the present invention may include deluxe input controllers which include all of the features of the at least one input controller 1120(1), 1120(2), 1120(3), 1120(4) and also may include an onboard display screen displaying individual user messages (e.g., things like scrabble letters, hidden game clues, etc.).
- the entertainment device 1000 of the present invention may include a memory unit (removable or non-removable) for storing game/player related information (such as high scores, etc.).
- the housing 1100 of the entertainment device 1000 of the present invention may include light sources to identify which user is in control of the entertainment device 1000 (i.e., which user's turn it is to control the entertainment device 1000).
Priority Applications (5)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| MX2007005057A | 2004-11-05 | 2005-11-04 | Multi-user touch-responsive entertainment device |
| AU2005317037A | 2004-11-05 | 2005-11-04 | Multi-user touch-responsive entertainment device |
| CA002586386A | 2004-11-05 | 2005-11-04 | Multi-user touch-responsive entertainment device |
| JP2007539340A | 2004-11-05 | 2005-11-04 | Multi-user touch sensitive entertainment device |
| EP05851329A | 2004-11-05 | 2005-11-04 | Multi-user touch-responsive entertainment device |

Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date |
| --- | --- | --- |
| US62510804P | 2004-11-05 | 2004-11-05 |
| US60/625,108 | 2004-11-05 | |
Publications (2)

| Publication Number | Publication Date |
| --- | --- |
| WO2006065382A2 | 2006-06-22 |
| WO2006065382A3 | 2006-09-08 |
Family
ID=36028413
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2005/039805 WO2006065382A2 (en) | 2004-11-05 | 2005-11-04 | Multi-user touch-responsive entertainment device |
Country Status (8)
Country | Link |
---|---|
US (1) | US20060183545A1 (en) |
EP (1) | EP1807162A2 (en) |
JP (1) | JP2008518686A (en) |
CN (1) | CN101052446A (en) |
AU (1) | AU2005317037A1 (en) |
CA (1) | CA2586386A1 (en) |
MX (1) | MX2007005057A (en) |
WO (1) | WO2006065382A2 (en) |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4388004B2 (en) * | 2005-10-04 | 2009-12-24 | 株式会社スクウェア・エニックス | Image generating apparatus and method, program, and recording medium |
US20070260765A1 (en) * | 2006-04-05 | 2007-11-08 | Rita Cooper | Apparatus and system for displaying an image in conjunction with a removable memory cartridge |
US20080293470A1 (en) * | 2007-05-23 | 2008-11-27 | Ian Douglas Proud | Electronic outdoor game apparatus |
EP2257969B1 (en) * | 2008-02-28 | 2017-12-20 | 3M Innovative Properties Company | Methods of patterning a conductor on a substrate |
US20100045609A1 (en) * | 2008-08-20 | 2010-02-25 | International Business Machines Corporation | Method for automatically configuring an interactive device based on orientation of a user relative to the device |
AU2010200110A1 (en) * | 2009-02-06 | 2010-08-26 | Aristocrat Technologies Australia Pty Limited | A gaming system and a method of gaming |
US8581856B2 (en) * | 2009-05-27 | 2013-11-12 | Microsoft Corporation | Touch sensitive display apparatus using sensor input |
US8070552B2 (en) * | 2009-09-28 | 2011-12-06 | Mattel, Inc. | Repositionable infant entertainment device |
US8118680B2 (en) * | 2010-01-08 | 2012-02-21 | Ami Entertainment Network, Inc. | Multi-touchscreen module for amusement device |
US9390578B2 (en) | 2010-01-08 | 2016-07-12 | Ami Entertainment Network, Llc | Multi-touchscreen module for amusement device |
US20120007808A1 (en) | 2010-07-08 | 2012-01-12 | Disney Enterprises, Inc. | Interactive game pieces using touch screen devices for toy play |
US9274641B2 (en) | 2010-07-08 | 2016-03-01 | Disney Enterprises, Inc. | Game pieces for use with touch screen devices and related methods |
US8540572B2 (en) | 2011-10-19 | 2013-09-24 | Brad Kaldahl | Video game controller for multiple users |
US8523674B2 (en) | 2011-10-19 | 2013-09-03 | Brad Kaldahl | Video game controller for multiple users |
US8740707B1 (en) | 2011-10-19 | 2014-06-03 | Brad Kaldahl | Video game controller for multiple users |
TWI498771B (en) * | 2012-07-06 | 2015-09-01 | Pixart Imaging Inc | Gesture recognition system and glasses with gesture recognition function |
KR102131257B1 (en) | 2013-07-02 | 2020-07-07 | 삼성전자주식회사 | Electronic device and method for controlling multi- window in the electronic device |
US9207827B1 (en) * | 2014-10-14 | 2015-12-08 | Disney Enterprises, Inc. | Multi-touch surface extension using conductive traces and pads |
GB2559296A (en) * | 2015-08-25 | 2018-08-01 | Bernard Davis George | Presenting interactive content |
CN105338231A (en) * | 2015-12-02 | 2016-02-17 | 深耘(上海)电子科技有限公司 | Digital photographing device |
CN105457273B (en) * | 2015-12-30 | 2023-05-02 | 生迪智慧科技有限公司 | LED lighting device and system |
CN106178546A (en) * | 2016-09-25 | 2016-12-07 | 依云智酷(北京)科技有限公司 | A kind of intelligent toy projecting touch-control |
US10687023B1 (en) * | 2017-08-14 | 2020-06-16 | Visualimits, Llc | Gaming table events detecting and processing |
LU100922B1 (en) * | 2018-09-10 | 2020-03-10 | Hella Saturnus Slovenija D O O | A system and a method for entertaining players outside of a vehicle |
CN113727353A (en) * | 2021-08-27 | 2021-11-30 | 广州艾美网络科技有限公司 | Configuration method and device of entertainment equipment and entertainment equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3857022A (en) * | 1973-11-15 | 1974-12-24 | Integrated Sciences Corp | Graphic input device |
US20030123032A1 (en) * | 1999-03-03 | 2003-07-03 | 3M Innovative Properties Company | Compact integrated front projection system |
US20030137639A1 (en) * | 2002-01-21 | 2003-07-24 | Liang-Ta Yang | Projection device |
WO2003102895A1 (en) * | 2002-05-30 | 2003-12-11 | Mattel, Inc. | Interactive multi-sensory reading system electronic teaching/learning device |
US20040070625A1 (en) * | 2000-09-08 | 2004-04-15 | Albert Palombo | Multiuser electronic platform-screen, in particular for games, and method for controlling clerance for executing programmes such as games |
Family Cites Families (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3609695A (en) * | 1968-05-10 | 1971-09-28 | Honeywell Inc | Display-entry data terminal |
US3973773A (en) * | 1972-01-24 | 1976-08-10 | Marvin Glass & Associates | Game apparatus |
US4353552A (en) * | 1979-02-23 | 1982-10-12 | Peptek, Incorporated | Touch panel system and method |
US4371164A (en) * | 1980-02-19 | 1983-02-01 | Bally Manufacturing Corporation | Projected gaming method and apparatus |
US4421313A (en) * | 1981-12-15 | 1983-12-20 | Mattel, Inc. | Display system having multiple viewing stations |
JPH079530B2 (en) * | 1985-07-31 | 1995-02-01 | 株式会社リコー | Overhead Projector |
US4921343A (en) * | 1985-09-05 | 1990-05-01 | Fuji Photo Film Co., Ltd. | Overhead projector |
US4976429A (en) * | 1988-12-07 | 1990-12-11 | Dietmar Nagel | Hand-held video game image-projecting and control apparatus |
US4976438A (en) * | 1989-03-14 | 1990-12-11 | Namco Ltd. | Multi-player type video game playing system |
US4978218A (en) * | 1990-03-01 | 1990-12-18 | Minnesota Mining And Manufacturing Company | Folding arm for overhead projector |
JPH04298731A (en) * | 1990-12-03 | 1992-10-22 | Sony Corp | Portable projector |
JP2535395Y2 (en) * | 1991-09-04 | 1997-05-14 | 株式会社センテクリエイションズ | Projection toys |
US6141000A (en) * | 1991-10-21 | 2000-10-31 | Smart Technologies Inc. | Projection display system with touch sensing on screen, computer assisted alignment correction and network conferencing |
US5736975A (en) * | 1996-02-02 | 1998-04-07 | Interactive Sales System | Interactive video display |
NL1004407C2 (en) * | 1996-11-01 | 1998-05-08 | Adar Golad | Computer game. |
US6061177A (en) * | 1996-12-19 | 2000-05-09 | Fujimoto; Kenneth Noboru | Integrated computer display and graphical input apparatus and method |
US5826962A (en) * | 1997-04-30 | 1998-10-27 | Minnesota Mining And Manufacturing Company | LCD integrated/overhead projector |
US5951015A (en) * | 1997-06-10 | 1999-09-14 | Eastman Kodak Company | Interactive arcade game apparatus |
JPH1178369A (en) * | 1997-09-03 | 1999-03-23 | Plus Kogyo Kk | Display system |
JP3794180B2 (en) * | 1997-11-11 | 2006-07-05 | セイコーエプソン株式会社 | Coordinate input system and coordinate input device |
US6247994B1 (en) * | 1998-02-11 | 2001-06-19 | Rokenbok Toy Company | System and method for communicating with and controlling toy accessories |
TW523627B (en) * | 1998-07-14 | 2003-03-11 | Hitachi Ltd | Liquid crystal display device |
US6612701B2 (en) * | 2001-08-20 | 2003-09-02 | Optical Products Development Corporation | Image enhancement in a real image projection system, using on-axis reflectors, at least one of which is aspheric in shape |
US6598976B2 (en) * | 2001-09-05 | 2003-07-29 | Optical Products Development Corp. | Method and apparatus for image enhancement and aberration corrections in a small real image projection system, using an off-axis reflector, neutral density window, and an aspheric corrected surface of revolution |
US6425668B1 (en) * | 2000-08-14 | 2002-07-30 | Dan Jacob | Tented art projector |
US7327376B2 (en) * | 2000-08-29 | 2008-02-05 | Mitsubishi Electric Research Laboratories, Inc. | Multi-user collaborative graphical user interfaces |
US6761634B1 (en) * | 2001-06-07 | 2004-07-13 | Hasbro, Inc. | Arcade table |
US6478432B1 (en) * | 2001-07-13 | 2002-11-12 | Chad D. Dyner | Dynamically generated interactive real imaging device |
US20030035086A1 (en) * | 2001-08-20 | 2003-02-20 | Robinson Douglas L. | Real image projection device incorporating e-mail register |
US6733140B2 (en) * | 2002-04-19 | 2004-05-11 | Optical Products Development Corp. | Method of ghost reduction and transmission enhancement for a real image projection system |
US6791575B2 (en) * | 2001-09-25 | 2004-09-14 | Wu Li Investments | Apparatus for providing an electronic display with selectable viewing orientations |
US6729739B2 (en) * | 2001-10-23 | 2004-05-04 | Radica Games, Ltd. | Folding fluorescent illumination system |
JP2003152851A (en) * | 2001-11-14 | 2003-05-23 | Nec Corp | Portable terminal |
US7553236B2 (en) * | 2002-02-13 | 2009-06-30 | Parra Anthony C | Casino gaming station |
US7283126B2 (en) * | 2002-06-12 | 2007-10-16 | Smart Technologies Inc. | System and method for providing gesture suggestions to enhance interpretation of user input |
US7775883B2 (en) * | 2002-11-05 | 2010-08-17 | Disney Enterprises, Inc. | Video actuated interactive environment |
DE10260305A1 (en) * | 2002-12-20 | 2004-07-15 | Siemens Ag | HMI setup with an optical touch screen |
US20050012909A1 (en) * | 2003-02-21 | 2005-01-20 | Kokin Daniel E. | Projection system with flexible orientation |
TW200527110A (en) * | 2003-10-20 | 2005-08-16 | Johnson Res And Dev Co Inc | Portable multimedia projection system |
US7018053B2 (en) * | 2003-10-23 | 2006-03-28 | Hewlett-Packard Development Company, L.P. | Projector |
JP2005208991A (en) * | 2004-01-23 | 2005-08-04 | Canon Inc | Position information output device and signal processing method |
US7535481B2 (en) * | 2004-06-28 | 2009-05-19 | Microsoft Corporation | Orienting information presented to users located at different sides of a display surface |
AU2005282887B2 (en) * | 2004-09-01 | 2012-03-01 | Igt | Gaming system having multiple gaming devices that share a multi-outcome display |
- 2005
- 2005-11-04 WO PCT/US2005/039805 patent/WO2006065382A2/en active Application Filing
- 2005-11-04 CA CA002586386A patent/CA2586386A1/en not_active Abandoned
- 2005-11-04 JP JP2007539340A patent/JP2008518686A/en not_active Withdrawn
- 2005-11-04 CN CNA2005800377844A patent/CN101052446A/en active Pending
- 2005-11-04 MX MX2007005057A patent/MX2007005057A/en not_active Application Discontinuation
- 2005-11-04 AU AU2005317037A patent/AU2005317037A1/en not_active Abandoned
- 2005-11-04 EP EP05851329A patent/EP1807162A2/en not_active Withdrawn
- 2005-11-04 US US11/266,593 patent/US20060183545A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3857022A (en) * | 1973-11-15 | 1974-12-24 | Integrated Sciences Corp | Graphic input device |
US20030123032A1 (en) * | 1999-03-03 | 2003-07-03 | 3M Innovative Properties Company | Compact integrated front projection system |
US20040070625A1 (en) * | 2000-09-08 | 2004-04-15 | Albert Palombo | Multiuser electronic platform-screen, in particular for games, and method for controlling clerance for executing programmes such as games |
US20030137639A1 (en) * | 2002-01-21 | 2003-07-24 | Liang-Ta Yang | Projection device |
WO2003102895A1 (en) * | 2002-05-30 | 2003-12-11 | Mattel, Inc. | Interactive multi-sensory reading system electronic teaching/learning device |
Also Published As
Publication number | Publication date |
---|---|
US20060183545A1 (en) | 2006-08-17 |
JP2008518686A (en) | 2008-06-05 |
WO2006065382A3 (en) | 2006-09-08 |
AU2005317037A1 (en) | 2006-06-22 |
EP1807162A2 (en) | 2007-07-18 |
MX2007005057A (en) | 2007-06-19 |
CA2586386A1 (en) | 2006-06-22 |
CN101052446A (en) | 2007-10-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060183545A1 (en) | Multi-user touch-responsive entertainment device | |
US6761634B1 (en) | Arcade table | |
US7907128B2 (en) | Interaction between objects and a virtual environment display | |
JP5226960B2 (en) | GAME DEVICE, VIRTUAL CAMERA CONTROL METHOD, PROGRAM, AND RECORDING MEDIUM | |
WO2005107911A1 (en) | Entertainment apparatus and operating method thereof | |
US20120007840A1 (en) | Processor-controlled object | |
WO2015113358A1 (en) | System and method for operating computer program with physical objects | |
JP5226961B2 (en) | GAME DEVICE, CHARACTER AND VIRTUAL CAMERA CONTROL METHOD, PROGRAM, AND RECORDING MEDIUM | |
US20080146303A1 (en) | Game for moving an object on a screen in response to movement of an operation article | |
US20160162040A1 (en) | System and method for operating a computer program with physical objects | |
EP2004299A2 (en) | Interactive playmat | |
US8517837B2 (en) | Control unit for a video games console provided with a tactile screen | |
EP2240250A1 (en) | Object, method and system for transmitting information to a user | |
TW201241682A (en) | Multi-functional position sensing device | |
JP6115746B2 (en) | Game device | |
JP5220272B2 (en) | GAME DEVICE AND GAME PROGRAM | |
US20070252327A1 (en) | Stepped Position Specifying Apparatus, Stepping Type Exercise Apparatus, Stepped Position Specifying Method and Exercising Support Method | |
EP3638386B1 (en) | Board game system and method | |
JP3824462B2 (en) | GAME DEVICE AND INFORMATION STORAGE MEDIUM | |
JP3815938B2 (en) | GAME DEVICE AND INFORMATION STORAGE MEDIUM | |
KR100347837B1 (en) | Apparatus of rythm and dance game machine and foot key | |
JP2017176470A (en) | Game device | |
JP2001017738A (en) | Game device | |
WO2005107884A1 (en) | Stepped position specifying apparatus, stepping type exercise apparatus, stepped position specifying method and exercising support method | |
JP3374484B2 (en) | Video game equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005851329 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005317037 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007539340 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/a/2007/005057 Country of ref document: MX |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580037784.4 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2586386 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2005317037 Country of ref document: AU Date of ref document: 20051104 Kind code of ref document: A |
|
WWP | Wipo information: published in national office |
Ref document number: 2005851329 Country of ref document: EP |