US20230115736A1 - Interactive environment with virtual environment space scanning - Google Patents
Interactive environment with virtual environment space scanning
- Publication number
- US20230115736A1 (U.S. patent application Ser. No. 18/080,678)
- Authority
- US
- United States
- Prior art keywords
- image
- dimensional space
- virtual
- environment image
- detecting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/04—Dice; Dice-boxes; Mechanical dice-throwing devices
- A63F9/0468—Electronic dice; electronic dice simulators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F1/00—Card games
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F1/00—Card games
- A63F2001/008—Card games adapted for being playable on a screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/24—Electric games; Games using electronic circuits not otherwise provided for
- A63F2009/2401—Detail of input, input devices
- A63F2009/2411—Input form cards, tapes, discs
- A63F2009/2419—Optical
- A63F2009/2425—Scanners, e.g. for scanning regular characters
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/24—Electric games; Games using electronic circuits not otherwise provided for
- A63F2009/2448—Output devices
- A63F2009/245—Output devices visual
- A63F2009/2461—Projection of a two-dimensional real image
- A63F2009/2463—Projection of a two-dimensional real image on a screen, e.g. using a video projector
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2250/00—Miscellaneous game characteristics
- A63F2250/30—Miscellaneous game characteristics with a three-dimensional image
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/69—Involving elements of the real world in the game world, e.g. measurement in live races, real video
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/04—Dice; Dice-boxes; Mechanical dice-throwing devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the '617 Application is a continuation-in-part of U.S. patent application Ser. No. 14/462,750, filed on Aug. 19, 2014 and titled PROJECTION OF INTERACTIVE ENVIRONMENT (“the '750 Application”), now U.S. Pat. No. 9,550,124, issued Jan. 24, 2017.
- the '750 Application is a continuation of U.S. patent application Ser. No. 13/547,626, filed on Jul. 12, 2012 and titled PROJECTION OF INTERACTIVE GAME ENVIRONMENT (“the '626 Application”), now U.S. Pat. No. 8,808,089, issued Aug. 19, 2014.
- the '626 Application is a continuation-in-part of U.S. patent application Ser.
- Embodiments described herein relate to the projection of an image and to interaction with the projected image.
- the image may be a two-dimensional image, or it may be three-dimensional.
- Data is received that represents virtual objects that are spatially positioned in virtual environment space.
- An image is then projected to display a visual representation of all or a portion of the virtual environment space, including one or more of the virtual objects within the virtual environment space.
- the system may then detect user interaction with the locations of one or more of the virtual objects in the virtual environment space, as seen in the visual representation of the virtual environment space displayed by the image that has been projected and, in response thereto, change the image that is projected. That interaction may be via an input device, or even more directly via interaction with the projected image.
- the user might interact with a virtual object within the virtual environment space, or with a physical object (e.g., a game piece, a game board, etc.) that is within the virtual environment space visually represented by the image.
- a user may interact with visualized representations of virtual environment space, enabling complex and interesting interactivity scenarios and applications.
- Such a system, which may also be referred to herein as a “projection system,” may be capable of modifying an image based on that interaction, and may be capable of projecting the modified image to display the result(s) of such interaction.
- FIG. 1 abstractly illustrates an embodiment of a distributed system that includes an embodiment of an interactive projection system
- FIG. 2 abstractly illustrates an interactive image projection system that represents an embodiment of the interactive image projection system of FIG. 1 ;
- FIG. 3 illustrates an example embodiment of a virtual environment space that includes virtual objects
- FIG. 4 abstractly illustrates an image generation system with which the interactive image projection system may operate
- FIG. 5 abstractly illustrates an embodiment of an input device of the embodiment of distributed system of FIG. 1 ;
- FIG. 6 illustrates a specific embodiment of an input device
- FIG. 7 illustrates another specific embodiment of an input device
- FIG. 8 is a flowchart of a method for projecting and detecting interaction with a projected image
- FIGS. 9 and 10 illustrate specific embodiments of interactive image projection systems
- FIG. 11 illustrates a computing system architecture in which the principles described herein may be employed in at least some embodiments.
- the principles described herein relate to the projection of an image to an interactive environment.
- the image may be two-dimensional or it may be three-dimensional.
- the image may include one or more virtual objects that are spatially positioned within a virtual environment space.
- the image is projected to provide a visual representation of all or a portion of the virtual environment space, including a visual representation of one or more of the virtual objects.
- the interactive image projection system may then detect user interaction with the image and, thus, with the visual representation of the virtual environment space and, in response to any detected interaction, change the image, and perhaps cause a change (e.g., a temporary change, a permanent change, etc.) to a state of a program or an application corresponding to the image that has been projected (e.g., for which the image provides a graphical user interface (GUI), etc.).
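The detect-and-respond behavior described above can be sketched as a small state update: a detected interaction near a virtual object's location changes program state, which in turn signals that a new image should be prepared and projected. The data structures, field names, and hit radius below are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class InteractionEvent:
    # Where the detected interaction occurred, in virtual-environment coordinates.
    x: float
    y: float

@dataclass
class AppState:
    # Hypothetical program state behind the projected image:
    # virtual-object positions, and which object (if any) is selected.
    objects: Dict[str, Tuple[float, float]] = field(default_factory=dict)
    selected: Optional[str] = None

def handle_interaction(state: AppState, event: InteractionEvent,
                       radius: float = 1.0) -> bool:
    """Select the first virtual object whose position lies within the hit
    radius of the interaction point. Returns True when the state changed,
    signalling that the image should be regenerated and re-projected."""
    for name, (ox, oy) in state.objects.items():
        if (ox - event.x) ** 2 + (oy - event.y) ** 2 <= radius ** 2:
            changed = state.selected != name
            state.selected = name
            return changed
    return False
```

In this sketch the return value plays the role of the "change the image" trigger: a caller would re-run image preparation and projection whenever it is True.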
- FIG. 1 abstractly illustrates a distributed system 100 .
- the distributed system 100 includes an interactive image projection system 101 .
- the interactive image projection system 101 projects an image 111 .
- a user may interact with the image 111 that has been projected.
- the image 111 may be projected onto a surface.
- the surface may be opaque or translucent (e.g., frosted glass, etc.).
- the surface may be a substantially horizontal surface, in which case the image 111 may be projected, at least in part, downward onto the surface.
- the substantially horizontal surface may be a table top, a counter top, a floor, a game board, or any other surface that is oriented substantially horizontally.
- a “substantially horizontal” surface may be any surface that is within 30 degrees of horizontal.
- a “more precisely horizontal” surface may be any surface that is within 5 degrees of horizontal.
- the surface may be a substantially vertical surface.
- a “substantially vertical” surface may be any surface that is within 30 degrees of vertical.
- a “more precisely vertical” surface may be any surface that is within 5 degrees of vertical.
- the surface may be oriented obliquely (i.e., at a non-parallel, non-perpendicular angle to the horizontal).
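The orientation thresholds defined above lend themselves to a simple classifier. The function below is an illustrative sketch; the convention of measuring tilt from the horizontal plane, and the "oblique" label for the in-between case, are assumptions.

```python
def classify_surface(tilt_deg: float) -> str:
    """Classify a surface by its tilt from the horizontal plane
    (0 = flat, 90 = upright), using the thresholds stated above:
    within 30 degrees of horizontal or vertical is "substantially" so,
    and within 5 degrees is "more precisely" so."""
    tilt = abs(tilt_deg) % 180
    tilt = min(tilt, 180 - tilt)  # fold into the range [0, 90]
    if tilt <= 5:
        return "more precisely horizontal"
    if tilt <= 30:
        return "substantially horizontal"
    if tilt >= 85:
        return "more precisely vertical"
    if tilt >= 60:
        return "substantially vertical"
    return "oblique"
```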
- the image 111 may be projected onto a more complex surface.
- the surface onto which the image 111 is projected may include one or more substantially horizontal surfaces, one or more substantially vertical surfaces, and/or one or more obliquely oriented surfaces.
- the complex surface might include, as a substantially horizontal surface, all or part of a surface of a floor, a table, a game board, or the like, and, as a substantially vertical surface, all or part of a wall, a projection screen, a white board, or the like.
- Other examples of complex surfaces may include textured surfaces, curved surfaces, and surfaces with nonplanar topographies.
- the image 111 represents an interactive environment space in which one or more users may interact with the image 111 or features thereof (e.g., virtual objects, etc.). One or more users may interact with the image 111 manually, with physical objects, and/or with one or more input devices.
- the image 111 might be projected to a collaborative area, a work area, or any other type of interactive area.
- the distributed system 100 is often described as being a game or as being used in conjunction with a game. In those cases, each user would be a player, and the interactive environment space to which the image 111 is projected would be an interactive play space.
- the principles described herein may apply to any environment in which one or more users interact with a projected image.
- Because FIG. 1 is abstract, the interactive image projection system 101 and the image 111 are depicted only as abstract representations. Subsequent figures will illustrate more specific embodiments of the interactive image projection system 101 and the image 111 .
- the distributed system 100 may also include surrounding control devices, which are also referred to herein as “input devices.”
- There are eight input devices 102 A-H illustrated in FIG. 1 , although the ellipses 102 I represent that a distributed system 100 may include fewer than eight input devices 102 A-H or more than eight input devices 102 A-H.
- the input devices 102 A-H are represented abstractly as rectangles, although each will have a particular concrete form depending on its function and design. Example forms of input devices 102 A-H are described in further detail below. In the context of a game, for example, the input devices 102 A-H may be player consoles. However, the inclusion of one or more input devices 102 A-H in a distributed system 100 is optional.
- each user may instead provide input through direct, physical interaction with a three-dimensional space adjacent to a location to which the image 111 is projected.
- Such direct interaction may be provided, for example, with a hand and one or more fingers, by manipulating physical objects (e.g., game pieces, etc.) positioned in relation to the image 111 , or, perhaps, by rolling dice or playing cards in association with the image 111 .
- the interactive image projection system 101 is capable of responding to multiple simultaneous instances of users interacting with a location to which the image 111 is projected.
- input into the distributed system 100 may be achieved using one or more input devices 102 A-H and/or by direct interaction with the interactive environment image 111 .
- a user may affect the state of the image 111 and/or of a program or an application associated with the image 111 (e.g., for which the image provides a GUI).
- one, some, or even all of the input devices 102 A-H are wireless.
- the wireless input device 102 A-H may communicate wirelessly with the interactive image projection system 101 .
- One or even some of the input devices 102 A-H may be located remotely from the image 111 .
- Such remotely located game input device(s) 102 A-H may communicate with the interactive image projection system 101 over a Wide Area Network (WAN), such as the Internet. That would enable a user to interact with the image 111 remotely, even if that user is not located in proximity to the image 111 .
- a father or mother stationed overseas might play a child's favorite board game with their child before going to bed.
- game input devices 102 A-H may be local (e.g., in the same room, etc.) to the interactive image projection system 101 .
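The patent does not prescribe a wire format for communication between remote input devices and the interactive image projection system; as one hedged illustration, an input event could be serialized as JSON for transport over a WAN connection. All field names below are hypothetical.

```python
import json

def encode_input(device_id: str, action: str, payload: dict) -> bytes:
    """Serialize a remote input-device event for transport over a
    network link. The schema here is illustrative only."""
    return json.dumps({"device": device_id,
                       "action": action,
                       "payload": payload}).encode("utf-8")

def decode_input(raw: bytes) -> dict:
    """Recover the event dictionary on the receiving side."""
    return json.loads(raw.decode("utf-8"))
```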
- FIG. 2 abstractly illustrates an interactive image projection system 200 that represents an embodiment of the interactive image projection system 101 of FIG. 1 .
- the interactive image projection system 200 is illustrated as including an output channel 210 that projects an image (e.g., the image 111 of FIG. 1 ).
- the output channel 210 performs several functions, including image preparation and projection. Image preparation is performed by an image preparation mechanism 211 , and projection of the image 111 is performed by projector(s) 212 A, 212 B, with one projector 212 A being depicted and the ellipses 212 B representing one or more optional additional projectors in the output channel 210 of the interactive image projection system 200 .
- the image preparation mechanism 211 receives an input image 201 and supplies an output image 202 in response to receiving the input image 201 .
- the input image 201 may be provided by any image generator.
- the input image 201 might be provided by a video game console, a rendering program (whether two dimensional or three-dimensional), or any other module, component or software, that is capable of generating an image.
- the input image 201 represents one or more virtual objects that are positioned in a virtual environment space.
- the virtual environment space may represent a battleground with specific terrain.
- the battleground is represented in a computer, and need not represent any actual battleground.
- Other examples of virtual environment space might include a three-dimensional representation of the surface of the Moon, a representation of a helium atom, a representation of a crater of a fictional planet, a representation of a fictional spacecraft, a representation of outer space, a representation of a fictional subterranean cave network, and so forth.
- the virtual environment space may be embodied by a computer program or an application.
- Virtual objects may be placed in the virtual environment space by a computer program or an application, and may represent any object, real or imagined.
- a virtual object might represent a soldier, a tank, a building, a fictional anti-gravity machine, or any other possible object, real or imagined.
- FIG. 3 illustrates an example of a virtual environment space 300 .
- the virtual environment space 300 includes virtual objects 301 , 302 , 303 , and 304 .
- the virtual environment space 300 is a three-dimensional space, such that the virtual objects 301 , 302 , 303 , and 304 are represented as three-dimensional objects having specific shapes, positions, and/or orientations within the virtual environment space 300 .
- This virtual environment space 300 may be used to formulate a representation of a certain portion and/or perspective of the virtual environment space 300 .
- the output image 202 includes a visual representation of at least part of the virtual environment space 300 ; that visual representation includes at least one of the virtual objects 301 , 302 , 303 , and 304 .
- the output image 202 may provide a visual representation of at least a portion of that crater, with virtual objects 301 , 302 , 303 , and 304 that might include several crater monsters, soldiers that are members of the same team, weapons that are strewn about and ready to be picked up, and so forth.
- the visual representation might be a portion of the city and include virtual objects 301 , 302 , 303 , and 304 that comprise things like vehicles, buildings, people, and so forth.
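Formulating a visual representation of a portion and perspective of the virtual environment space can be sketched with a simple pinhole projection. The camera placement, focal length, and object positions below are assumptions for illustration; the patent does not specify how the perspective is computed.

```python
def project_point(p, focal=1.0):
    """Project a 3-D point in virtual-environment space onto a 2-D image
    plane using a pinhole model: viewpoint at the origin looking along
    +z, with an assumed focal length."""
    x, y, z = p
    if z <= 0:
        return None  # behind the viewpoint; not part of this perspective
    return (focal * x / z, focal * y / z)

# Hypothetical virtual-object positions within the virtual environment space.
objects = {"soldier": (1.0, 0.5, 2.0), "tank": (-2.0, 1.0, 4.0)}
image_points = {name: project_point(pos) for name, pos in objects.items()}
```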
- the image preparation mechanism 211 may perform any processing on the input image 201 to generate the output image 202 that is ultimately projected by the one or more projectors 212 A, 212 B.
- the image preparation mechanism 211 may simply pass the input image 201 through, such that the output image 202 is identical to the input image 201 .
- the image preparation mechanism 211 might also change the format of the input image 201 , change the resolution of the input image 201 , compress the input image 201 , decrypt the input image 201 , select only a portion of the input image 201 , or the like.
- the image preparation mechanism 211 may select which portion of the input image 201 (i.e., a “subimage”) is to be projected by each projector 212 A, 212 B, such that when the output images 202 are projected by each projector 212 A, 212 B, the collective whole of all of the output images 202 may appear as a single image at the location to which the output images 202 are projected, a process referred to herein as “stitching.”
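The stitching step described above can be sketched by computing per-projector column bounds over the full output image, with a small overlap so adjacent subimages can be blended at their edges. The function and its parameters are illustrative, not taken from the patent.

```python
def split_for_projectors(width, n_projectors, overlap=0):
    """Compute (left, right) pixel-column bounds of each projector's
    subimage so that, when projected side by side, the subimages
    collectively appear as a single stitched image. A small overlap
    gives adjacent projectors a shared band for edge blending."""
    strip = width // n_projectors
    bounds = []
    for i in range(n_projectors):
        left = max(0, i * strip - overlap)
        right = min(width, (i + 1) * strip + overlap)
        bounds.append((left, right))
    return bounds
```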
- the image preparation might also take into consideration appropriate adjustments given the surface on which the output image 202 is to be projected, or any intervening optics. For instance, if the output image 202 is to be projected onto a complex surface, the image preparation mechanism 211 may adjust the input image 201 such that the output image 202 will appear properly on the complex surface. The user might configure the image preparation mechanism 211 with information regarding the complex surface. Alternatively, or in addition, the interactive image projection system 200 may be capable of entering a discovery phase upon physical positioning, during which it identifies the characteristics of any surface onto which an output image 202 is to be projected relative to the projector(s) 212 A, 212 B.
- the image preparation mechanism 211 may take into consideration the distances to the surfaces and the angles at which the surfaces are oriented to make sure that the output image 202 appears proportional and as intended on each surface. Thus, the image preparation mechanism 211 may make appropriate geometrical adjustments to the input image 201 so that the output image 202 appears properly on each surface.
- Other examples of complex surfaces include spherical surfaces, full and partial cylindrical surfaces, surfaces that include convex portions, surfaces that include concave portions, other curved surfaces, including surfaces with repeated curvatures or other complex curvatures, and surfaces that represent a nonplanar topography (as in a complex terrain with various peaks and valleys). In cases in which the output image 202 is to pass through optics such as lens and mirrors, the image preparation mechanism 211 may consider the presence of such optics and modify the input image 201 accordingly.
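One common form of the geometrical adjustment described above is a planar homography, which maps input-image pixels to positions that land correctly on a tilted surface. The sketch below is an assumption, not the patent's method; the matrix values are illustrative only.

```python
# Illustrative sketch of a geometrical adjustment: mapping a pixel
# through a 3x3 planar homography, the standard model for projecting
# onto a flat surface viewed at an angle (keystone correction).

def apply_homography(H, x, y):
    """Map pixel (x, y) through homography H (a 3x3 nested list)."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xs / w, ys / w)

# The identity homography leaves pixels unchanged; a real system would
# derive H from the measured surface distance and orientation.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert apply_homography(I, 100, 50) == (100.0, 50.0)

# A simple keystone-style correction: compress x and y as y increases.
K = [[1, 0, 0], [0, 1, 0], [0, 0.001, 1]]
x2, y2 = apply_homography(K, 100, 50)   # approximately (95.24, 47.62)
```

Non-planar surfaces of the kinds listed above (spherical, cylindrical, complex terrain) would require a per-pixel warp rather than a single matrix, but the principle of pre-distorting the input image 201 is the same.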
- the interactive image projection system 200 may also output various signals. For instance, the interactive image projection system 200 may output audio, such as audio that corresponds to the input image 201 .
- the interactive image projection system 200 may output wired or wireless signals to the input devices 102 A-H, perhaps causing some private state to be altered at the input devices 102 A-H.
- the interactive image projection system 200 may dispatch information in a wired or wireless fashion to the central display.
- User input may be provided through interaction with an input device (such as one of the input devices 102 A-H of FIG. 1 ) and/or through direct interaction of a real object (such as a human finger, a game piece, a game board, a central display or the like) with a three-dimensional space adjacent to a location where the output image 202 is projected. If there is to be direct interaction to provide input, the interactive image projection system 200 may also include an input channel 220 .
- the input channel 220 may include a scanning mechanism 221 capable of scanning a three-dimensional space adjacent to a location where the output image 202 is projected to determine whether or not a user is interacting with a virtual object displayed by or as part of the output image 202 or with another object (e.g., a physical object, etc.) that may be used in conjunction with the output image 202 . More specifically, the scanning mechanism 221 may detect movement of a manipulating element (e.g., a physical object, such as a finger, a thumb, a hand, an object held by a user, etc.) into and within the three-dimensional space adjacent to a location where the output image 202 is projected.
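A basic building block of such scanning is testing whether a tracked point (for example, a fingertip position recovered from a depth scan) lies inside the three-dimensional interaction volume. The following sketch is an assumption for illustration; the axis-aligned-box representation and all names are hypothetical.

```python
# Illustrative sketch: testing whether a tracked point lies inside the
# three-dimensional space adjacent to the projected image, as the
# scanning mechanism might do for each detected manipulating element.

def inside_volume(point, box_min, box_max):
    """True if point (x, y, z) lies within the axis-aligned box."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

# Hypothetical interaction volume: a 1 m x 1 m footprint extending
# 30 cm above the surface on which the image is projected.
vol_min, vol_max = (0.0, 0.0, 0.0), (1.0, 1.0, 0.3)
assert inside_volume((0.5, 0.5, 0.1), vol_min, vol_max)      # fingertip inside
assert not inside_volume((0.5, 0.5, 0.5), vol_min, vol_max)  # above the volume
```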
- the output image 202 of FIG. 2 includes just two-dimensional information.
- the projector(s) 212 A, 212 B project(s) the frame of the output image 202 or each output image 202 .
- the scanning mechanism 221 may scan the area where the last frame or output image 202 was projected. This projection and scanning process is then repeated for the next frame or output image 202 , and for the subsequent frame or output image 202 , and so on.
- projection and scanning may not happen at the same time (with scanning happening between projection of sequential frames of an output image 202 or between projection of sequential output images 202 ), but they happen at such a high frequency that the output image(s) 202 may seem to have continuous motion. Furthermore, even though the frame or output image 202 may not always be present, the period of time that the frame or the output image 202 is not present may be so short, and may occur at such a frequency, that a human observer is given the illusion that the frame or output image 202 is always present. Thus, real objects may have the appearance of occupying the same space as the output image(s) 202 . Alternatively, the scanning mechanism 221 may operate while an output image 202 or a frame thereof is projected and displayed.
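The alternation described above can be sketched as a simple loop: project a frame, then scan in the gap before the next frame. This is an illustrative sketch only; the callables stand in for hardware the patent does not specify.

```python
# Illustrative sketch of interleaved projection and scanning: each
# cycle projects one frame, then scans while no frame is shown.
# `project` and `scan` are hypothetical stand-ins for the hardware.

def run_cycles(frames, project, scan):
    """Project each frame, then scan in the blanking gap after it."""
    events = []
    for frame in frames:
        events.append(project(frame))
        events.append(scan())       # occurs between sequential frames
    return events

log = run_cycles(
    ["f0", "f1"],
    project=lambda f: ("project", f),
    scan=lambda: ("scan",),
)
# log -> [("project", "f0"), ("scan",), ("project", "f1"), ("scan",)]
```

At a 120 Hz frame rate, each project-then-scan cycle must complete in roughly 8.3 ms, which is why the dark interval between frames goes unnoticed by a human observer.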
- the output image 202 of FIG. 2 may represent three-dimensional information.
- the projector(s) 212 A, 212 B may project a left eye image intended for the left eye, and a right eye image intended for the right eye.
- the output image 202 can be observed by the human mind as being truly three dimensional.
- Three-dimensional glasses are an appropriate aid for enabling this kind of eye-specific light channeling, but the principles of the present invention are not limited to the type of aid used to allow a human observer to conceptualize three-dimensional image information.
- projection of the left eye image and projection of the right eye image are interlaced, with each being displayed at a frequency at which continuous motion is perceived by a human observer.
- a human observer cannot distinguish discrete changes, but instead perceives continuous motion, between frames that are output at a frequency of at least 44 frames per second.
- a system that operates at 120 Hz, and which interlaces a left eye image and a right eye image, each at 60 Hz, will suffice to formulate the appearance of continuous three-dimensional motion.
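The interlacing arithmetic above can be made concrete with a short sketch. The even/odd assignment and function names are assumptions for illustration; the 120 Hz, 60 Hz, and 44 frames-per-second figures come from the description above.

```python
# Illustrative sketch of interlacing left- and right-eye frames: a
# 120 Hz projector alternates eyes, giving each eye 60 Hz, which is
# above the roughly 44 frames per second at which a human observer
# perceives continuous motion.

def eye_for_frame(frame_index):
    """Assume even frames go to the left eye, odd frames to the right."""
    return "left" if frame_index % 2 == 0 else "right"

def per_eye_rate(projector_hz):
    """Each eye sees half of the interlaced frames."""
    return projector_hz / 2

assert [eye_for_frame(i) for i in range(4)] == ["left", "right", "left", "right"]
assert per_eye_rate(120) == 60.0
assert per_eye_rate(120) > 44   # each eye still perceives continuous motion
```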
- the scanning mechanism 221 may scan for any objects (e.g., manipulating elements, etc.) that move into and/or within a three-dimensional space adjacent to a location where the output image 202 is projected in a manner that may comprise interaction of such an object with the output image 202 .
- the scanning may also occur at a frequency of 120 Hz, at a frequency of 60 Hz, or at some other interval. That said, the principles described herein are not limited to any particular frame rate for projection and sampling rate for scanning.
- the input channel 220 of the interactive image projection system 200 may also include an input preparation function provided by, for example, an input preparation mechanism 222 .
- This input preparation mechanism 222 may take the input provided through the scanning process and provide it in another form recognizable by a system that generates an input image 201 (such as perhaps by a conventional video game system).
- the input preparation mechanism 222 may receive information from the scanning mechanism 221 that allows the input preparation mechanism 222 to recognize gestures and interaction with virtual objects that are displayed and that may be visualized by one or more users.
- the input preparation mechanism 222 might recognize the gesture, and correlate that gesture to particular input.
- the input preparation mechanism 222 may consider the surface configuration, as well as any optics (such as mirrors or lenses) that may intervene between the location(s) to which the output image(s) 202 is (are) projected and the scanning mechanism 221 .
- the output image 202 is of a game board, with virtual game pieces placed on the game board.
- the user might reach into the output image 202 to the location of a virtual game piece (e.g., by simulated touching since the virtual game piece cannot be touched or otherwise physically contacted), and “move” that virtual game piece from one location of the game board to another, thereby advancing the state of the game, perhaps permanently.
- the movement may occur over the course of dozens or even hundreds of frames or output images 202 , which, from the user's perspective, occurs in a moment.
- the input preparation mechanism 222 recognizes that a physical object (e.g., a manipulation element, etc., such as a human finger) has reached into a three-dimensional space adjacent to a location to which the output image 202 is projected, and extended to or adjacent to the location where the virtual game piece appears. If the image were a three-dimensional image, the input preparation mechanism 222 could monitor the position of the physical object in three-dimensional space relative to a three-dimensional position of the virtual game piece.
- the virtual game piece may comprise a projected portion of the output image 202 and, thus, the user would not feel the virtual game piece, but the input preparation mechanism 222 may recognize that the user has indicated an intent to perform some action on or with the virtual game piece.
- the input preparation mechanism 222 may recognize slight incremental movement of the physical object, which may represent an intent to interact with the output image 202 or a feature of the output image 202 (e.g., to move a virtual game piece in the same direction and magnitude as the finger moved, to otherwise manipulate the output image 202 , etc.) and, optionally, an intent to interact with the output image 202 or a feature thereof in a particular manner.
- the input preparation mechanism 222 may issue commands to cause the image preparation mechanism 211 to modify the output image 202 (now an input image 201 ) in an appropriate manner (e.g., to cause the virtual game piece to move in the virtual environment space, etc.).
- the changes can be almost immediately observed in the next frame or in the next output image 202 . This occurs for each frame or output image 202 until the user indicates an intent to no longer move the game piece (e.g., by tapping a surface on which the output image 202 is projected at the location at which the user wishes to deposit the virtual game piece, etc.).
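The per-frame interaction just described can be sketched as a loop that applies the manipulating element's displacement to the virtual game piece each scan, until a "tap" indicates the piece should be deposited. All names and the event format here are illustrative assumptions.

```python
# Illustrative sketch of the interaction loop: each scan, move the
# virtual game piece by the fingertip's displacement; stop when the
# user taps to deposit the piece. Event tuples are hypothetical.

def track_piece(piece_pos, scans):
    """Apply per-frame fingertip displacements; stop on a tap event."""
    x, y = piece_pos
    for event, dx, dy in scans:
        if event == "tap":          # user deposits the piece here
            break
        x, y = x + dx, y + dy       # same direction/magnitude as the finger
    return (x, y)

scans = [("move", 1, 0), ("move", 1, 1), ("tap", 0, 0), ("move", 5, 5)]
final = track_piece((0, 0), scans)
# final -> (2, 1): motion after the tap is ignored
```

Over dozens or hundreds of frames, the many small displacements accumulate into what the user experiences as a single continuous drag.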
- the interactive image projection system 200 may enable the projection and movement of virtual objects or otherwise enable the projection and manipulation of a virtual environment space. Other actions might include resizing, re-orienting, changing the form, or changing the appearance of one or more virtual objects with which a user interacts.
- the user may interact with physical objects associated with an image that has been projected.
- the input channel 220 may recognize the position, orientation, and/or configuration of the physical object and interpret user movements and/or gestures (e.g., movement or manipulation of a physical object, etc.) or interaction with virtual features associated with the physical object.
- a physical game board may be placed within a projected image that might include virtual objects, such as, virtual “Chance” and “Community Chest” cards, virtual houses and hotels, and perhaps a combination of real and virtual game pieces (according to player preference configured at the beginning of a game).
- a player might tap on a property owned by that player, which the input channel may interpret as an intent to build a house on the property.
- the input channel 220 might then coordinate with any external image generation system and the output channel 210 to cause an additional virtual house to appear on the property (with perhaps some animation).
- the input channel 220 may coordinate to debit the account of that player by the cost of a house.
- information may be transmitted to a personal input device 102 A-H operated by the user to update an account balance displayed by the personal input device 102 A-H.
- the player might roll actual dice at the beginning of the player's turn.
- the input channel 220 may recognize the numbers on the dice after they have been rolled and cause the projected image to highlight the position that the player's game piece should move to. If the player has a virtual game piece, then the system might automatically move the virtual game piece (with perhaps some animation), or perhaps have the virtual game piece move in response to the player's interaction with it (perhaps configured by the user to suit his/her preference).
- the interactive image projection system 200 might transmit a prompt to the user's input device 102 A-H, requesting whether the user desires to purchase the property, or notifying the user of rent owed.
- the output channel 210 not only projects images, but also responds to an external game system to provide appropriate output to appropriate devices. For instance, the output channel 210 might recognize that the external game system is providing the current player with an inquiry as to whether or not the current player wants to purchase the property.
- the output channel 210 in addition to projecting the appropriate image, may also transmit an appropriate prompt to the player's input device 102 A-H.
- a central display may display an image and be positioned within an image that has been projected by the interactive image projection system 101 .
- a projected image may be superimposed with an image displayed by the central display.
- the principles described herein may take a conventional system and allow for a unique interaction with a projected image.
- the interactive image projection system 200 may interface with a conventional image generation system (e.g., a graphic processor, etc.) to enable interaction with an image that has been projected.
- the interactive image projection system 200 may receive an image generated by the conventional image generation system, with the image preparation mechanism 211 conducting any processing of any interaction by a user with the projected image.
- the conventional image generation system may generate the image in the same manner as if the image were just to be displayed by a conventional display or projector.
- the conventional image generation system receives commands from the image preparation mechanism 211 as it is accustomed to receive commands from conventional input devices to effect a change in the game state of a program or an application for which a GUI has been displayed (i.e., projected) and advance use of the program or the application.
- the conventional image generation system may operate in the same manner it would normally function in response to conventional inputs, no matter how complex the systems used to generate the commands. Whether the input was generated by a conventional hand-held controller, or through the complexity of the input channel 220 , the conventional image generation system will operate in its intended manner.
- the input channel 220 may provide information for other surrounding devices, such as, any of one or more conventional input devices, and perhaps a central display, associated with the conventional image generation system, thereby altering state of any of these devices, and allowing for these devices to participate in interacting with the program or the application whose outputs are being interactively projected.
- FIG. 4 abstractly illustrates an image generation system 400 , which may be used to generate the input image 201 of FIG. 2 .
- the image generation system 400 may be a conventional video game that outputs an image that might, for example, change as a player progresses through the video game.
- the functions described as being included within the image generation system 400 may be performed instead within the interactive image projection system 101 .
- the image generation system 400 includes logic 411 , an image generation mechanism 412 , and an input interface 413 .
- the logic 411 and/or the image generation mechanism 412 control a virtual environment space.
- the image generation mechanism 412 generates an image that is appropriate given a current state 414 of the logic 411 and, thus, of the virtual environment space.
- the input interface 413 receives commands that may alter the state 414 of virtual environment space and, thus, of the logic 411 , thereby potentially affecting the image generated by the image generation mechanism 412 .
- the state 414 may even be altered from one stage to the next as one or more users interact with a program or an application through the input interface 413 . In such systems, images can be generated at such a rate that continuous motion is perceived.
- the bi-directional channel may be wired or wireless, or perhaps wired in one direction and wireless in another. Input commands are typically less data-intensive as compared to images, and thus the channel of communication 1108 from the interactive image projection system 200 to the image generation system 400 may be wireless.
- the channel of communication 1108 from the image generation system 400 to the interactive image projection system 200 may also be wireless provided that the bandwidth of the channel in that direction is sufficient.
- the interactive image projection system 101 and/or any associated input devices 102 A-H may have built-in microphones to allow sound data (e.g., the player's voice, etc.) to be input into the image generation system 400 to affect the state 414 . There may also be voice recognition capability incorporated into the interactive image projection system 101 and/or any associated input devices 102 A-H to permit such sound data to be converted to more usable form. Speakers, headset ports, and earpieces may be incorporated into the interactive image projection system 101 and/or into any input devices 102 A-H associated with the interactive image projection system 101 .
- FIG. 5 abstractly illustrates an embodiment of a player console 500 .
- the input devices 102 A-H of FIG. 1 may be player consoles in the context in which the distributed system 100 is a game environment.
- FIG. 5 is an abstract illustration of a player console 500 showing functional components of the player console 500 .
- Each player, or perhaps each team of players, may have an associated player console, each associated with the corresponding player or team.
- the player console 500 includes a private display area 501 and game logic 502 capable of rendering at least a portion (e.g., a private portion) of the game state 503 associated with the player (or team).
- the player or team may use an input mechanism 504 to enter control input into the player console 500 .
- a transmission mechanism illustrated in the form of a transceiver 505 transmits that control input to the interactive image projection system 200 of FIG. 2 and/or to the image generation system 400 of FIG. 4 , where the control input is used to alter the state 414 of the logic 411 used to generate the image.
- FIG. 6 illustrates a specific embodiment of a player console 600 .
- the private display area 601 displays the player's private information (in this case, several playing cards).
- the player console 600 also includes a barrier 602 to prevent other players from seeing the private game state displayed on the private display area 601 .
- the private display area 601 may be touch-sensitive, allowing the player to interact with physical gestures on the private display area 601 , thereby causing control information to update the rendering on the private display area 601 , and the game states on the player console 600 , as well as on the central display 101 .
- the private display area 601 may also display video images 603 A, 603 B, and 603 C of other players.
- FIG. 7 illustrates such a player console, which might be a game master console 700 , with which a game master may interface with the private viewing area to perhaps control game state.
- the game master may use physical gestures on a touch-sensitive display 701 of the game master console 700 to affect what is displayed within the image 111 .
- the game master might control what portions of the map are viewable in the image 111 .
- the game master might use the game master console 700 to control the effect of another player's actions on the operation of the game logic.
- the game master might also use the game master console 700 to create a scenario and to set up a game.
- FIG. 8 is a flowchart of a method 800 for projecting an image and for enabling interaction with the image.
- data representing one or more virtual objects that are spatially positioned in a virtual environment space is received.
- An example of such data is an image in which such virtual objects are represented.
- the image is then projected at reference 802 in response to the received data.
- the image may provide a visual representation of at least part of the virtual environment space.
- any user interaction with the visualized representation provided by the image may be detected at reference 803 . In response to that user interaction, the projected image is then altered at reference 804 .
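The flow of method 800 can be sketched as a short function. The callables below are hypothetical stand-ins for the projector and scanning hardware; only the receive-project-detect-alter sequence comes from the flowchart described above.

```python
# Illustrative sketch of method 800: receive data describing virtual
# objects, project an image of them, detect any user interaction, and
# alter the projected image in response.

def method_800(objects, render, detect, alter):
    image = render(objects)                # receive data and project it
    interaction = detect(image)            # detect user interaction
    if interaction is not None:
        image = alter(image, interaction)  # alter the projected image
    return image

result = method_800(
    objects=[{"id": "piece", "pos": (0, 0)}],
    render=lambda objs: {"objects": objs},
    detect=lambda img: ("move", "piece", (1, 1)),
    alter=lambda img, ix: {**img, "last_interaction": ix},
)
assert result["last_interaction"] == ("move", "piece", (1, 1))
```

In practice this sequence repeats once per frame, so each detected interaction is reflected almost immediately in the next projected image.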
- FIG. 9 illustrates an embodiment of an interactive image projection system 900 in which multiple modules 902 A through 902 E are mounted to a stand 901 .
- Each module 902 A through 902 E includes a projector and a corresponding camera (not shown) which would be in the lower surface of each module 902 A through 902 E.
- each projector projects its image downward toward a surface on which the stand 901 is situated. The projectors each project a corresponding subimage, and the subimages are each processed such that the projected image is stitched together to appear as a single image on or over the surface.
- the camera scans for user interaction in the area of the image that has been projected.
- FIG. 10 illustrates another embodiment of an interactive image projection system 1000 that includes a single projector.
- the interactive image projection system 1000 includes a housing that includes a rigid base 1001 situated on a substantially horizontal surface.
- a projector 1011 is capable of projecting an image upward through a lens to a curved mirror 1012 , from which the image is reflected, projected through windows 1013 , and then projected downward onto the substantially horizontal surface on which the base 1001 is placed.
- the images are generated to account for the intervening lens(es), mirror(s) 1012 , and window(s) 1013 used to project the image.
- Four cameras (of which three 1021 A through 1021 C are visible in FIG. 10 ) are positioned around the upper circumference of the interactive image projection system 1000 .
- Such cameras 1021 A through 1021 C are capable of scanning a three-dimensional space adjacent to a location to which the image is projected to detect any interaction with the image.
- the various operations and structures described herein may, but need not, be implemented by way of a physical computing system. Accordingly, to conclude this description, an embodiment of a computing system will be described with respect to FIG. 11 .
- the computing system 1100 may be incorporated within the interactive image projection system 101 , within one or more of the input devices 102 A-H, and/or within the image generation system 400 .
- Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered to be computing systems.
- the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one processor and memory capable of having thereon computer-executable instructions that may be executed by the processor(s).
- the memory may take any physical form and may depend on the nature and form of the computing system.
- a computing system 1100 may communicate with other devices, including, but not limited to other computing systems, over a network environment 1110 , which may include multiple computing systems. In some embodiments, components of a single computing system 1100 may be distributed over a network environment 1110 .
- a computing system 1100 may include at least one processor 1102 and memory 1104 .
- the memory 1104 may comprise a physical system memory, which may be volatile, non-volatile, or some combination of the two.
- the term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If components of the computing system 1100 are distributed over a network environment 1110 , the processor 1102 , memory 1104 , and/or storage capability may be distributed as well.
- the term “module” or “component” can refer to software objects or routines that execute on the computing system 1100 .
- the different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads, etc.).
- embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, the one or more processors of the associated computing system 1100 that perform the act direct the operation of the computing system 1100 in response to having executed computer-executable instructions.
- An example of such an operation involves the manipulation of data.
- the computer-executable instructions (and the manipulated data) may be stored in the memory 1104 of the computing system 1100 .
- Embodiments within the scope of the present invention also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon.
- Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer.
- Such computer-readable media can comprise physical storage and/or memory media such as RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other physical medium which can be used to carry or store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
- the components of the computing system 1100 may, for example, be used to provide functionality to game logic, store or remember game state, configure and communicate between devices, and operate the logic of game incorporation.
- Each of the player consoles may also have a computing system such as computing system 1100 guiding their processing needs.
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 17/182,909, filed on Feb. 23, 2021 and titled INTERACTIVE ENVIRONMENT WITH THREE-DIMENSIONAL SCANNING, (“the '909 Application”), now U.S. Pat. No. 11,526,238, issued Dec. 13, 2022, which is a continuation of U.S. patent application Ser. No. 16/883,972, filed on May 26, 2020 and titled INTERACTIVE ENVIRONMENT WITH THREE-DIMENSIONAL SCANNING, (“the '972 Application”), now U.S. Pat. No. 10,928,958, issued Feb. 23, 2021, which is a continuation of U.S. patent application Ser. No. 16/519,593, filed on Jul. 23, 2019 and titled PROJECTED, INTERACTIVE ENVIRONMENT, (“the '593 Application”), now U.S. Pat. No. 10,664,105, issued May 26, 2020, which is a continuation of U.S. patent application Ser. No. 15/980,638, filed on May 15, 2018 and titled PROJECTED, INTERACTIVE ENVIRONMENT (“the '638 Application”), now U.S. Pat. No. 10,359,888, issued Jul. 23, 2019. The '638 Application is a continuation of U.S. patent application Ser. No. 15/414,617, filed on Jan. 24, 2017 and titled PROJECTION OF INTERACTIVE ENVIRONMENT (“the '617 Application”), now U.S. Pat. No. 9,971,458, issued May 15, 2018. The '617 Application is a continuation-in-part of U.S. patent application Ser. No. 14/462,750, filed on Aug. 19, 2014 and titled PROJECTION OF INTERACTIVE ENVIRONMENT (“the '750 Application”), now U.S. Pat. No. 9,550,124, issued Jan. 24, 2017. The '750 Application is a continuation of U.S. patent application Ser. No. 13/547,626, filed on Jul. 12, 2012 and titled PROJECTION OF INTERACTIVE GAME ENVIRONMENT (“the '626 Application”), now U.S. Pat. No. 8,808,089, issued Aug. 19, 2014. The '626 Application is a continuation-in-part of U.S. patent application Ser. No. 12/855,604, filed on Aug. 12, 2010 and titled PROJECTION OF INTERACTIVE GAME ENVIRONMENT (“the '604 Application”), abandoned. The '604 Application is a continuation-in-part of U.S. patent application Ser. No. 12/651,947, filed on Jan. 
4, 2010 and titled ELECTRONIC CIRCLE GAME SYSTEM (“the '947 Application”), abandoned. The '947 Application is a continuation-in-part of U.S. patent application Ser. No. 12/411,289, filed on Mar. 25, 2009 and titled WIRELESSLY DISTRIBUTED ELECTRONIC CIRCLE GAMING (“the '289 Application”), abandoned.
- The entire disclosures of the '909 Application, the '972 Application, the '593 Application, the '638 Application, the '617 Application, the '750 Application, the '626 Application, the '604 Application, the '947 Application and the '289 Application are, by this reference, incorporated herein.
- Embodiments described herein relate to the projection of an image and to interaction with the projected image. The image may be a two-dimensional image, or it may be three-dimensional. Data is received that represents virtual objects that are spatially positioned in virtual environment space. An image is then projected to display a visual representation of all or a portion of the virtual environment space, including one or more of the virtual objects within the virtual environment space. The system may then detect user interaction with the locations of one or more of the virtual objects in the virtual environment space, as seen in the visual representation of the virtual environment space displayed by the image that has been projected and, in response thereto, change the image that is projected. That interaction may be via an input device, or even more directly via interaction with the projected image. In the case of direct interaction, the user might interact with a virtual object within the virtual environment space, or with a physical object (e.g., a game piece, a game board, etc.) that is within the virtual environment space visually represented by the image. Thus, a user may interact with visualized representations of virtual environment space, enabling complex and interesting interactivity scenarios and applications.
- Systems that project images that represent virtual environment spaces and virtual objects themselves, and that detect interaction with one or more of the virtual objects are also disclosed. Such a system, which may also be referred to herein as a “projection system,” may be capable of modifying an image based on that interaction, and may be capable of projecting the modified image to display the result(s) of such interaction.
- This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- In order to describe the manner in which the above-recited and other advantages and features can be obtained, a more particular description of various embodiments will be provided by reference to the accompanying drawings. Understanding that the drawings depict only sample embodiments and are not, therefore, to be considered to be limiting of the scope of any of the appended claims, the embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
- FIG. 1 abstractly illustrates an embodiment of a distributed system that includes an embodiment of an interactive projection system;
- FIG. 2 abstractly illustrates an interactive image projection system that represents an embodiment of the interactive image projection system of FIG. 1 ;
- FIG. 3 illustrates an example embodiment of a virtual environment space that includes virtual objects;
- FIG. 4 abstractly illustrates an image generation system with which the interactive image projection system may operate;
- FIG. 5 abstractly illustrates an embodiment of an input device of the embodiment of the distributed system of FIG. 1 ;
- FIG. 6 illustrates a specific embodiment of an input device;
- FIG. 7 illustrates another specific embodiment of an input device;
- FIG. 8 is a flowchart of a method for projecting and detecting interaction with a projected image;
- FIGS. 9 and 10 illustrate specific embodiments of interactive image projection systems; and
- FIG. 11 illustrates a computing system architecture in which the principles described herein may be employed in at least some embodiments.
- The principles described herein relate to the projection of an image to an interactive environment. The image may be two-dimensional or it may be three-dimensional. The image may include one or more virtual objects that are spatially positioned within a virtual environment space. The image is projected to provide a visual representation of all or a portion of the virtual environment space, including a visual representation of one or more of the virtual objects. The interactive image projection system may then detect user interaction with the image and, thus, with the visual representation of the virtual environment space and, in response to any detected interaction, change the image, and perhaps cause a change (e.g., a temporary change, a permanent change, etc.) to a state of a program or an application corresponding to the image that has been projected (e.g., for which the image provides a graphical user interface (GUI), etc.).
- Although not required, the input mechanism may be especially useful in a
distributed system 100, such as a distributed electronic game system. FIG. 1 abstractly illustrates a distributed system 100. The distributed system 100 includes an interactive image projection system 101. The interactive image projection system 101 projects an image 111. Through unique features of the distributed system 100 described hereinafter, a user may interact with the image 111 that has been projected. - In some embodiments, the
image 111 may be projected onto a surface. The surface may be opaque or translucent (e.g., frosted glass, etc.). - The surface may be a substantially horizontal surface, in which case the
image 111 may be projected, at least in part, downward onto the surface. As an example, the substantially horizontal surface may be a table top, a counter top, a floor, a game board, or any other surface that is oriented substantially horizontally. A “substantially horizontal” surface may be any surface that is within 30 degrees of horizontal. A “more precisely horizontal” surface may be any surface that is within 5 degrees of horizontal. Alternatively, the surface may be a substantially vertical surface. A “substantially vertical” surface may be any surface that is within 30 degrees of vertical. A “more precisely vertical” surface may be any surface that is within 5 degrees of vertical. As another alternative, the surface may be oriented obliquely (i.e., at a non-parallel, non-perpendicular angle to the horizon). - In another embodiment, the
image 111 may be projected onto a more complex surface. For instance, the surface onto which the image 111 is projected may include one or more substantially horizontal surfaces, one or more substantially vertical surfaces, and/or one or more obliquely oriented surfaces. As an example, the complex surface might include, as a substantially horizontal surface, all or part of a surface of a floor, a table, a game board, or the like, and, as a substantially vertical surface, all or part of a wall, a projection screen, a white board, or the like. Other examples of complex surfaces include textured surfaces, curved surfaces, and surfaces with nonplanar topographies. - The
image 111, as projected by the interactive image projection system 101, represents an interactive environment space in which one or more users may interact with the image 111 or features thereof (e.g., virtual objects, etc.). One or more users may interact with the image 111 manually, with physical objects, and/or with one or more input devices. The image 111 might be projected to a collaborative area, a work area, or any other type of interactive area. However, in the remainder of this description, the distributed system 100 is often described as being a game or as being used in conjunction with a game. In those cases, each user would be a player, and the interactive environment space to which the image 111 is projected would be an interactive play space. The principles described herein may apply to any environment in which one or more users interact with a projected image. - Since
FIG. 1 is abstract, the interactive image projection system 101 and the image 111 are only depicted as abstract representations. Subsequent figures will illustrate more specific embodiments of the interactive image projection system 101 and the image 111. - Optionally, the distributed
system 100 may also include surrounding control devices, which are also referred to herein as “input devices.” There are eight input devices 102A-H illustrated in FIG. 1 , although the ellipses 102I represent that a distributed system 100 may include fewer than eight input devices 102A-H or more than eight input devices 102A-H. The input devices 102A-H are represented abstractly as rectangles, although each will have a particular concrete form depending on its function and design. Example forms of input devices 102A-H are described in further detail below. In the context of a game, for example, the input devices 102A-H may be player consoles. However, the inclusion of one or more input devices 102A-H in a distributed system 100 is optional. - As an alternative to providing input through the
input devices 102A-H, each user may instead provide input through direct, physical interaction with a three-dimensional space adjacent to a location to which the image 111 is projected. Such direct interaction may be provided, for example, with a hand and one or more fingers, by manipulating physical objects (e.g., game pieces, etc.) positioned in relation to the image 111, or, perhaps, by rolling dice or playing cards in association with the image 111. The interactive image projection system 101 is capable of responding to multiple simultaneous instances of users interacting with a location to which the image 111 is projected. Thus, input into the distributed system 100 may be achieved using one or more input devices 102A-H and/or by direct interaction with the interactive environment image 111. Accordingly, a user may affect the state of the image 111 and/or of a program or an application associated with the image 111 (e.g., for which the image provides a GUI). - In one embodiment, one, some, or even all of the
input devices 102A-H are wireless. In the case of a wireless input device 102A-H, the wireless input device 102A-H may communicate wirelessly with the interactive image projection system 101. One or even some of the input devices 102A-H may be located remotely from the image 111. Such remotely located game input device(s) 102A-H may communicate with the interactive image projection system 101 over a Wide Area Network (WAN), such as the Internet. That would enable a user to interact with the image 111 remotely, even if that user is not located in proximity to the image 111. Thus, for example, a father or mother stationed overseas might play a child's favorite board game with their child before going to bed. Or perhaps former strangers and new friends from different cultures around the globe might engage in a game, potentially fostering cross-cultural ties while having fun. That said, perhaps all of the game input devices 102A-H may be local (e.g., in the same room, etc.) to the interactive image projection system 101. In yet another embodiment, there are no input devices 102A-H. Regardless of whether there are input devices 102A-H or not, the user might directly interact with the image 111. -
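As a concrete illustration of the surface-orientation categories defined earlier, the 30-degree and 5-degree thresholds might be expressed as follows. This is a minimal sketch under the assumption that a surface's tilt is measured in degrees from horizontal; the function name and angle convention are illustrative, not anything recited in this disclosure.

```python
# Illustrative sketch only: classifies a surface by its tilt angle from
# horizontal (0 = flat, 90 = upright), per the 30- and 5-degree thresholds
# described above. The name and angle convention are assumptions.
def classify_surface(tilt_deg):
    tilt = abs(tilt_deg) % 180
    from_horizontal = min(tilt, 180 - tilt)    # deviation from horizontal
    from_vertical = abs(90 - from_horizontal)  # deviation from vertical
    if from_horizontal <= 5:
        return "more precisely horizontal"
    if from_horizontal <= 30:
        return "substantially horizontal"
    if from_vertical <= 5:
        return "more precisely vertical"
    if from_vertical <= 30:
        return "substantially vertical"
    return "oblique"
```

Under this sketch, a table top tilted 3 degrees would be "more precisely horizontal," while a 45-degree ramp would fall into none of the four named categories.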
FIG. 2 abstractly illustrates an interactive image projection system 200 that represents an embodiment of the interactive image projection system 101 of FIG. 1 . The interactive image projection system 200 is illustrated as including an output channel 210 that projects an image (e.g., the image 111 of FIG. 1 ). The output channel 210 includes several functions, including image preparation and projection. Image preparation is performed by an image preparation mechanism 211, and projection of the image 111 is performed by projector(s) 212A, 212B, etc., with one projector 212A being depicted and the ellipses 212B representing one or more optional additional projectors in the output channel 210 of the interactive image projection system 200. - The
image preparation mechanism 211 receives an input image 201 and supplies an output image 202 in response to receiving the input image 201. The input image 201 may be provided by any image generator. As an example, the input image 201 might be provided by a video game console, a rendering program (whether two-dimensional or three-dimensional), or any other module, component, or software that is capable of generating an image. - The
input image 201 represents one or more virtual objects that are positioned in a virtual environment space. As an example, the virtual environment space may represent a battleground with specific terrain. The battleground is represented in a computer, and need not represent any actual battleground. Other examples of virtual environment space might include a three-dimensional representation of the surface of the Moon, a representation of a helium atom, a representation of a crater of a fictional planet, a representation of a fictional spacecraft, a representation of outer space, a representation of a fictional subterranean cave network, and so forth. Whether representing something real or imagined, the virtual environment space may be embodied by a computer program or an application. - Virtual objects may be placed in the virtual environment space by a computer program or an application, and may represent any object, real or imagined. For instance, a virtual object might represent a soldier, a tank, a building, a fictional anti-gravity machine, or any other possible object, real or imagined.
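The kind of data described above, virtual objects spatially positioned in a virtual environment space, might be sketched as follows. This is a hypothetical representation for illustration only; the field names and the axis-aligned view selection are assumptions, not the format used by any particular image generation system.

```python
# Hypothetical sketch: virtual objects positioned in a three-dimensional
# virtual environment space, plus selection of the objects falling within
# the portion of the space to be visually represented.
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    position: tuple  # (x, y, z) coordinates in the virtual environment space

def objects_in_view(objects, view_min, view_max):
    """Return the objects inside an axis-aligned portion of the space."""
    return [
        obj for obj in objects
        if all(lo <= coord <= hi
               for coord, lo, hi in zip(obj.position, view_min, view_max))
    ]
```

For instance, a "soldier" object positioned outside the selected portion of the space would simply be omitted from the visual representation that is projected.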
-
FIG. 3 illustrates an example of a virtual environment space 300. In this example, the virtual environment space 300 includes virtual objects. The virtual environment space 300 is a three-dimensional space, such that the virtual objects may be positioned anywhere within the virtual environment space 300. This virtual environment space 300 may be used in order to formulate a representation of a certain portion and/or perspective of the virtual environment space 300. The output image 202, as projected, includes a visual representation of at least part of the virtual environment space 300, and the visual representation includes a visual representation of at least one of the virtual objects. For example, if the virtual environment space 300 comprises the inside of a virtual crater, the output image 202 may provide a visual representation of at least a portion of that crater, with virtual objects positioned therein. If the virtual environment space 300 were a city, the visual representation might be a portion of the city and include virtual objects within that portion. - With returned reference to
FIG. 2 , the image preparation mechanism 211 may perform any processing on the input image 201 to generate the output image 202 that is ultimately projected by the one or more projectors 212A, 212B. As an example, the image preparation mechanism 211 may simply pass the input image 201 through, such that the output image 202 is identical to the input image 201. The image preparation mechanism 211 might also change the format of the input image 201, change the resolution of the input image 201, compress the input image 201, decrypt the input image 201, select only a portion of the input image 201, or the like. If multiple projectors 212A, 212B are used, the image preparation mechanism 211 may select which portion of the input image 201 (i.e., a “subimage”) is to be projected by each projector 212A, 212B, such that when the output images 202 are projected by each projector 212A, 212B, the collective whole of all of the output images 202 may appear as a single image at the location to which the output images 202 are projected, a process referred to herein as “stitching.” - The image preparation might also take into consideration appropriate adjustments given the surface on which the
output image 202 is to be projected, or any intervening optics. For instance, if the output image 202 is to be projected onto a complex surface, the image preparation mechanism 211 may adjust the input image 201 such that the output image 202 will appear properly on the complex surface. The user might configure the image preparation mechanism 211 with information regarding the complex surface. Alternatively, or in addition, the interactive image projection system 200 may be capable of entering a discovery phase upon physical positioning that identifies the characteristics of any surface onto which an output image 202 is to be projected, in relation to the projector(s) 212A, 212B. As an example, if the surface includes a combination of horizontal, oblique, and/or vertical surfaces, the image preparation mechanism 211 may take into consideration the distances to the surfaces and the angles at which the surfaces are oriented to make sure that the output image 202 appears proportional and as intended on each surface. Thus, the image preparation mechanism 211 may make appropriate geometrical adjustments to the input image 201 so that the output image 202 appears properly on each surface. Other examples of complex surfaces include spherical surfaces, full and partial cylindrical surfaces, surfaces that include convex portions, surfaces that include concave portions, other curved surfaces, including surfaces with repeated curvatures or other complex curvatures, and surfaces that represent a nonplanar topography (as in a complex terrain with various peaks and valleys). In cases in which the output image 202 is to pass through optics such as lenses and mirrors, the image preparation mechanism 211 may consider the presence of such optics and modify the input image 201 accordingly. - In addition to image preparation and projection, the interactive
image projection system 200 may also output various signals. For instance, the interactive image projection system 200 may output audio, such as audio that corresponds to the input image 201. The interactive image projection system 200 may output wired or wireless signals to the input devices 102A-H, perhaps causing some private state to be altered at the input devices 102A-H. In addition, if there is a central display that displays an image (e.g., the interactive central display described in the co-pending commonly assigned application Ser. No. 12/411,289, etc.) (hereinafter referred to simply as the “central display”), the interactive image projection system 200 may dispatch information in a wired or wireless fashion to the central display. - User input may be provided through interaction with an input device (such as one of the
input devices 102A-H of FIG. 1 ) and/or through direct interaction of a real object (such as a human finger, a game piece, a game board, a central display, or the like) with a three-dimensional space adjacent to a location where the output image 202 is projected. If there is to be direct interaction to provide input, the interactive image projection system 200 may also include an input channel 220. - The
input channel 220 may include a scanning mechanism 221 capable of scanning a three-dimensional space adjacent to a location where the output image 202 is projected to determine whether or not a user is interacting with a virtual object displayed by or as part of the output image 202 or with another object (e.g., a physical object, etc.) that may be used in conjunction with the output image 202. More specifically, the scanning mechanism 221 may detect movement of a manipulating element (e.g., a physical object, such as a finger, a thumb, a hand, an object held by a user, etc.) into and within the three-dimensional space adjacent to a location where the output image 202 is projected. - As an example, suppose that the
output image 202 of FIG. 2 includes just two-dimensional information. In that case, the projector(s) 212A, 212B project(s) the frame of the output image 202 or each output image 202. Then, after that frame or output image 202 is projected, during a short period before the next frame or output image 202 is projected, the scanning mechanism 221 may scan the area where the last frame or output image 202 was projected. This projection and scanning process is then repeated for the next frame or output image 202, and for the subsequent frame or output image 202, and so on. Even though projection and scanning may not happen at the same time (with scanning happening between projection of sequential frames of an output image 202 or between projection of sequential output images 202), they happen at such a high frequency that the output image(s) 202 may seem to have continuous motion. Furthermore, even though the frame or output image 202 may not always be present, the period of time that the frame or output image 202 is not present may be so short, and occur at such a high frequency, that it may provide a human observer with the illusion that the frame or output image 202 is always present. Thus, real objects may have the appearance of occupying the same space as the output image(s) 202. Alternatively, the scanning mechanism 221 may operate while an output image 202 or a frame thereof is projected and displayed. - As another example, the
output image 202 of FIG. 2 may represent three-dimensional information. In that case, for each frame of the output image 202 or for each sequence of output images 202, the projector(s) 212A, 212B may project a left eye image intended for the left eye, and a right eye image intended for the right eye. When appropriate aids are present that allow the left eye of a human observer to receive the left eye image (but not the right eye image), and that allow the right eye of that same human observer to receive the right eye image (but not the left eye image), the output image 202 can be observed by the human mind as being truly three-dimensional. Three-dimensional glasses are an appropriate aid for enabling this kind of eye-specific light channeling, but the principles of the present invention are not limited to the type of aid used to allow a human observer to conceptualize three-dimensional image information. - In one example, projection of the left eye image and projection of the right eye image are interlaced, with each being displayed at a frequency at which continuous motion is perceived by a human observer. Typically, an average human observer cannot distinguish discrete changes, but instead perceives continuous motion, between frames that are output at a frequency of at least 44 frames per second. Thus, a system that operates at 120 Hz, and which interlaces a left eye image and a right eye image, each at 60 Hz, will suffice to formulate the appearance of continuous three-dimensional motion. At periodic times, the
scanning mechanism 221 may scan for any objects (e.g., manipulating elements, etc.) that move into and/or within a three-dimensional space adjacent to a location where the output image 202 is projected in a manner that may comprise interaction of such an object with the output image 202. In an interactive image projection system 200 that operates at a frequency of 120 Hz, for example, the scanning may also occur at a frequency of 120 Hz, at a frequency of 60 Hz, or at some other frequency. That said, the principles described herein are not limited to any particular frame rate for projection or sampling rate for scanning. - The
input channel 220 of the interactive image projection system 200 may also include an input preparation function provided by, for example, an input preparation mechanism 222. This input preparation mechanism 222 may take the input provided through the scanning process and provide it in another form recognizable by a system that generates an input image 201 (such as perhaps a conventional video game system). For instance, the input preparation mechanism 222 may receive information from the scanning mechanism 221 that allows the input preparation mechanism 222 to recognize gestures and interaction with virtual objects that are displayed and that may be visualized by one or more users. The input preparation mechanism 222 might recognize the gesture, and correlate that gesture to particular input. The input preparation mechanism 222 may consider the surface configuration, as well as any optics (such as mirrors or lenses) that may intervene between the location(s) to which the output image(s) 202 is (are) projected and the scanning mechanism 221. - As an example, suppose that the
output image 202 is of a game board, with virtual game pieces placed on the game board. The user might reach into the output image 202 to the location of a virtual game piece (e.g., by simulated touching, since the virtual game piece cannot be touched or otherwise physically contacted), and “move” that virtual game piece from one location of the game board to another, thereby advancing the state of the game, perhaps permanently. In that case, the movement may occur over the course of dozens or even hundreds of frames or output images 202, which, from the user's perspective, occurs in a moment. The input preparation mechanism 222 recognizes that a physical object (e.g., a manipulation element, such as a human finger) has reached into a three-dimensional space adjacent to a location to which the output image 202 is projected, and extended to or adjacent to the location where the virtual game piece appears. If the image were a three-dimensional image, the input preparation mechanism 222 could monitor the position of the physical object in three-dimensional space relative to a three-dimensional position of the virtual game piece. The virtual game piece may comprise a projected portion of the output image 202 and, thus, the user would not feel the virtual game piece, but the input preparation mechanism 222 may recognize that the user has indicated an intent to perform some action on or with the virtual game piece. - In subsequent frames or
output images 202, the input preparation mechanism 222 may recognize slight incremental movement of the physical object, which may represent an intent to interact with the output image 202 or a feature of the output image 202 (e.g., to move a virtual game piece in the same direction and magnitude as the finger moved, to otherwise manipulate the output image 202, etc.) and, optionally, an intent to interact with the output image 202 or a feature thereof in a particular manner. The input preparation mechanism 222 may issue commands to cause the image preparation mechanism 211 to modify the output image 202 (now an input image 201) in an appropriate manner (e.g., to cause the virtual game piece to move in the virtual environment space, etc.). The changes can be almost immediately observed in the next frame or in the next output image 202. This occurs for each frame or output image 202 until the user indicates an intent to no longer move the game piece (e.g., by tapping a surface on which the output image 202 is projected at the location at which the user wishes to deposit the virtual game piece, etc.). - The appearance to the player would be as though the player had literally contacted the virtual game piece and caused the virtual game piece to move, even though the virtual game piece is but a projection. Accordingly, the interactive
image projection system 200 may enable the projection and movement of virtual objects or otherwise enable the projection and manipulation of a virtual environment space. Other actions might include resizing, re-orienting, changing the form, or changing the appearance of one or more virtual objects with which a user interacts. - As a further example, the user may interact with physical objects associated with an image that has been projected. The
input channel 220 may recognize the position, orientation, and/or configuration of the physical object and interpret user movements and/or gestures (e.g., movement or manipulation of a physical object, etc.) or interaction with virtual features associated with the physical object. For instance, in the MONOPOLY board game, a physical game board may be placed within a projected image that might include virtual objects, such as virtual “Chance” and “Community Chest” cards, virtual houses and hotels, and perhaps a combination of real and virtual game pieces (according to player preference configured at the beginning of a game). A player might tap on a property owned by that player, which the input channel may interpret as an intent to build a house on the property. The input channel 220 might then coordinate with any external image generation system and the output channel 210 to cause an additional virtual house to appear on the property (with perhaps some animation). In addition, the input channel 220 may coordinate to debit the account of that player by the cost of a house. Further, information may be transmitted to a personal input device 102A-H operated by the user to update an account balance displayed by the personal input device 102A-H. - As another example of the MONOPOLY board game, the player might roll actual dice at the beginning of the player's turn. The
input channel 220 may recognize the numbers on the dice after they have been rolled and cause the projected image to highlight the position to which the player's game piece should move. If the player has a virtual game piece, then the system might automatically move (with perhaps some animation) the virtual game piece, or perhaps the virtual game piece may be moved through the player's interaction with the virtual game piece (perhaps configured by the user to suit his/her preference). In response, the interactive image projection system 200 might transmit a prompt to the user's input device 102A-H, requesting whether the user desires to purchase the property, or notifying the user of rent owed. In one embodiment, the output channel 210 not only projects images, but also responds to an external game system to provide appropriate output to appropriate devices. For instance, the output channel 210 might recognize that the external game system is providing the current player with an inquiry as to whether or not the current player wants to purchase the property. The output channel 210, in addition to projecting the appropriate image, may also transmit an appropriate prompt to the player's input device 102A-H. - In yet a further example, a central display may display an image and be positioned within an image that has been projected by the interactive
image projection system 101. Thus, a projected image may be superimposed with an image displayed by the central display. - In some embodiments, the principles described herein may take a conventional system and allow for a unique interaction with a projected image. The interactive
image projection system 200 may interface with a conventional image generation system (e.g., a graphic processor, etc.) to enable interaction with an image that has been projected. The interactive image projection system 200 may receive an image generated by the conventional image generation system, with the image preparation mechanism 211 conducting any processing of any interaction by a user with the projected image. The conventional image generation system may generate the image in the same manner as if the image were just to be displayed by a conventional display or projector. Once a user has interacted with a projected image and such interaction has been detected and processed by the image preparation mechanism 211, the conventional image generation system receives commands from the image preparation mechanism 211 just as it is accustomed to receiving commands from conventional input devices, to effect a change in the game state of a program or an application for which a GUI has been displayed (i.e., projected) and advance use of the program or the application. The conventional image generation system may operate in the same manner it would normally function in response to conventional inputs, no matter how complex the systems used to generate the commands. Whether the input was generated by a conventional hand-held controller or through the complexity of the input channel 220, the conventional image generation system will operate in its intended manner. - In addition to being capable of preparing input information for conventional image generation systems, the
input channel 220 may provide information for other surrounding devices, such as any of one or more conventional input devices, and perhaps a central display, associated with the conventional image generation system, thereby altering the state of any of these devices and allowing these devices to participate in interacting with the program or the application whose outputs are being interactively projected. -
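One way to picture the input channel's work described above — hit-testing scanned positions against virtual objects, then turning incremental movement into commands the image generation system already understands — is the following sketch. Every name here is hypothetical; the disclosure does not prescribe any particular algorithm, and the touch radius is an arbitrary assumed value.

```python
# Hypothetical sketch of input-channel behavior: hit-test a scanned
# fingertip position against virtual pieces, then translate incremental
# movement into a "move" command for the image generation system.
def hit_test(finger_pos, pieces, radius=0.5):
    """Return the name of the first piece within `radius` of the finger."""
    fx, fy = finger_pos
    for name, (px, py) in pieces.items():
        if (fx - px) ** 2 + (fy - py) ** 2 <= radius ** 2:
            return name
    return None

def drag_command(selected, delta):
    """Form a command, as a conventional input device might, from motion."""
    return {"command": "move", "piece": selected, "delta": delta}
```

For instance, a fingertip detected at (1.1, 0.9) near a virtual piece at (1.0, 1.0) would select that piece, and the per-frame deltas of subsequent detections would then be forwarded as move commands.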
FIG. 4 abstractly illustrates an image generation system 400, which may be used to generate the input image 201 of FIG. 2 . In one embodiment, the image generation system 400 may be a conventional video game system that outputs an image that might, for example, change as a player progresses through the video game. However, one, some, or perhaps even all of the functions described as being included within the image generation system 400 may be performed instead within the interactive image projection system 101. - The
image generation system 400 includes logic 411, an image generation mechanism 412, and an input interface 413. The logic 411 and/or the image generation mechanism 412 control a virtual environment space. The image generation mechanism 412 generates an image that is appropriate given a current state 414 of the logic 411 and, thus, of the virtual environment space. The input interface 413 receives commands that may alter the state 414 of the virtual environment space and, thus, of the logic 411, thereby potentially affecting the image generated by the image generation mechanism 412. The state 414 may even be altered from one stage to the next as one or more users interact with a program or an application through the input interface 413. In such systems, images can be generated at such a rate that continuous motion is perceived. There may be a bi-directional channel of communication 1108 ( FIG. 11 ) between the image generation system 400 and the interactive image projection system 200. The bi-directional channel may be wired or wireless, or perhaps wired in one direction and wireless in the other. Input commands are typically less data-intensive than images, and thus the channel of communication 1108 from the interactive image projection system 200 to the image generation system 400 may be wireless. The channel of communication 1108 from the image generation system 400 to the interactive image projection system 200 may also be wireless, provided that the bandwidth of the channel in that direction is sufficient. - The interactive
image projection system 101 and/or any associated input devices 102A-H may have built-in microphones to allow sound data (e.g., the player's voice, etc.) to be input into the image generation system 400 to affect the state 414. There may also be voice recognition capability incorporated into the interactive image projection system 101 and/or any associated input devices 102A-H to permit such sound data to be converted to a more usable form. Speakers, headset ports, and earpieces may be incorporated into the interactive image projection system 101 and/or into any input devices 102A-H associated with the interactive image projection system 101. -
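The division of FIG. 4 into logic, an image generation mechanism, and an input interface might be sketched as below. The class is a speculative stand-in, assuming a minimal state that commands can alter; nothing here is taken from an actual game system's API.

```python
# Speculative sketch of FIG. 4's structure: the logic holds a state, the
# input interface applies commands that alter that state, and the image
# generation mechanism produces an image appropriate to the current state.
class ImageGenerationSystem:
    def __init__(self):
        self.state = {"turn": 0}  # stands in for state 414

    def input_interface(self, command):
        """Receive a command that may alter the state (input interface 413)."""
        if command == "advance":
            self.state["turn"] += 1

    def generate_image(self):
        """Generate an image appropriate to the state (mechanism 412)."""
        return f"frame for turn {self.state['turn']}"
```

In this sketch, each command received over the bi-directional channel advances the state, and the next generated image reflects that new state.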
FIG. 5 abstractly illustrates an embodiment of a player console 500. As previously mentioned, the input devices 102A-H of FIG. 1 may be player consoles in the context in which the distributed system 100 is a game environment. FIG. 5 is an abstract illustration of a player console 500 showing functional components of the player console 500. Each player, or perhaps each team of players, may have an associated player console, each associated with the corresponding player or team. The player console 500 includes a private display area 501 and game logic 502 capable of rendering at least a private portion of game state 503 associated with the player (or team). The player or team may use an input mechanism 504 to enter control input into the player console 500. A transmission mechanism, illustrated in the form of a transceiver 505, transmits that control input to the interactive image projection system 200 of FIG. 2 and/or to the image generation system 400 of FIG. 4 , where the control input is used to alter the state 414 of the logic 411 used to generate the image. -
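The player console's functional components — a private display area, game logic rendering a private portion of game state, an input mechanism, and a transceiver — could be sketched roughly as follows, with all names and the callback-style transceiver being illustrative assumptions rather than anything specified by the disclosure.

```python
# Rough, illustrative sketch of a player console: it renders only the
# private portion of game state and forwards control input via a transceiver.
class PlayerConsole:
    def __init__(self, private_state, transmit):
        self.private_state = private_state  # e.g., a hand of cards
        self.transmit = transmit            # stands in for transceiver 505

    def render_private_display(self):
        """Render the private portion of game state (private display 501)."""
        return f"cards: {', '.join(self.private_state)}"

    def input_mechanism(self, control_input):
        """Send the player's control input on to the projection/game system."""
        self.transmit(control_input)
```

The key design point sketched here is the split between private state, visible only on the console, and control input, which is transmitted onward to alter the shared game state.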
FIG. 6 illustrates a specific embodiment of a player console 600. Here, the private display area 601 displays the player's private information (in this case, several playing cards). The player console 600 also includes a barrier 602 to prevent other players from seeing the private game state displayed on the private display area 601. The private display area 601 may be touch-sensitive, allowing the player to interact with physical gestures on the private display area 601, thereby causing control information to update the rendering on the private display area 601, and the game state on the player console 600, as well as on the central display 101. The private display area 601 may also display video images. - In one embodiment, at least one of the player consoles is different from the remaining player consoles 600.
FIG. 7 illustrates such a player console, which might be a game master console 700, through which a game master may interface with the private viewing area, perhaps to control game state. For instance, the game master may use physical gestures on a touch-sensitive display 701 of the game master console 700 to affect what is displayed within the image 111. For example, the game master might control which portions of the map are viewable in the image 111. The game master might use the game master console 700 to control the effect of another player's actions on the operation of the game logic. The game master might also use the game master console 700 to create a scenario and to set up a game. -
FIG. 8 is a flowchart of a method 800 for projecting an image and for enabling interaction with the image. At reference 801, data representing one or more virtual objects that are spatially positioned in a virtual environment space is received. An example of such data is an image in which such virtual objects are represented. The image is then projected at reference 802 in response to the received data. The image may provide a visual representation of at least part of the virtual environment space. At reference 803, any user interaction with the visualized representation provided by the image may be detected. In response to that user interaction, the projected image is then altered at reference 804. -
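The four stages of method 800 can be sketched as a simple pipeline. This is an illustrative sketch only: the image is modeled as a dictionary, projection is elided, and `detect_interaction` is a caller-supplied placeholder for the scanning described elsewhere in this document.

```python
def method_800(virtual_object_data, detect_interaction):
    # 801: receive data representing virtual objects spatially
    #      positioned in a virtual environment space
    image = {"objects": list(virtual_object_data)}

    # 802: project the image (elided here; `image` stands in for the
    #      projected visual representation)
    # 803: detect any user interaction with the visualized representation
    interaction = detect_interaction(image)

    # 804: alter the projected image in response to that interaction
    if interaction is not None:
        image["objects"].append(interaction)
    return image


# A user "places" a new token onto the projected image.
altered = method_800(["castle", "road"], lambda img: "token")
```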
FIG. 9 illustrates an embodiment of an interactive image projection system 900 in which multiple modules 902A through 902E are mounted to a stand 901. Each module 902A through 902E includes a projector and a corresponding camera (not shown) located in the lower surface of the module. The projector projects images downward towards a surface on which the stand 901 is situated. The projectors each project a corresponding subimage, and the subimages are processed such that the projected image is stitched together to appear as a single image on or over the surface. The camera scans for user interaction in the area of the image that has been projected. -
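One simple way to prepare per-module subimages for stitching is to partition the full image into strips, each carrying the offset at which its module projects it. This is only a hypothetical sketch of the partitioning step; the patent does not specify how the subimages are computed, and the equal-width-strip scheme and all names here are assumptions.

```python
def split_into_subimages(width, height, module_count):
    # Divide the full image into equal-width vertical strips, one per
    # module; each strip records its horizontal offset so the projected
    # subimages stitch together into a single apparent image.
    strip_width = width // module_count
    return [
        {"module": i, "x": i * strip_width, "width": strip_width, "height": height}
        for i in range(module_count)
    ]


# Five modules (like 902A through 902E) sharing a 1000 x 600 image.
subimages = split_into_subimages(1000, 600, 5)
```

In practice each subimage would also be warped to compensate for its projector's position and keystone distortion before projection.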
FIG. 10 illustrates another embodiment of an interactive image projection system 1000 that includes a single projector. The interactive image projection system 1000 includes a housing that includes a rigid base 1001 situated on a substantially horizontal surface. A projector 1011 is capable of projecting an image upward through a lens to a curved mirror 1012, from which the image is reflected, projected through windows 1013, and then directed downward onto the substantially horizontal surface on which the base 1001 is placed. The images are generated to account for the intervening lens(es), mirror(s) 1012, and window(s) 1013 used to project the image. Four cameras (of which three, 1021A through 1021C, are visible in FIG. 10 ) are positioned around the upper circumference of the interactive image projection system 1000. Such cameras 1021A through 1021C are capable of scanning a three-dimensional space adjacent to the location to which the image is projected to detect any interaction with the image. - The various operations and structures described herein may, but need not, be implemented by way of a physical computing system. Accordingly, to conclude this description, an embodiment of a computing system will be described with respect to
FIG. 11 . The computing system 1100 may be incorporated within the interactive image projection system 101, within one or more of the input devices 102A-H, and/or within the image generation system 400. - Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered to be computing systems. In this description and in the claims, the term "computing system" is defined broadly as including any device or system (or combination thereof) that includes at least one processor and memory capable of having thereon computer-executable instructions that may be executed by the processor(s). The memory may take any physical form and may depend on the nature and form of the computing system. A
computing system 1100 may communicate with other devices, including, but not limited to, other computing systems, over a network environment 1110, which may include multiple computing systems. In some embodiments, components of a single computing system 1100 may be distributed over a network environment 1110. - In its most basic configuration, a
computing system 1100 may include at least one processor 1102 and memory 1104. The memory 1104 may comprise physical system memory, which may be volatile, non-volatile, or some combination of the two. The term "memory" may also be used herein to refer to non-volatile mass storage such as physical storage media. If components of the computing system 1100 are distributed over a network environment 1110, the processor 1102, memory 1104, and/or storage capability may be distributed as well. As used herein, the term "module" or "component" can refer to software objects or routines that execute on the computing system 1100. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads). - In the description above, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated
computing system 1100 that perform the act direct the operation of the computing system 1100 in response to having executed computer-executable instructions. An example of such an operation involves the manipulation of data. The computer-executable instructions (and the manipulated data) may be stored in the memory 1104 of the computing system 1100. - Embodiments within the scope of the present invention also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such computer-readable media can comprise physical storage and/or memory media such as RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other physical medium which can be used to carry or store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described herein. Rather, the specific features and acts described herein are disclosed as example forms of implementing the claims.
- The components of the
computing system 1100 may, for example, be used to provide functionality to game logic, store or remember game state, configure and communicate between devices, and operate the logic of the incorporated game. Each of the player consoles may also have a computing system such as the computing system 1100 guiding its processing needs.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/080,678 US20230115736A1 (en) | 2009-03-25 | 2022-12-13 | Interactive environment with virtual environment space scanning |
Applications Claiming Priority (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/411,289 US20100248839A1 (en) | 2009-03-25 | 2009-03-25 | Wirelessly distributed electronic circle gaming |
US12/651,947 US20110165923A1 (en) | 2010-01-04 | 2010-01-04 | Electronic circle game system |
US12/855,604 US20110256927A1 (en) | 2009-03-25 | 2010-08-12 | Projection of interactive game environment |
US13/547,626 US8808089B2 (en) | 2009-03-25 | 2012-07-12 | Projection of interactive game environment |
US14/462,750 US9550124B2 (en) | 2009-03-25 | 2014-08-19 | Projection of an interactive environment |
US15/414,617 US9971458B2 (en) | 2009-03-25 | 2017-01-24 | Projection of interactive environment |
US15/980,638 US10359888B2 (en) | 2009-03-25 | 2018-05-15 | Projected, interactive environment |
US16/519,593 US10664105B2 (en) | 2009-03-25 | 2019-07-23 | Projected, interactive environment |
US16/883,972 US10928958B2 (en) | 2009-03-25 | 2020-05-26 | Interactive environment with three-dimensional scanning |
US17/182,909 US11526238B2 (en) | 2009-03-25 | 2021-02-23 | Interactive environment with virtual environment space scanning |
US18/080,678 US20230115736A1 (en) | 2009-03-25 | 2022-12-13 | Interactive environment with virtual environment space scanning |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/182,909 Continuation US11526238B2 (en) | 2009-03-25 | 2021-02-23 | Interactive environment with virtual environment space scanning |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230115736A1 true US20230115736A1 (en) | 2023-04-13 |
Family
ID=59559652
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/414,617 Active US9971458B2 (en) | 2009-03-25 | 2017-01-24 | Projection of interactive environment |
US15/980,638 Active US10359888B2 (en) | 2009-03-25 | 2018-05-15 | Projected, interactive environment |
US16/519,593 Active US10664105B2 (en) | 2009-03-25 | 2019-07-23 | Projected, interactive environment |
US16/883,972 Active US10928958B2 (en) | 2009-03-25 | 2020-05-26 | Interactive environment with three-dimensional scanning |
US17/182,909 Active US11526238B2 (en) | 2009-03-25 | 2021-02-23 | Interactive environment with virtual environment space scanning |
US18/080,678 Pending US20230115736A1 (en) | 2009-03-25 | 2022-12-13 | Interactive environment with virtual environment space scanning |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/414,617 Active US9971458B2 (en) | 2009-03-25 | 2017-01-24 | Projection of interactive environment |
US15/980,638 Active US10359888B2 (en) | 2009-03-25 | 2018-05-15 | Projected, interactive environment |
US16/519,593 Active US10664105B2 (en) | 2009-03-25 | 2019-07-23 | Projected, interactive environment |
US16/883,972 Active US10928958B2 (en) | 2009-03-25 | 2020-05-26 | Interactive environment with three-dimensional scanning |
US17/182,909 Active US11526238B2 (en) | 2009-03-25 | 2021-02-23 | Interactive environment with virtual environment space scanning |
Country Status (1)
Country | Link |
---|---|
US (6) | US9971458B2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9971458B2 (en) * | 2009-03-25 | 2018-05-15 | Mep Tech, Inc. | Projection of interactive environment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070132721A1 (en) * | 2005-12-09 | 2007-06-14 | Edge 3 Technologies Llc | Three-Dimensional Virtual-Touch Human-Machine Interface System and Method Therefor |
US20110205341A1 (en) * | 2010-02-23 | 2011-08-25 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction. |
US10928958B2 (en) * | 2009-03-25 | 2021-02-23 | Mep Tech, Inc. | Interactive environment with three-dimensional scanning |
Family Cites Families (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10503395A (en) | 1994-07-28 | 1998-03-31 | スーパー ディメンション インコーポレイテッド | Computer game board |
US6281878B1 (en) | 1994-11-01 | 2001-08-28 | Stephen V. R. Montellese | Apparatus and method for inputing data |
US5844985A (en) | 1995-09-22 | 1998-12-01 | Qualcomm Incorporated | Vertically correcting antenna for portable telephone handsets |
IL121666A (en) | 1997-08-31 | 2001-03-19 | Bronfeld Joshua | Electronic dice |
US6614422B1 (en) | 1999-11-04 | 2003-09-02 | Canesta, Inc. | Method and apparatus for entering data using a virtual input device |
US6710770B2 (en) | 2000-02-11 | 2004-03-23 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US6611252B1 (en) | 2000-05-17 | 2003-08-26 | Dufaux Douglas P. | Virtual data input device |
US6650318B1 (en) | 2000-10-13 | 2003-11-18 | Vkb Inc. | Data input device |
US6832954B2 (en) | 2000-05-30 | 2004-12-21 | Namco Ltd. | Photographing game machine, photographing game processing method and information storage medium |
US8040328B2 (en) | 2000-10-11 | 2011-10-18 | Peter Smith | Books, papers, and downloaded information to facilitate human interaction with computers |
FI113094B (en) | 2000-12-15 | 2004-02-27 | Nokia Corp | An improved method and arrangement for providing a function in an electronic device and an electronic device |
US7103236B2 (en) | 2001-08-28 | 2006-09-05 | Adobe Systems Incorporated | Methods and apparatus for shifting perspective in a composite image |
US7133083B2 (en) | 2001-12-07 | 2006-11-07 | University Of Kentucky Research Foundation | Dynamic shadow removal from front projection displays |
US6997803B2 (en) | 2002-03-12 | 2006-02-14 | Igt | Virtual gaming peripherals for a gaming machine |
US7334791B2 (en) | 2002-08-24 | 2008-02-26 | Blinky Bones, Inc. | Electronic die |
JP4230999B2 (en) | 2002-11-05 | 2009-02-25 | ディズニー エンタープライゼス インコーポレイテッド | Video-operated interactive environment |
US7884804B2 (en) | 2003-04-30 | 2011-02-08 | Microsoft Corporation | Keyboard with input-sensitive display device |
AU2004272018B2 (en) | 2003-09-05 | 2010-09-02 | Bally Gaming International, Inc. | Systems, methods, and devices for monitoring card games, such as baccarat |
US6955297B2 (en) | 2004-02-12 | 2005-10-18 | Grant Isaac W | Coordinate designation interface |
JP3904562B2 (en) | 2004-02-18 | 2007-04-11 | 株式会社ソニー・コンピュータエンタテインメント | Image display system, recording medium, and program |
US7204428B2 (en) | 2004-03-31 | 2007-04-17 | Microsoft Corporation | Identification of object on interactive display surface by identifying coded pattern |
US7095033B2 (en) | 2004-04-27 | 2006-08-22 | Nicholas Sorge | Multi-sided die with authenticating characteristics and method for authenticating same |
US7394459B2 (en) | 2004-04-29 | 2008-07-01 | Microsoft Corporation | Interaction between objects and a virtual environment display |
US7399086B2 (en) | 2004-09-09 | 2008-07-15 | Jan Huewel | Image processing method and image processing device |
JP4489555B2 (en) | 2004-10-15 | 2010-06-23 | ビーエルデーオリエンタル株式会社 | Bowling game machine |
ES2308553T3 (en) | 2004-10-25 | 2008-12-01 | Koninklijke Philips Electronics N.V. | AUTONOMOUS WIRELESS DICE. |
US7727060B2 (en) | 2005-07-15 | 2010-06-01 | Maurice Mills | Land-based, on-line poker system |
US7911444B2 (en) | 2005-08-31 | 2011-03-22 | Microsoft Corporation | Input method for surface of interactive display |
US7599561B2 (en) | 2006-02-28 | 2009-10-06 | Microsoft Corporation | Compact interactive tabletop with projection-vision |
WO2007107874A2 (en) | 2006-03-22 | 2007-09-27 | Home Focus Development Ltd | Interactive playmat |
US8666366B2 (en) | 2007-06-22 | 2014-03-04 | Apple Inc. | Device activation and access |
US20080280682A1 (en) | 2007-05-08 | 2008-11-13 | Brunner Kevin P | Gaming system having a set of modular game units |
US20080278894A1 (en) | 2007-05-11 | 2008-11-13 | Miradia Inc. | Docking station for projection display applications |
US20090020947A1 (en) | 2007-07-17 | 2009-01-22 | Albers John H | Eight piece dissection puzzle |
US20090029754A1 (en) | 2007-07-23 | 2009-01-29 | Cybersports, Inc | Tracking and Interactive Simulation of Real Sports Equipment |
US20090124382A1 (en) | 2007-11-13 | 2009-05-14 | David Lachance | Interactive image projection system and method |
US8007110B2 (en) | 2007-12-28 | 2011-08-30 | Motorola Mobility, Inc. | Projector system employing depth perception to detect speaker position and gestures |
US8267524B2 (en) | 2008-01-18 | 2012-09-18 | Seiko Epson Corporation | Projection system and projector with widened projection of light for projection onto a close object |
US8167700B2 (en) | 2008-04-16 | 2012-05-01 | Universal Entertainment Corporation | Gaming device |
JP6043482B2 (en) | 2008-06-03 | 2016-12-14 | トウィードルテック リミテッド ライアビリティ カンパニー | Intelligent board game system, game piece, how to operate intelligent board game system, how to play intelligent board game |
US7967451B2 (en) | 2008-06-27 | 2011-06-28 | Microsoft Corporation | Multi-directional image displaying device |
JP5338166B2 (en) | 2008-07-16 | 2013-11-13 | ソニー株式会社 | Transmitting apparatus, stereoscopic image data transmitting method, receiving apparatus, and stereoscopic image data receiving method |
US9218116B2 (en) | 2008-07-25 | 2015-12-22 | Hrvoje Benko | Touch interaction with a curved display |
US20100035684A1 (en) | 2008-08-08 | 2010-02-11 | Bay Tek Games, Inc. | System and method for controlling movement of a plurality of game objects along a playfield |
US8226476B2 (en) | 2008-11-04 | 2012-07-24 | Quado Media Inc. | Multi-player, multi-screens, electronic gaming platform and system |
JP5282617B2 (en) | 2009-03-23 | 2013-09-04 | ソニー株式会社 | Information processing apparatus, information processing method, and information processing program |
US20110165923A1 (en) | 2010-01-04 | 2011-07-07 | Davis Mark L | Electronic circle game system |
US20110256927A1 (en) | 2009-03-25 | 2011-10-20 | MEP Games Inc. | Projection of interactive game environment |
US20100248839A1 (en) | 2009-03-25 | 2010-09-30 | MEP Games Inc. | Wirelessly distributed electronic circle gaming |
US8246467B2 (en) | 2009-04-29 | 2012-08-21 | Apple Inc. | Interactive gaming with co-located, networked direction and location aware devices |
US20100285881A1 (en) | 2009-05-07 | 2010-11-11 | Microsoft Corporation | Touch gesturing on multi-player game space |
JP5273478B2 (en) | 2009-07-07 | 2013-08-28 | ソニー株式会社 | Video display device and video display system |
US8421634B2 (en) | 2009-12-04 | 2013-04-16 | Microsoft Corporation | Sensing mechanical energy to appropriate the body for data input |
CN101776836B (en) | 2009-12-28 | 2013-08-07 | 武汉全真光电科技有限公司 | Projection display system and desktop computer |
US8491135B2 (en) | 2010-01-04 | 2013-07-23 | Microvision, Inc. | Interactive projection with gesture recognition |
US8751049B2 (en) | 2010-05-24 | 2014-06-10 | Massachusetts Institute Of Technology | Kinetic input/output |
US8388146B2 (en) | 2010-08-01 | 2013-03-05 | T-Mobile Usa, Inc. | Anamorphic projection device |
US20120223885A1 (en) | 2011-03-02 | 2012-09-06 | Microsoft Corporation | Immersive display experience |
US20130113975A1 (en) | 2011-11-04 | 2013-05-09 | Peter Gabris | Projector Image Correction Method and System |
US9316889B2 (en) | 2012-08-07 | 2016-04-19 | Nook Digital, Llc | Front projection eReader system |
US8933974B1 (en) | 2012-09-25 | 2015-01-13 | Rawles Llc | Dynamic accommodation of display medium tilt |
- 2017-01-24 US US15/414,617 patent/US9971458B2/en active Active
- 2018-05-15 US US15/980,638 patent/US10359888B2/en active Active
- 2019-07-23 US US16/519,593 patent/US10664105B2/en active Active
- 2020-05-26 US US16/883,972 patent/US10928958B2/en active Active
- 2021-02-23 US US17/182,909 patent/US11526238B2/en active Active
- 2022-12-13 US US18/080,678 patent/US20230115736A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070132721A1 (en) * | 2005-12-09 | 2007-06-14 | Edge 3 Technologies Llc | Three-Dimensional Virtual-Touch Human-Machine Interface System and Method Therefor |
US10928958B2 (en) * | 2009-03-25 | 2021-02-23 | Mep Tech, Inc. | Interactive environment with three-dimensional scanning |
US11526238B2 (en) * | 2009-03-25 | 2022-12-13 | Mep Tech, Inc. | Interactive environment with virtual environment space scanning |
US20110205341A1 (en) * | 2010-02-23 | 2011-08-25 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction. |
Also Published As
Publication number | Publication date |
---|---|
US20170235430A1 (en) | 2017-08-17 |
US20180260078A1 (en) | 2018-09-13 |
US10928958B2 (en) | 2021-02-23 |
US10359888B2 (en) | 2019-07-23 |
US9971458B2 (en) | 2018-05-15 |
US20190346968A1 (en) | 2019-11-14 |
US11526238B2 (en) | 2022-12-13 |
US10664105B2 (en) | 2020-05-26 |
US20200285346A1 (en) | 2020-09-10 |
US20210255728A1 (en) | 2021-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9550124B2 (en) | Projection of an interactive environment | |
US10596478B2 (en) | Head-mounted display for navigating a virtual environment | |
US9656168B1 (en) | Head-mounted display for navigating a virtual environment | |
KR100963238B1 (en) | Tabletop-Mobile augmented reality systems for individualization and co-working and Interacting methods using augmented reality | |
Stavness et al. | pCubee: a perspective-corrected handheld cubic display | |
US20220168660A1 (en) | Hands-free audio control device | |
JP2012161604A (en) | Spatially-correlated multi-display human-machine interface | |
US20230115736A1 (en) | Interactive environment with virtual environment space scanning | |
KR20200083762A (en) | A hologram-projection electronic board based on motion recognition | |
US11969666B2 (en) | Head-mounted display for navigating virtual and augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
AS | Assignment |
Owner name: MEP TECH, INC., UTAH Free format text: CHANGE OF NAME;ASSIGNOR:M.E.P. GAMES INC.;REEL/FRAME:063988/0585 Effective date: 20130122 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |