WO2008108965A1 - Virtual world user opinion & response monitoring - Google Patents
- Publication number
- WO2008108965A1 (PCT/US2008/002630)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- animated
- environment
- avatar
- virtual
- Prior art date
Links
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
-
- A63F13/12—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/23—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/61—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor using advertising information
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/70—Game security or game management aspects
- A63F13/79—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
- A63F13/795—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/105—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1081—Input via voice recognition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/40—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
- A63F2300/406—Transmission via wireless network, e.g. pager or GSM
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/40—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
- A63F2300/407—Data transfer via internet
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/55—Details of game data or player data management
- A63F2300/5506—Details of game data or player data management using advertisements
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6063—Methods for processing data by generating or executing the game program for sound processing
- A63F2300/6072—Methods for processing data by generating or executing the game program for sound processing of an input signal, e.g. pitch and rhythm extraction, voice recognition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/807—Role playing or strategy games
Definitions
- Example gaming platforms may be the Sony Playstation or Sony Playstation2 (PS2), each of which is sold in the form of a game console.
- the game console is designed to connect to a monitor (usually a television) and enable user interaction through handheld controllers.
- the game console is designed with specialized processing hardware, including a CPU, a graphics synthesizer for processing intensive graphics operations, a vector unit for performing geometry transformations, and other glue hardware, firmware, and software.
- the game console is further designed with an optical disc tray for receiving game compact discs for local play through the game console. Online gaming is also possible, where a user can interactively play against or with other users over the Internet.
- a virtual world is a simulated environment in which users may interact with each other via one or more computer processors. Users may appear on a video screen in the form of representations referred to as avatars.
- the degree of interaction between the avatars and the simulated environment is implemented by one or more computer applications that govern such interactions as simulated physics, exchange of information between users, and the like.
- the nature of interactions among users of the virtual world is often limited by the constraints of the system implementing the virtual world.
- the present invention fills these needs by providing computer generated graphics that depict a virtual world.
- the virtual world can be traveled, visited, and interacted with using a controller or controlling input of a real-world computer user.
- the real-world user in essence is playing a video game, in which he controls an avatar (e.g., virtual person) in the virtual environment.
- the real-world user can move the avatar, strike up conversations with other avatars, view content such as advertising, make comments or gestures about content or advertising.
- a method for enabling interactive control and monitoring of avatars in a computer generated virtual world environment includes generating a graphical animated environment and presenting a viewable object within the graphical animated environment. Further provided is moving an avatar within the graphical animated environment, where the moving includes directing a field of view of the avatar toward the viewable object. A response of the avatar is detected when the field of view of the avatar is directed toward the viewable object. The response is stored and analyzed to determine whether the response by the avatar is more positive or more negative.
- the viewable object is an advertisement.
- a computer implemented method for executing a network application is provided.
- the network application is defined to render a virtual environment, and the virtual environment is depicted by computer graphics.
- the method includes generating an animated user and controlling the animated user in the virtual environment.
- the method presents advertising objects in the virtual environment and detects actions by the animated user to determine if the animated user is viewing one of the advertising objects in the virtual environment. Reactions of the animated user are captured when the animated user is viewing the advertising object.
- the reactions by the animated user within the virtual environment are those that relate to the advertising object, and are presented to a third party to determine effectiveness of the advertising object in the virtual environment.
- a computer implemented method for executing a network application is provided.
- the network application is defined to render a virtual environment, and the virtual environment is depicted by computer graphics.
- the method includes generating an animated user and controlling the animated user in the virtual environment.
- the method presents advertising objects in the virtual environment and detects actions by the animated user to determine if the animated user is viewing one of the advertising objects in the virtual environment. Reactions of the animated user are captured when the animated user is viewing the advertising object.
- the reactions by the animated user within the virtual environment are those that relate to the advertising object, and are presented to a third party to determine if the reactions are more positive or more negative.
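The flow summarized above (present an object, detect when the avatar's field of view rests on it, capture the reaction, store it, and classify it as more positive or more negative) can be illustrated with a minimal Python sketch. This is not from the patent; all class and function names, and the expression vocabulary, are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

POSITIVE = {"laugh", "smile", "thumbs_up", "cool"}
NEGATIVE = {"frown", "eye_roll", "thumbs_down", "meh"}

@dataclass
class Response:
    avatar_id: str
    object_id: str       # e.g. an advertisement presented in the virtual space
    expression: str      # e.g. "laugh", "frown"
    duration_s: float    # how long the object stayed in the avatar's field of view

@dataclass
class ResponseLog:
    responses: List[Response] = field(default_factory=list)

    def store(self, response: Response) -> None:
        self.responses.append(response)

    def analyze(self, object_id: str) -> str:
        """Return whether stored responses to an object skew positive or negative."""
        relevant = [r for r in self.responses if r.object_id == object_id]
        score = sum((r.expression in POSITIVE) - (r.expression in NEGATIVE)
                    for r in relevant)
        return "more positive" if score >= 0 else "more negative"

# Example: two avatars react to the same advertisement object.
log = ResponseLog()
log.store(Response("user_a", "ad_101", "laugh", duration_s=4.2))
log.store(Response("user_b", "ad_101", "frown", duration_s=1.1))
print(log.analyze("ad_101"))   # ties break toward "more positive" in this sketch
```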
- Figure 1A illustrates a virtual space, in accordance with one embodiment of the present invention.
- Figure 1B illustrates a user sitting on a chair holding a controller and communicating with a game console, in accordance with one embodiment of the present invention.
- Figures 1C-1D illustrate a location profile for an avatar that is associated with a user of a game in which virtual space interactivity is provided, in accordance with one embodiment of the present invention.
- Figure 2 illustrates a more detailed diagram showing the monitoring of the real-world user for generating feedback, as described with reference to Figure 1A, in accordance with one embodiment of the present invention.
- Figure 3A illustrates an example where a controller and the buttons of the controller can be selected by a real-world user to cause the avatar's response to change, depending on the real-world user's approval or disapproval of an advertisement, in accordance with one embodiment of the present invention.
- Figure 3B illustrates operations that may be performed by a program in response to user operation of a controller for generating button responses of the avatar when viewing specific advertisements or objects, or things within the virtual space.
- Figures 4A-4C illustrate other controller buttons that may be selected from the left shoulder buttons and the right shoulder buttons to cause different selections that will express different feedback from an avatar, in accordance with one embodiment of the present invention.
- Figure 5 illustrates an embodiment where a virtual space may include a plurality of virtual world avatars, in accordance with one embodiment of the present invention.
- Figure 6 illustrates a flowchart diagram used to determine when to monitor user feedback, in accordance with one embodiment of the present invention.
- Figure 7A illustrates an example where user A is walking through the virtual space and is entering an active area, in accordance with one embodiment of the present invention.
- Figure 7B shows user A entering the active area, but having the field of view not focused on the screen, in accordance with one embodiment of the present invention.
- Figure 7C illustrates an example where user A is now focused in on a portion of the screen, in accordance with one embodiment of the present invention.
- Figure 7D illustrates an example where a user is fully viewing the screen and is within the active area, in accordance with one embodiment of the present invention.
- Figure 8A illustrates an embodiment where users within a virtual room may be prompted to vote as to their likes or dislikes, regarding a specific ad, in accordance with one embodiment of the present invention.
- Figure 8B shows users moving to different parts of the room to signal their approval or disapproval, or likes or dislikes, in accordance with one embodiment of the present invention.
- Figure 8C shows an example of users voting YES or NO by raising their left or right hands, in accordance with one embodiment of the present invention.
- Figure 9 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device, a console having controllers for implementing an avatar control system in accordance with one embodiment of the present invention.
- Figure 10 is a schematic of the Cell processor in accordance with one embodiment of the present invention.
- Embodiments of computer generated graphics that depict a virtual world are provided.
- the virtual world can be traveled, visited, and interacted with using a controller or controlling input of a real-world computer user.
- the real-world user in essence is playing a video game, in which he controls an avatar (e.g., virtual person) in the virtual environment.
- the real-world user can move the avatar, strike up conversations with other avatars, view content such as advertising, make comments or gestures about content or advertising.
- program instructions and processing are performed to monitor any comments, gestures, or interactions with objects of the virtual world.
- monitoring is performed upon obtaining permission from users, so that users have control of whether their actions are tracked.
- the user's experience in the virtual world may be enhanced, as the display and rendering of data for the user is more tailored to the user's likes and dislikes.
- advertisers will learn what specific users like, and their advertising can be adjusted for specific users or for types of users (e.g., teenagers, young adults, kids (using kid-rated environments), adults, and other demographics, types or classes).
- the information of user response to specific ads can also be provided directly to advertisers, game developers, logic engines, and suggestion engines. In this manner, advertisers will have a better handle on customer likes and dislikes, and may be better positioned to provide types of ads to specific users, and game/environment developers and owners can apply correct charges to advertisers based on use by users, selection by users, activity by users, reaction by users, viewing by users, etc.
- users may interact with a virtual world.
- virtual world means a representation of a real or fictitious environment having rules of interaction simulated by means of one or more processors that a real user may perceive via one or more display devices and/or may interact with via one or more user interfaces.
- user interface refers to a real device by which a user may send inputs to or receive outputs from the virtual world.
- the virtual world may be simulated by one or more processor modules. Multiple processor modules may be linked together via a network.
- the user may interact with the virtual world via a user interface device that can communicate with the processor modules and other user interface devices via a network.
- Certain aspects of the virtual world may be presented to the user in graphical form on a graphical display such as a computer monitor, television monitor or similar display. Certain other aspects of the virtual world may be presented to the user in audible form on a speaker, which may be associated with the graphical display.
- users may be represented by avatars. Each avatar within the virtual world may be uniquely associated with a different user.
- the name or pseudonym of a user may be displayed next to the avatar so that users may readily identify each other.
- a particular user's interactions with the virtual world may be represented by one or more corresponding actions of the avatar.
- Different users may interact with each other in the public space via their avatars.
- An avatar representing a user could have an appearance similar to that of a person, an animal or an object.
- An avatar in the form of a person may have the same gender as the user or a different gender. The avatar may be shown on the display so that the user can see the avatar along with other objects in the virtual world.
- the display may show the world from the point of view of the avatar without showing the avatar itself.
- the user's (or avatar's) perspective on the virtual world may be thought of as being the view of a virtual camera.
- a virtual camera refers to a point of view within the virtual world that may be used for rendering two-dimensional images of a 3D scene within the virtual world.
- Users may interact with each other through their avatars by means of the chat channels associated with each lobby.
- Users may enter text for chat with other users via their user interface.
- the text may then appear over or next to the user's avatar, e.g., in the form of comic-book style dialogue bubbles, sometimes referred to as chat bubbles.
- chat may be facilitated by the use of a canned phrase chat system sometimes referred to as quick chat.
- quick chat a user may select one or more chat phrases from a menu.
- the public spaces are public in the sense that they are not uniquely associated with any particular user or group of users and no user or group of users can exclude another user from the public space.
- Each private space by contrast, is associated with a particular user from among a plurality of users.
- a private space is private in the sense that the particular user associated with the private space may restrict access to the private space by other users.
- the private spaces may take on the appearance of familiar private real estate.
- Moving the avatar representation of user A 102b about the conceptual virtual space can be dictated by a user moving a controller of a game console and dictating movements of the avatar in different directions so as to virtually enter the various spaces of the conceptual virtual space.
- FIG. 1A illustrates a virtual space 100, in accordance with one embodiment of the present invention.
- the virtual space 100 is one where avatars are able to roam, congregate, and interact with one another and/or objects of a virtual space.
- Avatars are virtual world animated characters that represent or are correlated to a real-world user which may be playing an interactive game.
- the interactive game may require the real-world user to move his or her avatar about the virtual space 100 so as to enable interaction with other avatars (controlled by other real-world users or computer generated avatars), that may be present in selected spaces within the virtual space 100.
- the virtual space 100 shown in Figure 1A shows the avatars user A 102b and user B 104b.
- User A 102b is shown having a user field of view 102b' while user B 104b is shown having a user B field of view 104b'.
- user A and user B are in the virtual space 100, and focused on an advertisement 101.
- Advertisement 101 may include a model person that is holding up a particular advertising item (e.g., product, item, object, or other image of a product or service), and is displaying the advertising object in an animated video, still picture, or combinations thereof.
- advertisement 101 may portray a sexy model as model 101a, so as to attract users that may be roaming or traveling across the virtual space 100.
- Other techniques for attracting avatar users, as is commonly done in advertising may also be included as part of this embodiment.
- model 101a could be animated and could move about a screen within the virtual space 100 or can jump out of the virtual screen and join the avatars.
- user A and user B are shown tracking their viewing toward this particular advertisement 101.
- the field of view of each of the users can be tracked by a program executed by a computing system so as to determine where within the virtual space 100 the users are viewing, for what duration, what their gestures might be while viewing the advertisement 101, etc.
- Operation 106 is shown where processing is performed to determine whether users (e.g., avatars), are viewing advertisements via the avatars that define user A 102b and user B 104b.
- Operation 108 illustrates processing performed to detect and monitor real-world user feedback 108a and to monitor user-controlled avatar feedback 108b.
- real-world users can select specific keys on controllers so as to graphically illustrate their approval or disapproval in a graphical form using user-prompted thumbs up or user-prompted thumbs down 108c.
- the real-world response of a real-world user 102a playing a game can be monitored in a number of ways.
- the real-world user may be holding a controller while viewing a display screen.
- the display screen may provide two views.
- One view may be from the standpoint of the avatar, showing the avatar's field of view, while the other view may be from behind the avatar, as if the real-world user were floating behind the avatar (such as a view of the avatar from the avatar's backside).
- if the real-world user frowns, becomes excited, or makes facial expressions, these gestures, comments and/or facial expressions may be tracked by a camera that is part of a real-world system.
- the real-world system may be connected to a computing device (e.g., a game console or general computer(s)) and a camera that is interfaced with the game console.
- the camera in the real-world environment will track the real-world user's 102a facial expressions, sounds, frowns, or general excitement or non-excitement during the experience.
- the experience may be that of the avatar that is moving about the virtual space 100 and as viewed by the user in the real world.
- a process may be executed to collect real-world and avatar responses. If the real-world user 102a makes a gesture that is recognized by the camera, those gestures will be mapped to the face of the avatar. Consequently, real-world user facial expressions, movements, and actions, if tracked, can be interpreted and assigned to particular aspects of the avatar. If the real-world user laughs, the avatar laughs; if the real-world user jumps, the avatar jumps; if the real-world user gets angry, the avatar gets angry; if the real-world user moves a body part, the avatar moves a body part. Thus, in this embodiment, it is not necessary for the user to interface with a controller; the real-world user, by simply moving, reacting, etc., can cause the avatar to do the same as the avatar moves about the virtual spaces.
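A minimal sketch of that camera-to-avatar mirroring, assuming a vision subsystem hands over a gesture label string; the `Avatar` class, the `trigger` call, and the animation names are illustrative stand-ins, not engine APIs from the patent.

```python
class Avatar:
    def __init__(self, name: str):
        self.name = name

    def trigger(self, animation: str) -> None:
        # Stand-in for an engine call that plays a named animation on the avatar.
        print(f"{self.name} plays animation: {animation}")

# Recognized real-world gesture label -> avatar animation to mirror it.
GESTURE_TO_ANIMATION = {
    "laugh": "laugh",
    "frown": "frown",
    "jump": "jump",
    "angry": "angry",
}

def mirror_user_gesture(gesture_label: str, avatar: Avatar) -> None:
    """If the camera-recognized gesture is mapped, make the avatar do the same."""
    animation = GESTURE_TO_ANIMATION.get(gesture_label)
    if animation is not None:
        avatar.trigger(animation)

mirror_user_gesture("laugh", Avatar("user_a_avatar"))  # avatar laughs when the user laughs
```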
- the monitoring may be performed of user-controlled avatar feedback 108b.
- the real-world user can decide to select certain buttons on the controller to cause the avatar to display a response.
- the real-world user may select a button to cause the avatar to laugh, frown, look disgusted, or generally produce a facial and/or body response.
- this information can be fed back for analysis in operation 110.
- users will be asked to approve monitoring of their response, and if monitored, their experience may be enhanced, as the program and computing system(s) can provide an environment that shapes itself to the likes, dislikes, etc. of the specific users or types of users.
- the analysis is performed by a computing system(s) (e.g., networked, local, or over the internet), and controlled by software that is capable of determining the button(s) selected by the user and the visual avatar responses, or the responses captured by the camera or microphone of the user in the real world.
- Operation 112 is an optional operation that allows profile analysis to be accessed by the computing system in addition to the analysis obtained from the feedback in operation 110.
- Profile analysis 112 is an operation that allows the computing system to determine pre-defined likes, dislikes, geographic locations, languages, and other attributes of a particular user that may be visiting a virtual space 100. In this manner, in addition to monitoring what the avatars are looking at, their reaction and feedback, this additional information can be profiled and stored in a database so that data mining can be done and associated with the specific avatars viewing the content.
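One way the profile attributes of operation 112 could be stored alongside reaction data for later data mining is sketched below with an in-memory SQLite table; the schema, column names, and demographic bands are assumptions for illustration only.

```python
import sqlite3

# In-memory store associating avatar reactions with profile attributes for data mining.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE ad_feedback (
    avatar_id TEXT, ad_id TEXT, reaction TEXT,
    view_seconds REAL, region TEXT, age_band TEXT)""")
db.execute("INSERT INTO ad_feedback VALUES (?, ?, ?, ?, ?, ?)",
           ("user_a", "ad_101", "laugh", 4.2, "JP-Tokyo", "young_adult"))
db.execute("INSERT INTO ad_feedback VALUES (?, ?, ?, ?, ?, ?)",
           ("user_b", "ad_101", "frown", 1.1, "US-CA", "teen"))

# Example mining query: average viewing time and sample count per ad, per demographic band.
for row in db.execute("""SELECT ad_id, age_band, AVG(view_seconds), COUNT(*)
                         FROM ad_feedback GROUP BY ad_id, age_band"""):
    print(row)
```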
- Figure 1B illustrates user 102a sitting on a chair holding a controller and communicating with a game console, in accordance with one embodiment of the present invention.
- User 102a being the real-world user, will have the option of viewing his monitor and the images displayed on the monitor, in two modes.
- One mode may be from the eyes of the real-world user, showing the backside of user A 102b, which is the avatar.
- the user A 102b avatar has a field of view 102b' while user B has a field of view 104b'.
- the view will be a closer-up view showing what is within the field of view 102b' of the avatar 102b.
- the screen 101 will be a magnification of the model 101a which more clearly shows the view from the eyes of the avatar controlled by the real-world user.
- the real-world user can then switch between mode A (from behind the avatar), or mode B (from the eyes of the virtual user avatar).
- the gestures of the avatar as controlled by the real-world user will be tracked as well as the field of view and position of the eyes/head of the avatar within the virtual space 100.
- Figure 1C illustrates a location profile for an avatar that is associated with a user of a game in which virtual space interactivity is provided.
- a selection menu may be provided to allow the user to select a profile that will better define the user's interests and the types of locations and spaces that may be available to the user.
- the user may be provided with a location menu 116.
- Location menu 116 may be provided with a directory of countries that may be itemized by alphabetical order.
- the user would then select a particular country, such as Japan, and the user would then be provided a location sub-menu 118.
- Location sub-menu 118 may ask the user to define a state 118a, a province 118b, a region 118c, or a prefecture 118d, depending on the location selected. If the country that was selected was Japan, Japan is divided into prefectures 118d, which represent a type of state within the country of Japan. Then, the user would be provided with a selection of cities 120. Once the user has selected a particular city within a prefecture, such as Tokyo, Japan, the user would be provided with further menus to zero in on locations and virtual spaces that may be applicable to the user.
- Figure 1D illustrates a personal profile for the user and the avatar that would be representing the user in the virtual space.
- a personal profile menu 122 is provided.
- the personal profile menu 122 will list a plurality of options for the user to select based on the types of social definitions associated with the personal profile defined by the user.
- the social profile may include sports teams, sports e-play, entertainment, and other sub-categories within the social selection criteria.
- a sub-menu 124 may be selected when a user selects a professional men's sports team, and additional sub-menus 126 may define further aspects of motor sports.
- the examples illustrated in the personal profile menu 122 are only exemplary, and it should be understood that the granularity and variations in profile selection menu contents may change depending on the country selected by the user using the location menu 116 of Figure 1C, the sub-menus 118, and the city selector 120. In one embodiment, certain categories may be partially or completely filled based on the location profile defined by the user. For example, the Japanese location selection could load a plurality of baseball teams in the sports section that may include Japanese league teams (e.g., Nippon Baseball League) as opposed to U.S. based Major League Baseball (MLB™) teams.
- the personal profile menu 122 is a dynamic menu that is generated and displayed to the user with specific reference to the selections of the user and to where the user is located.
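The cascading location menu (country, then prefecture/state, then city) driving a region-seeded personal profile could look roughly like the following; the sample data and function name are assumptions for illustration only, not an exhaustive list from the patent.

```python
# Tiny assumed sample of the cascading location menu 116 / sub-menu 118 / city selector 120.
LOCATION_MENU = {
    "Japan": {"Tokyo (prefecture)": ["Tokyo", "Hachioji"],
              "Osaka (prefecture)": ["Osaka", "Sakai"]},
    "United States": {"California (state)": ["Los Angeles", "San Diego"]},
}

# Region-appropriate defaults used to pre-fill sports categories in the personal profile.
REGIONAL_SPORTS = {
    "Japan": ["Nippon Baseball League teams"],
    "United States": ["Major League Baseball teams"],
}

def build_personal_profile(country: str, region: str, city: str) -> dict:
    """Seed the personal profile menu 122 with defaults based on the chosen location."""
    assert city in LOCATION_MENU[country][region], "city must come from the selected region"
    return {"location": (country, region, city),
            "sports_options": REGIONAL_SPORTS.get(country, [])}

print(build_personal_profile("Japan", "Tokyo (prefecture)", "Tokyo"))
```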
- FIG. 2 illustrates a more detailed diagram showing the monitoring of the real-world user for generating feedback, as described with reference to 108a in Figure 1A.
- the real-world user 102a may be sitting on a chair holding a controller 208.
- the controller 208 can be wireless or wired to a computing device.
- the computing device can be a game console 200 or a general computer.
- the console 200 is capable of connecting to a network.
- the network may be a wide-area-network (or internet) which would allow some or all processing to be performed by a program running over the network.
- one or more servers will execute a game program that will render objects, users, animation, sounds, shading, textures, and other user interfaced reactions, or captures based on processing as performed on networked computers.
- user 102a holds controller 208, and movements of the controller and its buttons are captured in operation 214. The movement of the arms, hands, and buttons is captured, so as to capture motion of the controller 208, buttons of the controller 208, and six-axis dimensional rotation of the controller 208. Example six-axis positional monitoring may be done using an inertial monitor.
- movement of controller 208 may also be captured in terms of sound by a microphone 202, or by position, lighting, or other input feedback by a camera 204.
- Display 206 will render a display showing the virtual user avatar traversing the virtual world and the scenes of the virtual world, as controlled by user 102a.
- Operation 212 is configured to capture the sound processed for particular words by user 102a.
- the microphone 202 will be configured to capture that information so that the sound may be processed for identifying particular words or information. Voice recognition may also be performed to determine what is said in particular spaces, if users authorize capture.
- camera 204 will capture the gestures by the user 102a, movements by controller 208, facial expressions by user 102a, and general feedback excitement or non-excitement.
- Camera 204 will then provide capture of facial expressions, body language, and other information in operation 210. All of this information that is captured in operations 210, 212, and 214 can be provided as feedback for analysis 110, as described with reference to Figure 1A.
- the aggregated user opinion is processed in operation 218 that will organize and compartmentalize the aggregated responses by all users that may be traveling the virtual space and viewing the various advertisements within the virtual spaces that they enter. This information can then be parsed and provided in operation 220 to advertisers and operators of the virtual space.
- This information can provide guidance to the advertisers as to who is viewing the advertisements, how long they viewed the advertisements, the gestures made in front of the advertisements, and the gestures made about the advertisements, and will also provide operators of the virtual space a metric by which to possibly charge for the advertisements within the virtual spaces, depending on their popularity, views by particular users, and the like.
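A rough sketch of how per-user feedback events might be aggregated (operation 218) into per-ad metrics handed to advertisers and operators (operation 220); the event fields and the charging formula are assumptions, shown only to make the aggregation concrete.

```python
from collections import defaultdict

def aggregate(feedback_events):
    """feedback_events: iterable of dicts with keys ad_id, user_id, view_seconds, reaction."""
    metrics = defaultdict(lambda: {"viewers": set(), "total_view_seconds": 0.0,
                                   "reactions": defaultdict(int)})
    for ev in feedback_events:
        m = metrics[ev["ad_id"]]
        m["viewers"].add(ev["user_id"])
        m["total_view_seconds"] += ev["view_seconds"]
        m["reactions"][ev["reaction"]] += 1
    # One possible charging model: a fee proportional to unique viewers and dwell time.
    return {ad: {"unique_viewers": len(m["viewers"]),
                 "total_view_seconds": m["total_view_seconds"],
                 "reactions": dict(m["reactions"]),
                 "suggested_charge": 0.01 * len(m["viewers"]) + 0.001 * m["total_view_seconds"]}
            for ad, m in metrics.items()}

print(aggregate([
    {"ad_id": "ad_101", "user_id": "a", "view_seconds": 4.2, "reaction": "laugh"},
    {"ad_id": "ad_101", "user_id": "b", "view_seconds": 1.1, "reaction": "frown"},
]))
```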
- FIG. 3A illustrates an example where controller 208 and the buttons of the controller 208 can be selected by a real-world user to cause the avatar's response 300 to change, depending on the real-world user's approval or disapproval of advertisement 100.
- user controlled avatar feedback is monitored, depending on whether the avatar is viewing the advertisement 100 and the specific buttons selected on controller 208 when the avatar is focusing in on the advertisement 100.
- the real-world user that would be using controller 208 (not shown) could select R1 so that the avatar response 300 of the user's avatar is a laugh (HA HA HA!).
- buttons such as left shoulder buttons 208a and right shoulder buttons 208b can be used for similar controls of the avatar.
- the user can select L2 to smile, L1 to frown, R2 to roll eyes, and other buttons for producing other avatar responses 300.
- Figure 3B illustrates operations that may be performed by a program in response to user operation of a controller 208 for generating button responses of the avatar when viewing specific advertisements 100 or objects, or things within the virtual space.
- Operation 302 defines buttons that are mapped to avatar facial response to enable players to modify avatar expressions.
- Operation 304 defines detection of when a player (or a user) views a specific ad and the computing device would recognize that the user is viewing that specific ad.
- the avatar will have a field of view, and that field of view can be monitored, depending on where the avatar is looking within the virtual space.
- the computing system (and computer program controlling the computer system) is constantly monitoring the field of view of the avatars within the virtual space, it is possible to determine when the avatar is viewing specific ads. If a user is viewing a specific ad that should be monitored for facial responses, operation 306 will define monitoring of the avatar for changes in avatar facial expressions. If the user selects for the avatar to laugh at specific ads, this information will be taken in by the computing system and stored to define how specific avatars responded to specific ads.
- in operations 307 and 308, the facial expressions are continuously analyzed to determine all changes, and the information is fed back for analysis.
- the analysis performed on all the facial expressions can be off-line or in real time.
- This information can then be passed back to the system and then provided to specific advertisers to enable data mining and re-tailoring of specific ads in response to their performance in the virtual space.
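Operations 302 through 308 amount to a button-to-expression mapping plus a gate on whether a monitored ad is in the avatar's field of view. A minimal sketch follows; the button labels mirror the Figure 3A example, while the logging structure and function name are assumptions.

```python
from typing import Optional

# Operation 302: buttons mapped to avatar facial responses.
BUTTON_TO_EXPRESSION = {"R1": "laugh", "R2": "roll_eyes", "L1": "frown", "L2": "smile"}

# Operations 307/308: expressions fed back for analysis, keyed by avatar and ad.
expression_log = []  # list of (avatar_id, ad_id, expression) tuples

def on_button_press(avatar_id: str, button: str, viewed_ad_id: Optional[str]) -> Optional[str]:
    """Translate a button into an avatar expression and record it only when a
    monitored ad is within the avatar's field of view (operations 304-306)."""
    expression = BUTTON_TO_EXPRESSION.get(button)
    if expression is None:
        return None
    if viewed_ad_id is not None:
        expression_log.append((avatar_id, viewed_ad_id, expression))
    return expression

on_button_press("user_a", "R1", viewed_ad_id="ad_101")  # logged: user_a laughs at ad_101
on_button_press("user_a", "L1", viewed_ad_id=None)      # expression shown, nothing logged
print(expression_log)
```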
- Figure 4A illustrates other controller 208 buttons that may be selected from the left shoulder buttons 208a and the right shoulder buttons 208b to cause different selections that will express different feedback from an avatar. For instance, a user can scroll through the various sayings and then select the saying that they desire by pushing button R1. The user can also scroll through different emoticons to select a specific emoticon and then select button R2.
- FIG. 4B illustrates the avatar controlled by a real-world user who selects the saying "COOL" and then selects button R1.
- the real-world user can also select button R2 in addition to R1 to provide an emoticon together with a saying.
- the result is the avatar will smile and say "COOL”.
- the avatar saying "cool” can be displayed using a cloud or it could be output by the computer by a sound output.
- Figure 4C illustrates avatar 400a where the real-world user selected button R1 to select "MEH" plus L1 to illustrate a hand gesture.
- the result will be avatar 400b and 400c where the avatar is saying "MEH” in a cloud and is moving his hand to signal a MEH expression.
- the expression of MEH is an expression of indifference or lack of interest.
- the avatar can signal with a MEH and a hand gesture.
- Each of these expressions, whether they are sayings, emoticons, gestures, animations, and the like, are tracked if the user is viewing a specific advertisement and such information is captured so that the data can be provided to advertisers or the virtual world creators and/or operators.
- FIG. 5 illustrates an embodiment where a virtual space 500 may include a plurality of virtual world avatars.
- Each of the virtual world avatars will have their own specific field of view, and what they are viewing is tracked by the system. If the avatars are shown having discussions amongst themselves, that information is tracked to show that they are not viewing a specific advertisement, object, picture, trailer, or digital data that may be presented within the virtual space.
- a plurality of avatars are shown viewing a motion picture within a theater. Some avatars are not viewing the picture and thus, would not be tracked to determine their facial expressions. Users controlling their avatars can then move about the space and enter into locations where they may or may not be viewing a specific advertisement. Consequently, the viewer's motions, travels, field of views, and interactions can be monitored to determine whether the users are actively viewing advertisements, objects, or interacting with one another.
- FIG. 6 illustrates a flowchart diagram used to determine when to monitor user feedback.
- an active area needs to be defined.
- the active area can be defined as an area where an avatar feedback is monitored. Active area can be sized based on an advertisement size, ad placement, active areas can be overlapped, and the like. Once an active area is monitored for the field of view of the avatar, the viewing by the avatar can be logged as to how long the avatar views the advertisement, spends in a particular area in front of an advertisement, and the gestures made by the avatar when viewing the advertisement.
- Operation 600 shows a decision where avatar users are determined to be in or out of an active area. If the users are not in the active area, then operation 602 is performed where nothing is tracked of the avatar.
- that avatar user is tracked to determine if the user's field of view is focusing in on an advertisement within the space in operation 604. If the user is not focusing on any advertisement or object that should be tracked in operation 604, then the method moves to operation 608.
- operation 608 the system will continue monitoring the field of view of the user.
- the method will continuously move to operation 610 where it is determined whether the user's field of view is now on the ad. If it is not, the method moves back to operation 608. This loop will continue until, in operation 610 it is determined that the user is viewing the ad, and the user is within the active area. At that point, operation 606 will monitor feedback capture of the avatar when the avatar is within the active area, and the avatar has his or her field of view focused on the ad.
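The Figure 6 decision flow reduces to two gates: the avatar must be inside the active area and its field of view must rest on the ad before feedback capture begins. A minimal sketch, with simplified 2D geometry (circular active area, point gaze target) as an assumption; the operation numbers in the comments refer to Figure 6.

```python
from math import dist

def in_active_area(avatar_pos, area_center, area_radius) -> bool:
    """Operation 600: is the avatar inside the ad's active area?"""
    return dist(avatar_pos, area_center) <= area_radius

def viewing_ad(gaze_target, ad_pos, tolerance=1.0) -> bool:
    """Operations 604/610: is the avatar's field of view resting on the ad?"""
    return dist(gaze_target, ad_pos) <= tolerance

def should_capture_feedback(avatar_pos, gaze_target, ad_pos,
                            area_center, area_radius) -> bool:
    if not in_active_area(avatar_pos, area_center, area_radius):
        return False                          # operation 602: track nothing
    return viewing_ad(gaze_target, ad_pos)    # operations 606/608: capture only while viewing

# Avatar stands inside the active area and looks straight at the ad.
print(should_capture_feedback(avatar_pos=(1, 1), gaze_target=(5, 5),
                              ad_pos=(5, 5), area_center=(0, 0), area_radius=4))  # True
```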
- FIG. 7A illustrates an example where user A 704 is walking through the virtual space and is entering an active area 700.
- Active area 700 may be a specific room, a location within a room, or a specific space within the virtual space.
- user A 704 is walking in a direction of the active area 700 where three avatar users are viewing a screen or ad 702.
- the three users already viewing the screen 702 will attract others because they are already in the active area 700, and their field of view is focused on the screen that may be running an interesting ad for a service or product.
- Figure 7B shows user A 704 entering the active area 700, but having the field of view 706 not focused on the screen 702.
- the system will not monitor the facial expressions or bodily expressions, or verbal expressions by the avatar 704, because the avatar 704 is not focused in on the screen 702 where an ad may be running and user feedback of his expressions would be desired.
- Figure 7C illustrates an example where user A 704 is now focused in on a portion of the screen 710.
- the field of view 706 shows the user A's 704 field of view only focusing in on half of the screen 702. If the ad content is located in area 710, then the facial expressions and feedback provided by user A 704 will be captured. However, if the advertisement content is on the screen 702 in an area not covered by his field of view, that facial expression and feedback will not be monitored.
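The Figure 7C case, where only part of the screen falls inside the avatar's field of view, can be treated as a rectangle-overlap test: capture feedback only if enough of the ad region lies inside the field-of-view rectangle. A sketch under that assumption; the coordinate layout and the overlap threshold are illustrative.

```python
def overlap_area(a, b) -> float:
    """Overlapping area of two axis-aligned rectangles given as (x1, y1, x2, y2)."""
    w = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    h = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    return w * h

def ad_in_view(fov_rect, ad_rect, min_fraction=0.5) -> bool:
    """True when at least min_fraction of the ad area falls inside the field of view."""
    ad_area = (ad_rect[2] - ad_rect[0]) * (ad_rect[3] - ad_rect[1])
    return ad_area > 0 and overlap_area(fov_rect, ad_rect) / ad_area >= min_fraction

fov = (0, 0, 5, 10)       # avatar sees only the left half of a 10x10 screen
ad_left = (1, 1, 4, 4)    # ad content in the visible half (area 710) -> captured
ad_right = (6, 1, 9, 4)   # ad content on the unseen half -> not captured
print(ad_in_view(fov, ad_left), ad_in_view(fov, ad_right))  # True False
```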
- Figure 7D illustrates an example where user 704 is fully viewing the screen 702 and is within the active area 700.
- the system will continue monitoring the feedback from user A and will only discontinue feedback monitoring of user A when user A leaves the active area.
- monitoring of user A's feedback can also be discontinued if the particular advertisement shown on the screen 702 ends, or is no longer in session.
- Figure 8A illustrates an embodiment where users within a virtual room may be prompted to vote as to their likes or dislikes, regarding a specific ad 101.
- users 102b and 104b may move onto either a YES area or a NO area.
- User 102b' is now standing on YES and user 104b' is now standing on NO.
- This feedback is monitored, and is easily captured, as users can simply move to different locations within a scene to display their approval, disapproval, likes, dislikes, etc.
- Figure 8B shows users moving to different parts of the room to signal their approval or disapproval, or likes or dislikes.
- FIG. 8C shows an example of users 800-808 voting YES or NO by raising their left or right hands. These parts of the user avatar bodies can be moved by simply selecting the correct controller buttons (e.g., L1, R1, etc.).
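The voting embodiments of Figures 8A-8C infer a vote either from the floor area an avatar stands on or from which hand it raises. A small tally sketch follows; the mapping from signals to votes is an assumption chosen only to illustrate the idea.

```python
from collections import Counter

AREA_VOTE = {"YES_area": "yes", "NO_area": "no"}
HAND_VOTE = {"left_hand_raised": "yes", "right_hand_raised": "no"}

def tally(votes_by_avatar: dict) -> Counter:
    """votes_by_avatar maps avatar_id -> either a floor area or a raised-hand signal."""
    counts = Counter()
    for signal in votes_by_avatar.values():
        vote = AREA_VOTE.get(signal) or HAND_VOTE.get(signal)
        if vote:
            counts[vote] += 1
    return counts

print(tally({"user_102b": "YES_area", "user_104b": "NO_area",
             "user_800": "left_hand_raised"}))   # Counter({'yes': 2, 'no': 1})
```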
- the virtual world program may be executed partially on a server connected to the internet and partially on the local computer (e.g., game console, desktop, laptop, or wireless hand held device). Still further, the execution can be entirely on a remote server or processing machine, which provides the execution results to the local display screen.
- the local display or system should have minimal processing capabilities to receive the data over the network (e.g., like the Internet) and render the graphical data on the screen.
- FIG. 9 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device, a console having controllers for implementing an avatar control system in accordance with one embodiment of the present invention.
- a system unit 900 is provided, with various peripheral devices connectable to the system unit 900.
- the system unit 900 comprises: a Cell processor 928; a Rambus® dynamic random access memory (XDRAM) unit 926; a Reality Synthesizer graphics unit 930 with a dedicated video random access memory (VRAM) unit 932; and an I/O bridge 934.
- the system unit 900 also comprises a Blu-ray® Disk BD-ROM® optical disk reader 940 for reading from a disk 940a and a removable slot-in hard disk drive (HDD) 936, accessible through the I/O bridge 934.
- the system unit 900 also comprises a memory card reader 938 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 934.
- the I/O bridge 934 also connects to six Universal Serial Bus (USB) 2.0 ports 924; a gigabit Ethernet port 922; an IEEE 802.11b/g wireless network (Wi-Fi) port 920; and a Bluetooth® wireless link port 918 capable of supporting up to seven Bluetooth connections.
- the I/O bridge 934 handles all wireless, USB and Ethernet data, including data from one or more game controllers 902. For example when a user is playing a game, the I/O bridge 934 receives data from the game controller 902 via a Bluetooth link and directs it to the Cell processor 928, which updates the current state of the game accordingly.
- the wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 902, such as: a remote control 904; a keyboard 906; a mouse 908; a portable entertainment device 910 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 912; and a microphone headset 914.
- peripheral devices may therefore in principle be connected to the system unit 900 wirelessly; for example the portable entertainment device 910 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 914 may communicate via a Bluetooth link.
- Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
- in addition, a legacy memory card reader 916 may be connected to the system unit via a USB port 924, enabling the reading of memory cards 948 of the kind used by the Playstation® or Playstation 2® devices.
- the game controller 902 is operable to communicate wirelessly with the system unit 900 via the Bluetooth link.
- the game controller 902 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 902.
- the game controller is sensitive to motion in six degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands.
- other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller.
- additional game or control information may be provided on the screen of the device.
- Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
- the remote control 904 is also operable to communicate wirelessly with the system unit 900 via a Bluetooth link.
- the remote control 904 comprises controls suitable for the operation of the Blu-ray Disk BD-ROM reader 940 and for the navigation of disk content.
- the Blu-ray Disk BD-ROM reader 940 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs.
- the reader 940 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs.
- the reader 940 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional prerecorded and recordable Blu-Ray Disks.
- the system unit 900 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesizer graphics unit 930, through audio and video connectors to a display and sound output device 942 such as a monitor or television set having a display 944 and one or more loudspeakers 946.
- the audio connectors 950 may include conventional analogue and digital outputs whilst the video connectors 952 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
- Audio processing is performed by the Cell processor 928.
- the Playstation 3 device's operating system supports Dolby® 5.1 surround sound, DTS® surround sound, and the decoding of 7.1 surround sound from Blu-ray® disks.
- the video camera 912 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 900.
- the camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 900, for example to signify adverse lighting conditions.
- Embodiments of the video camera 912 may variously connect to the system unit 900 via a USB, Bluetooth or Wi-Fi communication port.
- Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data.
- the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
- a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 900
- an appropriate piece of software such as a device driver should be provided.
- Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described.
- FIG. 10 is a schematic of the Cell processor 928 in accordance with one embodiment of the present invention.
- the Cell processor 928 has an architecture comprising four basic components: external input and output structures comprising a memory controller 1060 and a dual bus interface controller 1070A,B; a main processor referred to as the Power Processing Element 1050; eight co-processors referred to as Synergistic Processing Elements (SPEs) 1010A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 1080.
- the total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPs of the Playstation 2 device's Emotion Engine.
- the Power Processing Element (PPE) 1050 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 1055 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache.
- the PPE 1050 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz.
- the primary role of the PPE 1050 is to act as a controller for the Synergistic Processing Elements 1010A-H, which handle most of the computational workload. In operation the PPE 1050 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 1010A-H and monitoring their progress. Consequently each Synergistic Processing Element 1010A-H runs a kernel whose role is to fetch a job, execute it and synchronize with the PPE 1050.
- Each Synergistic Processing Element (SPE) 1010A-H comprises a respective Synergistic Processing Unit (SPU) 1020A-H, and a respective Memory Flow Controller (MFC) 1040A-H comprising in turn a respective Direct Memory Access Controller (DMAC) 1042A-H, a respective Memory Management Unit (MMU) 1044A-H and a bus interface (not shown).
- Each SPU 1020A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB of local RAM 1030A-H, expandable in principle to 4 GB.
- Each SPE gives a theoretical 25.6 GFLOPS of single precision performance.
- An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation.
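The per-SPE figure of 25.6 GFLOPS quoted above is consistent with counting a fused multiply-add as two floating point operations on each of the four single-precision lanes; the multiply-add counting convention is an assumption, since it is not stated here:

$$4\ \text{lanes} \times 2\ \tfrac{\text{FLOPs}}{\text{lane}\cdot\text{cycle}} \times 3.2\ \text{GHz} = 25.6\ \text{GFLOPS}$$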
- the SPU 1020A-H does not directly access the system memory XDRAM 926; the 64-bit addresses formed by the SPU 1020A-H are passed to the MFC 1040A-H which instructs its DMA controller 1042A-H to access memory via the Element Interconnect Bus 1080 and the memory controller 1060.
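The indirection through the MFC amounts to a stage-in, compute, stage-out pattern against the 256 kB local store. The sketch below mimics that pattern in plain C, with memcpy standing in for real DMA transfers; dma_get and dma_put are hypothetical helpers, not Cell SDK calls.

```c
#include <string.h>

#define LOCAL_STORE_BYTES (256 * 1024)   /* size of an SPE's local RAM */
#define TILE 4096                        /* bytes staged per transfer  */

static unsigned char local_store[LOCAL_STORE_BYTES];

/* Hypothetical stand-ins for MFC/DMAC transfers between main memory and local store. */
static void dma_get(void *ls, const void *main_mem, size_t n) { memcpy(ls, main_mem, n); }
static void dma_put(void *main_mem, const void *ls, size_t n) { memcpy(main_mem, ls, n); }

/* Process a large buffer held in main memory one tile at a time. */
void process_buffer(unsigned char *main_mem, size_t len)
{
    for (size_t off = 0; off < len; off += TILE) {
        size_t n = (len - off < TILE) ? len - off : TILE;
        dma_get(local_store, main_mem + off, n);              /* stage in            */
        for (size_t i = 0; i < n; i++)                        /* compute on the tile */
            local_store[i] = (unsigned char)(local_store[i] ^ 0xFF);
        dma_put(main_mem + off, local_store, n);              /* stage out           */
    }
}
```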
- the Element Interconnect Bus (EIB) 1080 is a logically circular communication bus internal to the Cell processor 928 which connects the above processor elements, namely the PPE 1050, the memory controller 1060, the dual bus interface 1070A,B and the 8 SPEs 1010A-H, totaling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 1010A-H comprises a DMAC 1042A-H for scheduling longer read or write sequences.
- the EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction.
- the theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilization through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
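The quoted peak is simply the per-participant rate multiplied across the twelve slots:

$$12\ \text{slots} \times 8\ \tfrac{\text{bytes}}{\text{slot}\cdot\text{cycle}} \times 3.2\times 10^{9}\ \tfrac{\text{cycles}}{\text{s}} = 307.2\ \text{GB/s}$$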
- the memory controller 1060 comprises an XDRAM interface 1062, developed by Rambus Incorporated.
- the memory controller interfaces with the Rambus XDRAM 926 with a theoretical peak bandwidth of 25.6 GB/s.
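One consistent reading of the 25.6 GB/s figure, offered here only as a check and not stated in the text, is an effective 8-byte transfer per 3.2 GHz clock:

$$8\ \text{bytes} \times 3.2\times 10^{9}\ \text{s}^{-1} = 25.6\ \text{GB/s}$$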
- the dual bus interface 1070A,B comprises a Rambus FlexIO® system interface 1072A,B.
- the interface is organized into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 1070A and the Reality Simulator graphics unit 930 via controller 1070B.
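The quoted split implies a per-channel rate of roughly 5.2 GB/s, which reconciles the inbound, outbound and total figures (the per-channel rate itself is inferred, not stated):

$$7 \times 5.2\ \text{GB/s} = 36.4\ \text{GB/s (outbound)},\qquad 5 \times 5.2\ \text{GB/s} = 26\ \text{GB/s (inbound)},\qquad 36.4 + 26 = 62.4\ \text{GB/s}$$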
- Data sent by the Cell processor 928 to the Reality Simulator graphics unit 930 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
- Embodiments may include capturing depth data to better identify the real- world user and to direct activity of an avatar or scene.
- the object can be something the person is holding, or it can be the person's hand.
- the terms "depth camera" and "three-dimensional camera" refer to any camera that is capable of obtaining distance or depth information as well as two-dimensional pixel information.
- a depth camera can utilize controlled infrared lighting to obtain distance information.
- Another exemplary depth camera can be a stereo camera pair, which triangulates distance information using two standard cameras.
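For the stereo-pair case, depth is typically recovered with the standard pinhole triangulation relation (a textbook formula, not recited in the text itself):

$$Z = \frac{f \cdot B}{d}$$

where $f$ is the focal length, $B$ the baseline between the two cameras, and $d$ the pixel disparity of a matched point.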
- depth sensing device refers to any type of device that is capable of obtaining distance information as well as two-dimensional pixel information.
- new "depth cameras" provide the ability to capture and map the third dimension in addition to normal two-dimensional video imagery. With the new depth data, embodiments of the present invention allow the placement of computer-generated objects in various positions within a video scene in real-time, including behind other objects.
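Placing a computer-generated object 'behind' real objects reduces, at its simplest, to a per-pixel comparison between the camera's depth map and the object's depth. The sketch below illustrates that idea only; the pixel layout, the mask and the float depth buffers are assumptions made for the example.

```c
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;   /* assumed pixel layout */

/*
 * Composite a computer-generated object into a captured frame.
 * The object is drawn only where it is nearer to the camera than the
 * real scene, so it can also appear partially hidden behind real objects.
 */
void composite(rgb_t *frame, const float *scene_depth,        /* from the depth camera     */
               const rgb_t *obj, const float *obj_depth,      /* rendered CG object        */
               const uint8_t *obj_mask,                       /* 1 where the object exists */
               int width, int height)
{
    for (int i = 0; i < width * height; i++) {
        if (obj_mask[i] && obj_depth[i] < scene_depth[i])
            frame[i] = obj[i];            /* object occludes the real scene here */
        /* otherwise the real pixel stays, occluding the object */
    }
}
```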
- embodiments of the present invention provide real-time interactive gaming experiences for users.
- users can interact with various computer-generated objects in real-time.
- video scenes can be altered in real-time to enhance the user's game experience.
- computer generated costumes can be inserted over the user's clothing, and computer generated light sources can be utilized to project virtual shadows within a video scene.
- a depth camera captures two-dimensional data for a plurality of pixels that comprise the video image. These values are color values for the pixels, generally red, green, and blue (RGB) values for each pixel. In this manner, objects captured by the camera appear as two-dimensional objects on a monitor.
- Embodiments of the present invention also contemplate distributed image processing configurations.
- the invention is not limited to the captured image and display image processing taking place in one or even two locations, such as in the CPU or in the CPU and one other element.
- the input image processing can just as readily take place in an associated CPU, processor or device that can perform processing; essentially all of the image processing can be distributed throughout the interconnected system.
- the present invention is not limited to any specific image processing hardware circuitry and/or software.
- the embodiments described herein are also not limited to any specific combination of general hardware circuitry and/or software, nor to any particular source for the instructions executed by processing components.
- the invention may employ various computer-implemented operations involving data stored in computer systems. These operations include operations requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms, such as producing, identifying, determining, or comparing.
- the above described invention may be practiced with other computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like.
- the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
- the invention can also be embodied as computer readable code on a computer readable medium.
- the computer readable medium is any data storage device that can store data which can be thereafter read by a computer system, including an electromagnetic wave carrier. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD- ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices.
- the computer readable medium can also be distributed over a network coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Marketing (AREA)
- Economics (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Finance (AREA)
- Theoretical Computer Science (AREA)
- Strategic Management (AREA)
- Development Economics (AREA)
- Accounting & Taxation (AREA)
- Processing Or Creating Images (AREA)
- Information Transfer Between Computers (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Telephonic Communication Services (AREA)
- Storage Device Security (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP08726207A EP2126708A4 (en) | 2007-03-01 | 2008-02-27 | MONITORING USER COMMENTS AND REACTIONS IN A VIRTUAL WORLD |
JP2009551722A JP2010535362A (ja) | 2007-03-01 | 2008-02-27 | 仮想世界のユーザの意見および反応のモニタリング |
Applications Claiming Priority (14)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US89239707P | 2007-03-01 | 2007-03-01 | |
GBGB0703974.6A GB0703974D0 (en) | 2007-03-01 | 2007-03-01 | Entertainment device |
US60/892,397 | 2007-03-01 | ||
GB0703974.6 | 2007-03-01 | ||
GB0704246.8 | 2007-03-05 | ||
GB0704235.1 | 2007-03-05 | ||
GB0704225A GB2447094B (en) | 2007-03-01 | 2007-03-05 | Entertainment device and method |
GB0704227.8 | 2007-03-05 | ||
GB0704246A GB2447096B (en) | 2007-03-01 | 2007-03-05 | Entertainment device and method |
GB0704235A GB2447095B (en) | 2007-03-01 | 2007-03-05 | Entertainment device and method |
GB0704225.2 | 2007-03-05 | ||
GB0704227A GB2447020A (en) | 2007-03-01 | 2007-03-05 | Transmitting game data from an entertainment device and rendering that data in a virtual environment of a second entertainment device |
US11/789,326 | 2007-04-23 | ||
US11/789,326 US20080215975A1 (en) | 2007-03-01 | 2007-04-23 | Virtual world user opinion & response monitoring |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008108965A1 true WO2008108965A1 (en) | 2008-09-12 |
Family
ID=39738577
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2008/002630 WO2008108965A1 (en) | 2007-03-01 | 2008-02-27 | Virtual world user opinion & response monitoring |
Country Status (3)
Country | Link |
---|---|
EP (4) | EP2132650A4 (ja) |
JP (5) | JP2010533006A (ja) |
WO (1) | WO2008108965A1 (ja) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010176323A (ja) * | 2009-01-28 | 2010-08-12 | Nintendo Co Ltd | プログラム、情報処理装置、および情報処理システム |
EP2350967A2 (en) * | 2008-10-16 | 2011-08-03 | Nc Interactive, Inc. | Interactive network game and methods thereof |
EP2350968A2 (en) * | 2008-10-15 | 2011-08-03 | Nc Interactive, Inc. | Interactive network game and methods thereof |
US20120192088A1 (en) * | 2011-01-20 | 2012-07-26 | Avaya Inc. | Method and system for physical mapping in a virtual world |
US9199171B2 (en) | 2009-01-28 | 2015-12-01 | Nintendo Co., Ltd. | Information processing system relating to content distribution, storage medium for storing program directed thereto, and information processing device |
EP3015145A1 (en) * | 2014-10-31 | 2016-05-04 | Samsung Electronics Co., Ltd. | Device and method of managing user information based on image |
US9415302B2 (en) | 2009-01-28 | 2016-08-16 | Nintendo Co., Ltd. | Storage medium for storing program capable of improving degree of freedom and effect of content provided by sponsor and information processing device |
US9492754B2 (en) | 2009-01-28 | 2016-11-15 | Nintendo Co., Ltd. | Method, system, and storage medium for displaying distributed media content in a calendar screen |
US9786125B2 (en) * | 2015-06-17 | 2017-10-10 | Facebook, Inc. | Determining appearances of objects in a virtual world based on sponsorship of object appearances |
US10339592B2 (en) | 2015-06-17 | 2019-07-02 | Facebook, Inc. | Configuring a virtual store based on information associated with a user by an online system |
US10489795B2 (en) | 2007-04-23 | 2019-11-26 | The Nielsen Company (Us), Llc | Determining relative effectiveness of media content items |
US10515474B2 (en) | 2017-01-19 | 2019-12-24 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression in a virtual reality system |
US10521014B2 (en) | 2017-01-19 | 2019-12-31 | Mindmaze Holding Sa | Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location in at least one of a virtual and augmented reality system |
US10861056B2 (en) | 2015-06-17 | 2020-12-08 | Facebook, Inc. | Placing locations in a virtual world |
US10916057B2 (en) | 2014-09-11 | 2021-02-09 | Nokia Technologies Oy | Method, apparatus and computer program for displaying an image of a real world object in a virtual reality enviroment |
US10943100B2 (en) | 2017-01-19 | 2021-03-09 | Mindmaze Holding Sa | Systems, methods, devices and apparatuses for detecting facial expression |
US11328533B1 (en) | 2018-01-09 | 2022-05-10 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression for motion capture |
US11651562B2 (en) | 2019-12-30 | 2023-05-16 | Tmrw Foundation Ip S. À R.L. | Method and system for enabling enhanced user-to-user communication in digital realities |
US11991344B2 (en) | 2017-02-07 | 2024-05-21 | Mindmaze Group Sa | Systems, methods and apparatuses for stereo vision and tracking |
Families Citing this family (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8397168B2 (en) | 2008-04-05 | 2013-03-12 | Social Communications Company | Interfacing with a spatial virtual communication environment |
US8407605B2 (en) | 2009-04-03 | 2013-03-26 | Social Communications Company | Application sharing |
US7769806B2 (en) | 2007-10-24 | 2010-08-03 | Social Communications Company | Automated real-time data stream switching in a shared virtual area communication environment |
EP2279472A4 (en) | 2008-04-05 | 2013-11-20 | Social Communications Co | APPARATUS AND METHODS BASED ON A SHARED VIRTUAL SPACE COMMUNICATION ENVIRONMENT |
US9853922B2 (en) | 2012-02-24 | 2017-12-26 | Sococo, Inc. | Virtual area communications |
US9542010B2 (en) * | 2009-09-15 | 2017-01-10 | Palo Alto Research Center Incorporated | System for interacting with objects in a virtual environment |
EP2485131A4 (en) * | 2009-09-30 | 2013-03-06 | Rakuten Inc | OBJECT TRANSFER PROCESS FOR ONE WEBSITE |
CN106964150B (zh) * | 2011-02-11 | 2021-03-02 | 漳州市爵晟电子科技有限公司 | 一种动作定位点控制系统及其穿套式定点控制设备 |
US20120277001A1 (en) * | 2011-04-28 | 2012-11-01 | Microsoft Corporation | Manual and Camera-based Game Control |
JP2013003778A (ja) * | 2011-06-15 | 2013-01-07 | Forum8 Co Ltd | 3次元空間情報処理システム、3次元空間情報処理端末、3次元空間情報処理サーバ、3次元空間情報処理端末プログラム、3次元空間情報処理サーバプログラム、及び3次元空間情報処理方法 |
WO2013181026A1 (en) | 2012-06-02 | 2013-12-05 | Social Communications Company | Interfacing with a spatial virtual communications environment |
CN104516618B (zh) * | 2013-09-27 | 2020-01-14 | 中兴通讯股份有限公司 | 界面功能解析显示方法及装置 |
JP6091407B2 (ja) * | 2013-12-18 | 2017-03-08 | 三菱電機株式会社 | ジェスチャ登録装置 |
WO2015163913A1 (en) * | 2014-04-25 | 2015-10-29 | Nokia Technologies Oy | Interaction between virtual reality entities and real entities |
CN109074397B (zh) | 2016-05-06 | 2022-04-15 | 索尼公司 | 信息处理系统和信息处理方法 |
JP6263252B1 (ja) * | 2016-12-06 | 2018-01-17 | 株式会社コロプラ | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム |
WO2018139203A1 (ja) | 2017-01-26 | 2018-08-02 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP6821461B2 (ja) * | 2017-02-08 | 2021-01-27 | 株式会社コロプラ | 仮想空間を介して通信するためにコンピュータで実行される方法、当該方法をコンピュータに実行させるプログラム、および、情報制御装置 |
CN110313019B (zh) | 2017-02-24 | 2023-07-04 | 索尼公司 | 信息处理设备、信息处理方法和计算机可读介质 |
JP6651479B2 (ja) * | 2017-03-16 | 2020-02-19 | 株式会社コロプラ | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム |
JP7308573B2 (ja) * | 2018-05-24 | 2023-07-14 | 株式会社ユピテル | システム及びプログラム等 |
JP7302956B2 (ja) * | 2018-09-19 | 2023-07-04 | 株式会社バンダイナムコエンターテインメント | コンピュータシステム、ゲームシステム及びプログラム |
JP2019130295A (ja) * | 2018-12-28 | 2019-08-08 | ノキア テクノロジーズ オサケユイチア | 仮想現実エンティティと現実エンティティの間のインタラクション |
JP7323315B2 (ja) | 2019-03-27 | 2023-08-08 | 株式会社コーエーテクモゲームス | 情報処理装置、情報処理方法及びプログラム |
WO2021033261A1 (ja) * | 2019-08-20 | 2021-02-25 | 日本たばこ産業株式会社 | コミュニケーション支援方法、プログラムおよびコミュニケーションサーバ |
WO2021033254A1 (ja) * | 2019-08-20 | 2021-02-25 | 日本たばこ産業株式会社 | コミュニケーション支援方法、プログラムおよびコミュニケーションサーバ |
US11080930B2 (en) * | 2019-10-23 | 2021-08-03 | Skonec Entertainment Co., Ltd. | Virtual reality control system |
JP2020146469A (ja) * | 2020-04-20 | 2020-09-17 | 株式会社トプコン | 眼科検査システム及び眼科検査装置 |
JP6932224B1 (ja) | 2020-06-01 | 2021-09-08 | 株式会社電通 | 広告表示システム |
JP7254112B2 (ja) * | 2021-03-19 | 2023-04-07 | 本田技研工業株式会社 | 仮想体験提供装置、仮想体験提供方法、及びプログラム |
WO2023281755A1 (ja) * | 2021-07-09 | 2023-01-12 | シャープNecディスプレイソリューションズ株式会社 | 表示制御装置、表示制御方法、及びプログラム |
JPWO2023068067A1 (ja) * | 2021-10-18 | 2023-04-27 | ||
JPWO2023149255A1 (ja) * | 2022-02-02 | 2023-08-10 | ||
KR20230173481A (ko) * | 2022-06-17 | 2023-12-27 | 주식회사 메타캠프 | 다채널 구조 및 채널 동기화를 위한 메타버스 서비스장치 및 그 장치의 구동방법 |
WO2024004609A1 (ja) * | 2022-06-28 | 2024-01-04 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、および記録媒体 |
JP2024135824A (ja) | 2023-03-23 | 2024-10-04 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP7527430B1 (ja) * | 2023-03-29 | 2024-08-02 | 株式会社バンダイ | プログラム及び情報処理装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010018658A1 (en) * | 2000-02-26 | 2001-08-30 | Kim Jong Min | System for obtaining information based on communication of users |
US20050143138A1 (en) * | 2003-09-05 | 2005-06-30 | Samsung Electronics Co., Ltd. | Proactive user interface including emotional agent |
US20060184355A1 (en) * | 2003-03-25 | 2006-08-17 | Daniel Ballin | Behavioural translator for an object |
Family Cites Families (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5734373A (en) * | 1993-07-16 | 1998-03-31 | Immersion Human Interface Corporation | Method and apparatus for controlling force feedback interface systems utilizing a host computer |
CA2141144A1 (en) * | 1994-03-31 | 1995-10-01 | Joseph Desimone | Electronic game utilizing bio-signals |
GB9505916D0 (en) * | 1995-03-23 | 1995-05-10 | Norton John M | Controller |
JP3091135B2 (ja) * | 1995-05-26 | 2000-09-25 | 株式会社バンダイ | ゲーム装置 |
US5823879A (en) * | 1996-01-19 | 1998-10-20 | Sheldon F. Goldberg | Network gaming system |
JP3274603B2 (ja) * | 1996-04-18 | 2002-04-15 | エヌイーシーソフト株式会社 | 音声集計システムおよび音声集計方法 |
JP3975511B2 (ja) * | 1997-07-25 | 2007-09-12 | 富士通株式会社 | パーソナル通信分散制御方式 |
JP3757584B2 (ja) * | 1997-11-20 | 2006-03-22 | 株式会社富士通ゼネラル | 広告効果確認システム |
JP3276068B2 (ja) * | 1997-11-28 | 2002-04-22 | インターナショナル・ビジネス・マシーンズ・コーポレーション | オブジェクトの選択方法およびそのシステム |
US6195104B1 (en) * | 1997-12-23 | 2001-02-27 | Philips Electronics North America Corp. | System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs |
JP2000187435A (ja) * | 1998-12-24 | 2000-07-04 | Sony Corp | 情報処理装置、携帯機器、電子ペット装置、情報処理手順を記録した記録媒体及び情報処理方法 |
JP2000311251A (ja) * | 1999-02-26 | 2000-11-07 | Toshiba Corp | アニメーション作成装置および方法、記憶媒体 |
AU4473000A (en) * | 1999-04-20 | 2000-11-02 | John Warren Stringer | Human gestural input device with motion and pressure |
JP4034002B2 (ja) * | 1999-04-22 | 2008-01-16 | 三菱電機株式会社 | 分散仮想空間情報管理伝送方法 |
AU5012300A (en) * | 1999-05-14 | 2000-12-05 | Graphic Gems | Method and apparatus for registering lots in a shared virtual world |
AU5012600A (en) * | 1999-05-14 | 2000-12-05 | Graphic Gems | Method and apparatus for a multi-owner, three-dimensional virtual world |
JP2000325653A (ja) * | 1999-05-19 | 2000-11-28 | Enix Corp | 携帯型ビデオゲーム装置およびプログラムを格納した記録媒体 |
JP2001154966A (ja) * | 1999-11-29 | 2001-06-08 | Sony Corp | コンピュータ・ネットワーク上で構築・提供される共有仮想空間上で複数ユーザが参加可能な仮想会話を支援する会話支援システム及び会話支援方法、並びに、プログラム記憶媒体 |
JP2001153663A (ja) * | 1999-11-29 | 2001-06-08 | Canon Inc | 物体の移動方向判別装置及びそれを備える撮影装置、ナビゲーションシステム、サスペンションシステム、ゲームシステム並びにリモートコントローラシステム |
JP3623415B2 (ja) * | 1999-12-02 | 2005-02-23 | 日本電信電話株式会社 | 仮想空間通信システムにおけるアバタ表示装置、アバタ表示方法および記憶媒体 |
JP2001236290A (ja) * | 2000-02-22 | 2001-08-31 | Toshinao Komuro | アバタを利用したコミュニケーション・システム |
JP2001325501A (ja) * | 2000-03-10 | 2001-11-22 | Heart Gift:Kk | オンラインギフト方法 |
JP3458090B2 (ja) * | 2000-03-15 | 2003-10-20 | コナミ株式会社 | メッセージ交換機能を備えたゲームシステム、そのゲームシステムで使用するゲーム装置、メッセージ交換システム、およびコンピュータ読取可能な記憶媒体 |
JP2001321568A (ja) * | 2000-05-18 | 2001-11-20 | Casio Comput Co Ltd | ゲーム装置、ゲーム方法及び情報記録媒体 |
TWI221574B (en) * | 2000-09-13 | 2004-10-01 | Agi Inc | Sentiment sensing method, perception generation method and device thereof and software |
JP2002136762A (ja) * | 2000-11-02 | 2002-05-14 | Taito Corp | 潜在映像を用いたアドベンチャーゲーム |
JP3641423B2 (ja) * | 2000-11-17 | 2005-04-20 | Necインフロンティア株式会社 | 広告情報提供システム |
AU2002219857A1 (en) * | 2000-11-27 | 2002-06-03 | Butterfly.Net, Inc. | System and method for synthesizing environments to facilitate distributed, context-sensitive, multi-user interactive applications |
EP1216733A3 (en) * | 2000-12-20 | 2004-09-08 | Aruze Co., Ltd. | Server providing competitive game service, program storage medium for use in the server, and method of providing competitive game service using the server |
JP2002197376A (ja) * | 2000-12-27 | 2002-07-12 | Fujitsu Ltd | ユーザに応じてカストマイズされた仮想世界を提供する方法および装置 |
JP4613295B2 (ja) * | 2001-02-16 | 2011-01-12 | 株式会社アートディンク | 仮想現実再生装置 |
JP2005500912A (ja) * | 2001-02-27 | 2005-01-13 | アンソロトロニックス インコーポレイテッド | ロボット装置および無線通信システム |
US7667705B2 (en) * | 2001-05-15 | 2010-02-23 | Nintendo Of America Inc. | System and method for controlling animation by tagging objects within a game environment |
JP4068542B2 (ja) * | 2001-05-18 | 2008-03-26 | 株式会社ソニー・コンピュータエンタテインメント | エンタテインメントシステム、通信プログラム、通信プログラムを格納したコンピュータ読み取り可能な記録媒体、及び通信方法 |
JP3425562B2 (ja) * | 2001-07-12 | 2003-07-14 | コナミ株式会社 | キャラクタ操作プログラム、キャラクタ操作方法及びビデオゲーム装置 |
JP3732168B2 (ja) * | 2001-12-18 | 2006-01-05 | 株式会社ソニー・コンピュータエンタテインメント | 仮想世界におけるオブジェクトの表示装置、表示システム及び表示方法、ならびにそれらを利用可能な仮想世界における地価及び広告料の設定方法 |
JP2003210834A (ja) * | 2002-01-17 | 2003-07-29 | Namco Ltd | 制御情報、情報記憶媒体、およびゲーム装置 |
JP2003259331A (ja) * | 2002-03-06 | 2003-09-12 | Nippon Telegraph & Telephone West Corp | 3次元コンテンツ配信装置、3次元コンテンツ配信プログラム、プログラム記録媒体、及び3次元コンテンツ配信方法 |
JP2003324522A (ja) * | 2002-05-02 | 2003-11-14 | Nippon Telegr & Teleph Corp <Ntt> | Ip/pstn統合制御装置、通信方法、プログラムおよび記録媒体 |
JP2004021606A (ja) * | 2002-06-17 | 2004-01-22 | Nec Corp | 仮想空間提供サーバを用いたインターネットサービス提供システム |
JP2004046311A (ja) * | 2002-07-09 | 2004-02-12 | Nippon Telegr & Teleph Corp <Ntt> | 3次元仮想空間におけるジェスチャ入力方法およびその装置 |
US20040029625A1 (en) * | 2002-08-07 | 2004-02-12 | Ed Annunziata | Group behavioral modification using external stimuli |
US20060158515A1 (en) * | 2002-11-07 | 2006-07-20 | Sorensen Christopher D | Adaptive motion detection interface and motion detector |
JP3952396B2 (ja) * | 2002-11-20 | 2007-08-01 | 任天堂株式会社 | ゲーム装置および情報処理装置 |
JP2004237022A (ja) * | 2002-12-11 | 2004-08-26 | Sony Corp | 情報処理装置および方法、プログラム、並びに記録媒体 |
JP3961419B2 (ja) * | 2002-12-27 | 2007-08-22 | 株式会社バンダイナムコゲームス | ゲーム装置、ゲーム制御プログラムおよびそのプログラムが記録された記録媒体 |
JP4442117B2 (ja) * | 2003-05-27 | 2010-03-31 | ソニー株式会社 | 情報登録方法、情報登録装置、および情報登録プログラム |
JP2005100053A (ja) * | 2003-09-24 | 2005-04-14 | Nomura Research Institute Ltd | アバター情報送受信方法、プログラム及び装置 |
JP2005216004A (ja) * | 2004-01-29 | 2005-08-11 | Tama Tlo Kk | プログラムおよび通信方法 |
JP4559092B2 (ja) * | 2004-01-30 | 2010-10-06 | 株式会社エヌ・ティ・ティ・ドコモ | 携帯通信端末及びプログラム |
US20060013254A1 (en) * | 2004-06-07 | 2006-01-19 | Oded Shmueli | System and method for routing communication through various communication channel types |
JP2006034436A (ja) * | 2004-07-23 | 2006-02-09 | Smk Corp | エキササイズ器具を用いたバーチャルゲームシステム |
CA2582548A1 (en) * | 2004-10-08 | 2006-04-20 | Sonus Networks, Inc. | Common telephony services to multiple devices associated with multiple networks |
US20090005167A1 (en) * | 2004-11-29 | 2009-01-01 | Juha Arrasvuori | Mobile Gaming with External Devices in Single and Multiplayer Games |
JP2006186893A (ja) * | 2004-12-28 | 2006-07-13 | Matsushita Electric Ind Co Ltd | 音声対話制御装置 |
JP2006185252A (ja) * | 2004-12-28 | 2006-07-13 | Univ Of Electro-Communications | インタフェース装置 |
JP2006211005A (ja) * | 2005-01-25 | 2006-08-10 | Takashi Uchiyama | テレビ電話広告システム |
JPWO2006080080A1 (ja) * | 2005-01-28 | 2008-06-19 | 富士通株式会社 | 電話管理システム、電話管理方法、および電話管理プログラム |
JP4322833B2 (ja) * | 2005-03-16 | 2009-09-02 | 株式会社東芝 | 無線通信システム |
EP1878013B1 (en) * | 2005-05-05 | 2010-12-15 | Sony Computer Entertainment Inc. | Video game control with joystick |
US20060252538A1 (en) * | 2005-05-05 | 2006-11-09 | Electronic Arts Inc. | Analog stick input replacement for lengthy button push sequences and intuitive input for effecting character actions |
JP2006004421A (ja) * | 2005-06-03 | 2006-01-05 | Sony Corp | データ処理装置 |
US20070002835A1 (en) * | 2005-07-01 | 2007-01-04 | Microsoft Corporation | Edge-based communication |
- 2008
- 2008-02-26 EP EP08730776A patent/EP2132650A4/en not_active Ceased
- 2008-02-26 JP JP2009551806A patent/JP2010533006A/ja active Pending
- 2008-02-27 JP JP2009551726A patent/JP2010535363A/ja active Pending
- 2008-02-27 WO PCT/US2008/002630 patent/WO2008108965A1/en active Application Filing
- 2008-02-27 EP EP08726220A patent/EP2118840A4/en not_active Withdrawn
- 2008-02-27 EP EP08726219A patent/EP2118757A4/en not_active Withdrawn
- 2008-02-27 EP EP08726207A patent/EP2126708A4/en not_active Ceased
- 2008-02-27 JP JP2009551727A patent/JP2010535364A/ja active Pending
- 2008-02-27 JP JP2009551722A patent/JP2010535362A/ja active Pending
- 2014
- 2014-02-28 JP JP2014039137A patent/JP5756198B2/ja active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010018658A1 (en) * | 2000-02-26 | 2001-08-30 | Kim Jong Min | System for obtaining information based on communication of users |
US20060184355A1 (en) * | 2003-03-25 | 2006-08-17 | Daniel Ballin | Behavioural translator for an object |
US20050143138A1 (en) * | 2003-09-05 | 2005-06-30 | Samsung Electronics Co., Ltd. | Proactive user interface including emotional agent |
Non-Patent Citations (1)
Title |
---|
See also references of EP2126708A4 * |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11222344B2 (en) | 2007-04-23 | 2022-01-11 | The Nielsen Company (Us), Llc | Determining relative effectiveness of media content items |
US10489795B2 (en) | 2007-04-23 | 2019-11-26 | The Nielsen Company (Us), Llc | Determining relative effectiveness of media content items |
EP2350968A2 (en) * | 2008-10-15 | 2011-08-03 | Nc Interactive, Inc. | Interactive network game and methods thereof |
EP2350968A4 (en) * | 2008-10-15 | 2013-10-09 | Nc Interactive Inc | INTERACTIVE NETWORK GAME AND METHOD THEREFOR |
EP2350967A2 (en) * | 2008-10-16 | 2011-08-03 | Nc Interactive, Inc. | Interactive network game and methods thereof |
EP2350967A4 (en) * | 2008-10-16 | 2013-08-14 | Nc Interactive Inc | INTERACTIVE NETWORK GAME AND METHOD THEREFOR |
US9492754B2 (en) | 2009-01-28 | 2016-11-15 | Nintendo Co., Ltd. | Method, system, and storage medium for displaying distributed media content in a calendar screen |
JP2010176323A (ja) * | 2009-01-28 | 2010-08-12 | Nintendo Co Ltd | プログラム、情報処理装置、および情報処理システム |
US9415302B2 (en) | 2009-01-28 | 2016-08-16 | Nintendo Co., Ltd. | Storage medium for storing program capable of improving degree of freedom and effect of content provided by sponsor and information processing device |
US10311447B2 (en) | 2009-01-28 | 2019-06-04 | Nintendo Co., Ltd. | Storage medium for storing program capable of ensuring that evaluation of content is made after watching thereof, information processing device, and information processing system |
US9199171B2 (en) | 2009-01-28 | 2015-12-01 | Nintendo Co., Ltd. | Information processing system relating to content distribution, storage medium for storing program directed thereto, and information processing device |
US9827497B2 (en) | 2009-01-28 | 2017-11-28 | Nintendo Co., Ltd. | Information processing system relating to content distribution, storage medium for storing program directed thereto, and information processing device |
US20120192088A1 (en) * | 2011-01-20 | 2012-07-26 | Avaya Inc. | Method and system for physical mapping in a virtual world |
US10916057B2 (en) | 2014-09-11 | 2021-02-09 | Nokia Technologies Oy | Method, apparatus and computer program for displaying an image of a real world object in a virtual reality enviroment |
EP3015145A1 (en) * | 2014-10-31 | 2016-05-04 | Samsung Electronics Co., Ltd. | Device and method of managing user information based on image |
US11024070B2 (en) | 2014-10-31 | 2021-06-01 | Samsung Electronics Co., Ltd. | Device and method of managing user information based on image |
US10096143B2 (en) | 2014-10-31 | 2018-10-09 | Samsung Electronics Co., Ltd. | Device and method of managing user information based on image |
US10192403B2 (en) | 2015-06-17 | 2019-01-29 | Facebook, Inc. | Determining appearances of objects in a virtual world based on sponsorship of object appearances |
US9786125B2 (en) * | 2015-06-17 | 2017-10-10 | Facebook, Inc. | Determining appearances of objects in a virtual world based on sponsorship of object appearances |
US10339592B2 (en) | 2015-06-17 | 2019-07-02 | Facebook, Inc. | Configuring a virtual store based on information associated with a user by an online system |
US10861056B2 (en) | 2015-06-17 | 2020-12-08 | Facebook, Inc. | Placing locations in a virtual world |
US11195316B2 (en) | 2017-01-19 | 2021-12-07 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression in a virtual reality system |
US10943100B2 (en) | 2017-01-19 | 2021-03-09 | Mindmaze Holding Sa | Systems, methods, devices and apparatuses for detecting facial expression |
US10521014B2 (en) | 2017-01-19 | 2019-12-31 | Mindmaze Holding Sa | Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location in at least one of a virtual and augmented reality system |
US10515474B2 (en) | 2017-01-19 | 2019-12-24 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression in a virtual reality system |
US11495053B2 (en) | 2017-01-19 | 2022-11-08 | Mindmaze Group Sa | Systems, methods, devices and apparatuses for detecting facial expression |
US11709548B2 (en) | 2017-01-19 | 2023-07-25 | Mindmaze Group Sa | Systems, methods, devices and apparatuses for detecting facial expression |
US11989340B2 (en) | 2017-01-19 | 2024-05-21 | Mindmaze Group Sa | Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location in at least one of a virtual and augmented reality system |
US11991344B2 (en) | 2017-02-07 | 2024-05-21 | Mindmaze Group Sa | Systems, methods and apparatuses for stereo vision and tracking |
US11328533B1 (en) | 2018-01-09 | 2022-05-10 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression for motion capture |
US11651562B2 (en) | 2019-12-30 | 2023-05-16 | Tmrw Foundation Ip S. À R.L. | Method and system for enabling enhanced user-to-user communication in digital realities |
Also Published As
Publication number | Publication date |
---|---|
EP2126708A1 (en) | 2009-12-02 |
EP2118757A1 (en) | 2009-11-18 |
JP2010535363A (ja) | 2010-11-18 |
JP2010535362A (ja) | 2010-11-18 |
EP2132650A4 (en) | 2010-10-27 |
JP2010533006A (ja) | 2010-10-21 |
EP2126708A4 (en) | 2010-11-17 |
EP2118840A1 (en) | 2009-11-18 |
EP2132650A2 (en) | 2009-12-16 |
JP2014149836A (ja) | 2014-08-21 |
JP5756198B2 (ja) | 2015-07-29 |
JP2010535364A (ja) | 2010-11-18 |
EP2118840A4 (en) | 2010-11-10 |
EP2118757A4 (en) | 2010-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080215975A1 (en) | Virtual world user opinion & response monitoring | |
WO2008108965A1 (en) | Virtual world user opinion & response monitoring | |
US11442532B2 (en) | Control of personal space content presented via head mounted display | |
US10636217B2 (en) | Integration of tracked facial features for VR users in virtual reality environments | |
US11161236B2 (en) | Robot as personal trainer | |
US8766983B2 (en) | Methods and systems for processing an interchange of real time effects during video communication | |
JP6679747B2 (ja) | 仮想現実(vr)ユーザインタラクティブ性に関連付けられた仮想現実環境の観戦 | |
US10430018B2 (en) | Systems and methods for providing user tagging of content within a virtual scene | |
WO2008106196A1 (en) | Virtual world avatar control, interactivity and communication interactive messaging | |
US20100060662A1 (en) | Visual identifiers for virtual world avatars | |
CN115068932A (zh) | 虚拟现实环境中的视图位置处的观众管理 | |
US20140179427A1 (en) | Generation of a mult-part mini-game for cloud-gaming based on recorded gameplay | |
US10841247B2 (en) | Social media connection for a robot | |
WO2008106197A1 (en) | Interactive user controlled avatar animations | |
WO2018005199A1 (en) | Systems and methods for providing user tagging of content within a virtual scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08726207 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009551722 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008726207 Country of ref document: EP |