WO2014136466A1 - Information processing apparatus, system, information processing method, and program - Google Patents
Information processing apparatus, system, information processing method, and program
- Publication number
- WO2014136466A1 (PCT/JP2014/050108)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- information
- users
- virtual space
- information processing
- Prior art date
Classifications
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/0486—Drag-and-drop
- A63F13/65—Generating or modifying game content before or while executing the game program, automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04883—Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
- A63F2300/5553—Details of game data or player data management using player registration data; user representation in the game field, e.g. avatar
- G06Q50/01—Social networking
Definitions
- the present disclosure relates to an information processing apparatus, a system, an information processing method, and a program.
- Patent Document 1 describes a technique in which the pose of an object representing another user in a virtual space displayed on a screen (walking, running, sitting, standing, calling, etc.) reflects that user's actual behavior, while the display position of the object represents the closeness between the user and each other user, each user's actual location, and each user's characteristics and preferences.
- However, the technology described in Patent Document 1 cannot be said to be sufficiently developed, and there is still room for improvement in the ease of use of the user interface.
- According to the present disclosure, an information processing apparatus is provided that includes: a first information acquisition unit that acquires first information indicating the behavior of one or more users; a second information acquisition unit that acquires second information about the one or more users, different from the first information; and a display control unit that causes a display unit to display user objects that are configured based on the first information and represent each of the one or more users, and a virtual space that is configured based on the second information and in which the user objects are arranged.
- Also provided is a system including a terminal device and one or more server devices that provide services to the terminal device, in which the terminal device and the one or more server devices cooperate to provide a function of displaying on a display unit user objects that represent each of the one or more users and a virtual space that is configured based on the second information and in which the user objects are arranged.
- Also provided is an information processing method including: acquiring first information indicating the behavior of one or more users; acquiring second information about the one or more users, different from the first information; and displaying on a display unit user objects that are configured based on the first information and represent each of the one or more users, and a virtual space that is configured based on the second information and in which the user objects are arranged.
- Also provided is a program for causing a computer to realize: a function of acquiring first information indicating the behavior of one or more users; a function of acquiring second information about the one or more users, different from the first information; and a function of displaying on a display unit user objects that are configured based on the first information and represent each of the one or more users, and a virtual space that is configured based on the second information and in which the user objects are arranged.
- According to these configurations, each user's behavior is reflected in a user object, and the virtual space in which the user objects are arranged is configured based on other information about each user.
- As a result, the attributes and behavioral characteristics of the users displayed as user objects can be grasped from the virtual space that forms their background, and the users' behavior information is shared via the user objects.
- Consequently, the usability of the user interface is improved.
- FIG. 1 is a block diagram schematically illustrating a first embodiment of the present disclosure.
- FIG. 2 is a block diagram schematically illustrating a second embodiment of the present disclosure.
- FIG. 3 is a block diagram schematically illustrating a third embodiment of the present disclosure.
- A diagram for explaining the sharing of behavior information in some embodiments of the present disclosure.
- A diagram showing a display example for the sharing of behavior information shown in the preceding figure.
- A diagram for explaining the action status display in some embodiments of the present disclosure.
- A diagram illustrating an example of batch selection in the example of FIG. 6.
- A diagram for describing a first example of setting a virtual space in some embodiments of the present disclosure.
- A diagram for describing a second example of setting a virtual space in some embodiments of the present disclosure.
- A diagram for describing a third example of setting a virtual space in some embodiments of the present disclosure.
- A diagram for describing an example of an operation on the virtual space in some embodiments of the present disclosure.
- A diagram for describing a first example of an operation on an object in some embodiments of the present disclosure.
- A diagram for describing a second example of an operation on an object in some embodiments of the present disclosure.
- A diagram for describing a third example of an operation on an object in some embodiments of the present disclosure.
- A diagram showing an overview of a user grouping operation in some embodiments of the present disclosure.
- FIG. 15 is a diagram showing a second example of the recommended group display in the user grouping operation shown in FIG. 14.
- A flowchart showing the processing of the example shown in the preceding figure.
- A block diagram for explaining the hardware configuration of an information processing apparatus.
- FIG. 1 is a block diagram schematically illustrating a first embodiment of the present disclosure.
- a system 10 includes a terminal device 100 and a server 200.
- the terminal device 100 is a terminal device carried by a user, such as a smartphone, a tablet terminal, a portable game machine, or a portable media player, and can be realized by, for example, a hardware configuration of an information processing device described later.
- the server 200 communicates with one or a plurality of terminal devices 100 via various wired or wireless networks, and provides various services to the terminal devices 100.
- the server 200 is realized by a single server device or a plurality of server devices that are connected to and cooperate with each other via a network.
- the server device can be realized by, for example, a hardware configuration of an information processing device described later.
- the terminal device 100 includes a sensor 110, an action recognition unit 120, a communication unit 130, a control unit 140, and a UI (user interface) 150 as functional configurations.
- the sensor 110 is, for example, various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, and an atmospheric pressure sensor, and detects the acceleration, posture, orientation, surrounding environment, and the like of the user of the terminal device 100. Further, the sensor 110 may include positioning means such as a GPS sensor or a Wi-Fi communication unit for acquiring user position information.
- the behavior recognition unit 120 is realized in software by a processor such as a CPU operating according to a program.
- the functional configuration described as being realized in software in this specification including the action recognition unit 120 may be realized by a processing circuit such as an ASIC (Application Specific Integrated Circuit).
- the behavior recognition unit 120 recognizes the behavior of the user of the terminal device 100 based on the detection results of the sensor 110, and acquires information (behavior information) indicating that behavior. For example, the behavior recognition unit 120 may recognize whether the user is stationary, walking, or running based on detection results such as acceleration. The behavior recognition unit 120 may also recognize higher-level behaviors, such as whether the user is working, at home, or shopping, by combining detection results such as acceleration with the user's position information.
- the behavior recognition unit 120 transmits information indicating the recognized user behavior to the server 200.
- the action recognition unit 120 may also provide information indicating the user's action to the control unit 140.
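The coarse activity recognition described above can be sketched in a few lines. The following illustrative Python is not the patent's actual algorithm; the function name and thresholds are hypothetical, and it classifies activity only from how far accelerometer magnitudes deviate from gravity:

```python
import math

def recognize_activity(accel_samples):
    """Classify coarse user activity from 3-axis accelerometer samples.

    accel_samples: list of (x, y, z) tuples in m/s^2.
    Returns one of "still", "walking", "running".
    Thresholds below are illustrative, not taken from the patent.
    """
    if not accel_samples:
        return "still"
    # Deviation of each sample's magnitude from gravity (~9.8 m/s^2):
    # larger average deviation implies more vigorous movement.
    deviations = [abs(math.sqrt(x * x + y * y + z * z) - 9.8)
                  for x, y, z in accel_samples]
    mean_dev = sum(deviations) / len(deviations)
    if mean_dev < 0.5:
        return "still"
    elif mean_dev < 3.0:
        return "walking"
    return "running"
```

A real recognizer would also combine position information to infer higher-level behaviors such as working or shopping, as the text notes.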
- the communication unit 130 is realized by various wired or wireless communication devices, for example.
- the communication unit 130 receives one or more user behavior information from the server 200.
- the users whose behavior information is received may include the user of the terminal device 100 itself, and may include one or more users (other users) different from the user of the terminal device 100.
- the behavior information of other users can be acquired by the sensor 110 and the behavior recognition unit 120 of another terminal device 100 and transmitted to the server 200, for example.
- the communication unit 130 may also receive from the server 200 information different from the behavior information, for example information indicating the attributes of each user, or information obtained by statistically processing the behavior information of each user.
- the communication unit 130 provides the received information to the control unit 140.
- the control unit 140 is realized in software by a processor such as a CPU operating according to a program.
- the control unit 140 causes the display unit included in the UI 150 to display an image based on the information acquired from the communication unit 130.
- the image includes a still image and a moving image.
- the control unit 140 may output sound from a speaker included in the UI 150 based on the information acquired from the communication unit 130.
- the image displayed on the UI 150 by the control unit 140 includes a user object and a virtual space in which the user object is arranged. Further, the control unit 140 may acquire a user's operation via an operation unit included in the UI 150, and may change an image to be displayed based on the operation.
- the control unit 140 may also execute operations on the users represented by the user objects, for example grouping of users, based on the user's operation. Furthermore, the control unit 140 may identify each user's action status based on that user's behavior information and reflect the action status in the corresponding user object.
- the user object is an object representing one of the one or more users, and is configured based on the behavior information of each user received by the communication unit 130 and/or the behavior information provided by the behavior recognition unit 120. That is, a user object is displayed for each user whose behavior information is provided. Note that user objects need not be displayed for all users whose behavior information is provided; for example, objects may be displayed only for users specified by the user of the terminal device 100, such as users classified into a specific group.
- the virtual space is configured based on information different from behavior information.
- This information may be information indicating the attribute of each user, for example, or may be information obtained by statistically processing each user's behavior information.
- the control unit 140 combines the above information for each user displayed as a user object, and configures an image to be displayed as a common virtual space for each user object.
- the image displayed as the virtual space may be selected from preset images, for example, or may be newly generated by converting numerical values extracted from the information about each user according to a predetermined rule.
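The selection of a preset virtual-space image can be sketched as follows. This illustrative Python (the preset names and selection rule are hypothetical, not from the patent) picks a common background from the dominant recognized activity among the displayed users, reflecting how the control unit 140 combines information across all user objects:

```python
from collections import Counter

# Hypothetical preset background images keyed by a group's dominant activity.
PRESET_SPACES = {
    "working": "office_room",
    "shopping": "shopping_mall",
    "at_home": "living_room",
}

def choose_virtual_space(user_activities, default="park"):
    """Pick one virtual-space image shared by all displayed user objects.

    user_activities: mapping of user id -> recognized activity string.
    Falls back to `default` when there are no users or no matching preset.
    """
    if not user_activities:
        return default
    counts = Counter(user_activities.values())
    dominant, _ = counts.most_common(1)[0]
    return PRESET_SPACES.get(dominant, default)
```

The alternative mentioned in the text, generating a new image from numerical values according to a predetermined rule, would replace the dictionary lookup with a rendering step.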
- the UI 150 is realized by, for example, a display unit (output device) such as a display that displays an image, and an operation unit (input device) such as a touch panel or a mouse that acquires a user operation on the displayed image.
- the UI 150 displays an image according to the control of the control unit 140 and provides the control unit 140 with information indicating a user operation on the displayed image.
- the output device included in the UI 150 may further include an audio output unit such as a speaker.
- the server 200 includes an application server 210 and a DB (database) 220 as functional configurations.
- the application server 210 is realized in software by a processor such as a CPU operating according to a program.
- the application server 210 executes various calculations for providing an application service to the terminal device 100 while referring to the DB 220.
- the application server 210 accumulates information indicating the user's behavior received from the terminal device 100 in the DB 220, and transmits the accumulated information to the terminal device 100 as necessary.
- information indicating each user's action recognized in the terminal device 100 used by each of a plurality of users is collected in the DB 220.
- the application server 210 reads information indicating another user's behavior from the DB 220 and transmits it to the terminal device 100, so that the behavior information can be shared among the users.
- the application server 210 acquires information (information different from behavior information) used for the configuration of the virtual space in the control unit 140 of the terminal device 100 and transmits the information to the terminal device 100.
- the application server 210 may collect profile information (information indicating user attributes) registered by each user in the DB 220 and transmit the profile information to the terminal device 100 as necessary.
- the application server 210 may execute statistical processing of user behavior information accumulated in the DB 220 and transmit information such as user behavior patterns obtained by the processing to the terminal device 100.
- the DB 220 is realized by a storage device, for example, and stores various types of information processed by the application server 210.
- the DB 220 stores the behavior information of each user received by the application server 210 from the terminal device 100.
- the DB 220 may store information indicating the attributes of each user acquired by the application server 210 and information generated by the application server 210 performing statistical processing on the behavior information of each user.
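The accumulation and sharing roles of the application server 210 and DB 220 can be sketched with an in-memory stand-in. The class and method names below are hypothetical; the sketch stores per-user behavior records and returns each friend's latest behavior, which is the kind of payload the application server would transmit to a terminal device:

```python
import time

class BehaviorDB:
    """In-memory stand-in for DB 220: accumulates per-user behavior records."""

    def __init__(self):
        self._records = {}  # user_id -> list of (timestamp, activity)

    def store(self, user_id, activity, timestamp=None):
        """Accumulate one behavior record, as application server 210 does."""
        ts = time.time() if timestamp is None else timestamp
        self._records.setdefault(user_id, []).append((ts, activity))

    def latest(self, user_id):
        """Most recent recorded activity for a user, or None."""
        records = self._records.get(user_id)
        return records[-1][1] if records else None

    def share_with(self, friend_ids):
        """Payload for a terminal: each listed friend's latest behavior.
        Access control and statistical processing are omitted here."""
        return {uid: self.latest(uid) for uid in friend_ids}
```

A production server would persist records, apply the statistical processing mentioned in the text, and restrict sharing to behavior each user has agreed to publish.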
- in the present embodiment, the communication unit 130 functions as a first information acquisition unit that acquires user behavior information (the behavior recognition unit 120 also functions as a first information acquisition unit in that it internally acquires the behavior information of the user of the terminal device 100 itself).
- the communication unit 130 also functions as a second information acquisition unit that acquires information different from the behavior information related to the user (for example, information obtained by statistically processing attribute information or behavior information).
- the control unit 140 functions as a display control unit that causes the display unit included in the UI 150 to display user objects that are configured based on behavior information and represent each user, and a virtual space that is configured based on information different from the behavior information and in which the user objects are arranged. Therefore, the terminal device 100 can be said to be an information processing apparatus including these components.
- the terminal device 100 and the server 200 may further include functional configurations other than those illustrated.
- the terminal device 100 may further include a database, and may hold a cache of information stored in the DB 220 of the server 200.
- in this case, behavior information used for displaying images including user objects on the terminal device 100, for example the behavior information of friends of the user of the terminal device 100, is held inside the terminal device 100, so it is not necessary to receive the behavior information from the server 200 every time an image including user objects is displayed.
- the behavior information stored in the database of the terminal device 100 can be differentially updated by the application server 210 of the server 200 with reference to the behavior information stored in the DB 220.
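The differential update mentioned above can be sketched as follows. This hypothetical helper (the function name and record layout are illustrative, not from the patent) merges into the terminal's cache only those records in the server's store that are newer than the last synchronization time:

```python
def differential_update(cache, server_records, last_sync):
    """Merge only records newer than last_sync into the terminal's cache.

    cache: user_id -> (timestamp, activity), held on the terminal device.
    server_records: user_id -> (timestamp, activity), from the server's DB.
    Returns the new synchronization timestamp to use on the next update.
    """
    newest = last_sync
    for user_id, (ts, activity) in server_records.items():
        if ts > last_sync:  # skip records the terminal already has
            cache[user_id] = (ts, activity)
            newest = max(newest, ts)
    return newest
```

Sending only the records changed since the last sync keeps the traffic between the terminal device 100 and the server 200 small even as the DB 220 grows.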
- in the present embodiment, the terminal device that provides user behavior information to the server is the same as the terminal device that displays images including user objects based on the information, such as behavior information, aggregated by the server.
- as a modification, the functions of the terminal device 100 may be distributed between a sensor log terminal 100a, which includes the sensor 110 and the behavior recognition unit 120 and which acquires sensor data, recognizes the user's behavior, and transmits the behavior information to the server 200, and a display terminal 100b, which includes the communication unit 130, the control unit 140, and the UI 150 and which displays images including user objects based on information received from the server 200.
- in this case, the display terminal 100b is not necessarily a terminal device carried by the user, and may be a stationary PC, a television, a game machine, or the like.
- This embodiment is advantageous from the viewpoint of protecting the privacy of behavior information, because behavior recognition is executed by the terminal device 100, where the user's behavior information is generated. For example, the user may be allowed to select whether or not to transmit the behavior information obtained as a result of behavior recognition to the server. Accordingly, the user can keep behavior information that he or she does not want to share within the terminal device 100 rather than transmitting it to the server 200.
- FIG. 2 is a block diagram schematically illustrating a second embodiment of the present disclosure.
- the system 30 includes a terminal device 300 and a server 400.
- the terminal device 300 is a terminal device carried by a user, such as a smartphone, a tablet terminal, a portable game machine, or a portable media player, and can be realized by, for example, a hardware configuration of an information processing device described later.
- the server 400 communicates with one or a plurality of terminal devices 300 via various wired or wireless networks, and provides various services to the terminal devices 300.
- the server 400 is realized by a single server device or a plurality of server devices that are connected to and cooperate with each other via a network.
- the server device can be realized by, for example, a hardware configuration of an information processing device described later.
- the terminal device 300 includes the sensor 110, the communication unit 130, the control unit 140, and the UI 150 as functional configurations
- the server 400 includes the behavior recognition unit 120, the application server 210, and the DB 220 as functional configurations.
- the action recognition unit 120 is included in the server 400 instead of the terminal device 300. Accordingly, each terminal device 300 transmits the detection result of the sensor 110 to the server 400, and the server 400 recognizes the action of each user based on the detection result of the sensor 110 received from each terminal device 300.
- in the present embodiment, the terminal device that provides user sensing data to the server and the terminal device that displays images including user objects based on the information, such as behavior information, aggregated by the server may be independent of each other.
- that is, the functions of the terminal device 300 may be distributed between a sensor log terminal 300a, which includes the sensor 110 and acquires sensing data and provides it to the server 400, and a display terminal 300b, which includes the communication unit 130, the control unit 140, and the UI 150 and which displays images including user objects based on information received from the server 400.
- in this case, the display terminal 300b is not necessarily a terminal device carried by the user, and may be a stationary PC, a television, a game machine, or the like.
- in the present embodiment, the communication unit 130 functions as a first information acquisition unit that acquires the user's behavior information, and as a second information acquisition unit that acquires information about the user different from the behavior information.
- the control unit 140 functions as a display control unit that causes a display unit included in the UI 150 to display user objects that are configured based on the behavior information and represent each user, and a virtual space that is configured based on the information different from the behavior information and in which the user objects are arranged. Therefore, it can be said that the terminal device 300 is an information processing device including these components.
- Compared to the first embodiment, this embodiment is advantageous in that the power consumption of the terminal device 300 can be reduced because the terminal device 300 does not have to execute operations for action recognition. Further, when the sensor log terminal 300a is independent as in the modified example, a processing circuit such as a processor for action recognition is not required, so the sensor log terminal 300a can be reduced in size and weight, and its power consumption can be reduced.
- FIG. 3 is a block diagram schematically illustrating a third embodiment of the present disclosure.
- the system 50 includes a terminal device 500 and a server 600.
- the terminal device 500 is a terminal device carried by a user, such as a smartphone, a tablet terminal, a portable game machine, or a portable media player, and can be realized by, for example, a hardware configuration of an information processing device described later.
- the server 600 communicates with one or a plurality of terminal devices 500 via various wired or wireless networks, and provides various services to the terminal devices 500.
- the server 600 is realized by a single server device or a plurality of server devices that are connected to and cooperate with each other via a network.
- the server device can be realized by, for example, a hardware configuration of an information processing device described later.
- the terminal device 500 includes the sensor 110, the communication unit 530, and the UI 150 as functional configurations
- the server 600 includes the behavior recognition unit 120, the control unit 140, the application server 210, and the DB 220 as functional configurations.
- in this embodiment, not only the behavior recognition unit 120 but also the control unit 140 is included in the server 600. Accordingly, each terminal device 500 transmits the detection result of the sensor 110 to the server 600, receives image data including the user object generated by the server 600, and causes the UI 150 to display it.
- the communication unit 530 is realized by various wired or wireless communication devices, for example.
- the communication unit 530 receives data of an image including a user object from the server 600.
- Image data including the user object is generated by the control unit 140 included in the server 600 based on the behavior information and other information.
- the communication unit 530 may acquire information indicating the user's operation on the image including the user object from the UI 150 and transmit it to the server 600, so that the control unit 140 can change the image including the user object based on the operation.
- a terminal device that provides user sensing data to a server and a terminal device that receives and displays image data including a user object from the server can be configured independently of each other.
- the functions of the terminal device 500 described above may be realized by being distributed to a sensor log terminal 500a, which includes the sensor 110 and provides the acquired sensing data to the server 600, and a display terminal 500b, which includes the communication unit 530 and the UI 150 and displays an image including a user object based on data received from the server 600.
- the display terminal 500b is not necessarily a terminal device carried by the user, and may be a stationary PC, a television, a game machine, or the like.
- the behavior recognition unit 120 and the application server 210 function as a first information acquisition unit that acquires user behavior information.
- the application server 210 also functions as a second information acquisition unit that acquires information different from the behavior information about the user by reading separately provided attribute information from the DB 220 or statistically processing the behavior information.
- the control unit 140 functions as a display control unit that causes the display unit included in the UI 150 of the terminal device 500 to display user objects that are configured based on the behavior information and represent each user, and a virtual space that is configured based on information different from the behavior information and in which the user objects are arranged. Therefore, it can be said that the server 600 is an information processing apparatus including these components.
- in this embodiment, the terminal device 500 does not have to execute operations for action recognition or for generating an image including a user object, which makes this embodiment the most advantageous of the three from the viewpoint of the processing load on the terminal device.
- on the other hand, in this embodiment the server 600 grasps not only the user's behavior information but also the operation state of the UI 150. Therefore, the two embodiments described above may be more advantageous from the viewpoint of privacy.
- behavior information sharing in some embodiments of the present disclosure will be described with reference to FIGS. 4 and 5.
- the action information of each user is shared via the user object displayed by the control unit 140.
- sharing of such behavior information will be described more specifically, including examples of data to be referred to and displayed screens.
- “behavior status” is information indicating user behavior classified according to a predetermined standard. For example, in the following description, “being rested”, “running”, “moving in a car”, etc. are exemplified as the action status, but various other action statuses can be set. Also, the granularity of the action status can be changed depending on the type of the virtual space 1013, for example. For example, the action statuses “moving by car”, “moving by bus”, and “moving by train” may be integrated and displayed in one action status “moving” in some cases.
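- the granularity change described above can be sketched as a simple status mapping. The status strings and the fine/coarse flag below are assumptions chosen for illustration:

```python
# Sketch of per-virtual-space action-status granularity: in a coarse
# space the transport statuses collapse into one "moving" status,
# while a fine-grained space keeps them distinct.
COARSE_MAP = {
    "moving by car": "moving",
    "moving by bus": "moving",
    "moving by train": "moving",
}

def display_status(raw_status, fine_grained):
    """Return the status as it would be shown in a given virtual space 1013."""
    if fine_grained:
        return raw_status  # fine-grained spaces keep the raw status
    return COARSE_MAP.get(raw_status, raw_status)
```

- a per-space table like COARSE_MAP makes it easy for each virtual space 1013 to apply its own classification standard.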
- “behavior log” is information indicating a history of past actions of the user. In some examples, the behavior log may be a history of behavior status.
- FIG. 4 is a diagram for describing sharing of behavior information according to some embodiments of the present disclosure.
- the user logs in to social media 1003 using a client application 1001 provided in the terminal device.
- Social media 1003 provides a sharing list 1005 for logged-in users.
- the sharing list 1005 is provided from the server that provides the social media 1003 to the control unit 140 in the embodiment described above, for example.
- in the illustrated example, the sharing list is provided from social media, but it may also be provided by other means. For example, it may be possible to create a sharing list between friends by directly exchanging IDs with other users.
- the client application 1001 controls the behavior recognition unit 1007 of the terminal device or the server to provide the behavior information DB 1009 with the behavior information of the user.
- the behavior information of each user is collected in the behavior information DB 1009.
- when a user shares behavior information with other users, for example in the embodiments described above, the control unit 140 first causes the UI 150 to display a list 1011 of other users with whom behavior information can be shared, according to the sharing list 1005. For a user selected from the list 1011, a user object 1015 indicating that user's action status may be displayed in the virtual space 1013. A specific display example of the virtual space 1013 and the user object 1015 will be described later.
- the control unit 140 acquires the user behavior information from the behavior information DB 1009. At this time, it may be determined with reference to the sharing list 1005 whether or not the user's behavior information can be provided. Further, a plurality of virtual spaces 1013 may be set for each group into which each user is classified, and each virtual space 1013 may be switched and displayed. In the illustrated example, three virtual spaces 1013a, 1013b, and 1013c are set. Between the virtual spaces 1013a, 1013b, and 1013c, for example, the granularity of action statuses set based on action information may be different.
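- the permission check against the sharing list 1005 can be sketched as a filter over the behavior information DB 1009. The dictionary shapes below are illustrative assumptions:

```python
# Sketch of consulting the sharing list 1005 before behavior
# information from the behavior information DB 1009 is shown to a
# viewer: only users who share with the viewer are returned.
def shareable_statuses(viewer, sharing_list, behavior_db):
    """Return action statuses only for users who share with `viewer`."""
    allowed = sharing_list.get(viewer, set())
    return {user: status
            for user, status in behavior_db.items() if user in allowed}
```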
- the action log display 1017 of another user can be referred to.
- the action log display 1017 is, for example, a display of a user's action log expressed by a graph indicating a ratio of actions taken by the target user on the day, a past action pattern of the user, or the like.
- the behavior log display 1017 can be generated based on the behavior log 1019 of the target user acquired from the behavior information DB 1009.
- FIG. 5 is a diagram showing a display example for sharing the behavior information shown in FIG.
- the list 1011 is a list having information 1021 of each user as an element.
- in the information 1021 of each user, for example, in addition to a thumbnail image and text such as the user name, social media ID, and action status, a user object 1023 indicating the action status may be displayed.
- the user object 1023 may be common to the user object 1015 displayed in the virtual space 1013.
- for a user whose action status is not specified, the action status text or the user object 1023 may not be displayed in that user's information 1021.
- the virtual space 1013 is a space in which user objects 1015 for displaying each user are arranged.
- the user object 1015 indicates the action status of each user by its pose, display position, and the like. Therefore, as with the user object 1023 in the list 1011, the user object 1015 may not be displayed when the user action status is not specified. Further, as will be described later, the virtual space 1013 in which the user object 1015 is arranged changes depending on, for example, the attribute or behavior pattern of each user.
- an action log display 1017 for the user can be displayed.
- an action statistics display 1017a and a comment timeline display 1017b are shown.
- the behavior statistics display 1017a is a display showing a result of statistically analyzing the behavior log of the target user. For example, it may include a graph showing the ratio of the user's daily action statuses, or a ranking display that compares the amount of a specific behavior (running time, number of steps, time spent on the train, etc.) with other users.
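- the data behind such a ratio graph can be sketched as follows, assuming one day's behavior log is a sequence of equally spaced status samples (an assumption for illustration; the status names are also hypothetical):

```python
from collections import Counter

# Sketch of the statistics behind the behavior statistics display
# 1017a: turn one day's behavior log, sampled at equal intervals,
# into the ratio of each action status.
def daily_ratios(behavior_log):
    counts = Counter(behavior_log)
    total = sum(counts.values())
    return {status: count / total for status, count in counts.items()}
```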
- the comment timeline display 1017b displays a comment posted by the target user in a timeline and also displays a user's action status at the time of posting each comment.
- the user of the terminal device himself or herself may also be displayed as the information 1021 or the user object 1015, and the action log display 1017 can also be shown for the user himself or herself.
- the action log display 1017 for other users items that each user has permitted to publish can be displayed in a limited manner. In other words, for example, an action log set private by each user depending on the type of action or time zone is not reflected in the action log display 1017.
- the range of the action log displayed as the action log display 1017 and the action log display method in the action log display 1017 may be changed according to the current action status of the target user. For example, if the user is currently moving, only action logs related to movement such as walking, trains, and buses may be reflected in the action log display 1017. Alternatively, if the user is currently working in the office, only the action log in the office may be reflected in the action log display 1017.
- Action status display in some embodiments of the present disclosure will be described with reference to FIGS. 6 and 7.
- the user object 1015 indicating the behavior status of each user is displayed in the virtual space 1013, whereby the behavior status of each user is shared.
- the display of the action status will be described more specifically.
- FIG. 6 is a diagram for describing action status display according to some embodiments of the present disclosure.
- user objects 1015a to 1015g are displayed in the virtual space 1013.
- each user object 1015 may represent the action status of the corresponding user by its pose (shape) or movement.
- the user object 1015a is displayed in a sitting pose and indicates that the user's action status is “resting”.
- the user objects 1015c and 1015e are displayed in a running pose, and further indicate that the user's action status is “running” by actually running around.
- the user object 1015f is displayed in a pose with a book to indicate that the user's action status is “Reading”.
- the user object 1015g is displayed in a paused pose, indicating that the user's action status is “still”.
- the user object 1015 may represent the corresponding user's action status by another object (container object) displayed as a container.
- the user object 1015b is displayed in a container object 1016a of a car running on a road, thereby indicating that the user's action status is “moving in a car”.
- the user object 1015d is displayed in the container object 1016b of the bus running on the road, thereby indicating that the user's action status is “moving on the bus”.
- the container objects 1016 arranged in the virtual space 1013 are displayed based on, for example, the behavior patterns of the users displayed as user objects 1015 in the virtual space 1013, and can be displayed even when no user currently has the corresponding action status.
- a bench container object 1016c is displayed to indicate that the user is sitting, but a user object sitting on the bench is not displayed.
- the number and size of other objects displayed in association with the user object 1015 may change according to the number of user objects 1015 associated with the object. For example, in the illustrated example, when the number of user objects 1015 indicating users in a car increases, another car may be additionally displayed. In the illustrated example, when the number of user objects 1015 indicating users on the bus increases, the size of the bus may increase.
- the container object 1016 may be used as an operation element that, when selected, selects the users having the corresponding type of action status. For example, as shown in FIG. 7, when the bus container object 1016b displayed in the virtual space 1013 is selected, the users shown to be on the bus in the example of FIG. 6 are selected at once.
- the users selected in a lump can be displayed as a list 1011 as shown in the figure or set as a message transmission destination, for example.
- a similar operation may be possible for, for example, a car or bench container object 1016 displayed in the virtual space 1013.
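- the batch selection through a container object can be sketched as a lookup from container to action status followed by a filter. The container-to-status mapping and data shapes are illustrative assumptions:

```python
# Sketch of using a container object 1016 as an operation element:
# selecting the bus container selects, at once, every user whose
# action status matches the container's status.
CONTAINER_STATUS = {"bus": "moving on the bus", "car": "moving in a car"}

def select_by_container(container, statuses):
    """statuses: {user: action status} -> users selected in a lump."""
    target = CONTAINER_STATUS[container]
    return sorted(u for u, s in statuses.items() if s == target)
```

- the returned users could then be shown as the list 1011 or set as message destinations, as described above.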
- the arrangement of the user objects 1015 in the virtual space 1013 is determined according to the action status expressed by each user object 1015.
- user objects 1015b and 1015d corresponding to a user moving on a car or a bus are arranged on a road drawn in the virtual space 1013.
- the user object 1015a corresponding to the user taking a break is placed in a cafe drawn in the virtual space 1013.
- the arrangement of each user object can be determined regardless of the actual position information of the user, for example. Therefore, for example, a plurality of users on a bus at different places can be represented as a plurality of user objects 1015 on the same bus.
- the user object 1015c is arranged in the park in front and the user object 1015e is arranged in the back crosswalk.
- This arrangement may indicate that the user corresponding to the user object 1015c is running on a place other than a road such as a park, and the user corresponding to the user object 1015e is running on a road or a sidewalk.
- the arrangement of the user objects 1015 may reflect not only the action status expressed by each user object 1015 but also the position information of the user. In the illustrated example, the type of the user's position is reflected in the arrangement of the user object 1015.
- for example, a user running on a road, wherever that road actually is, may be placed on the same pedestrian crossing as the user object 1015e, and a user running somewhere other than a road, whether or not it is actually a park, may be placed in the same park as the user object 1015c.
- a user object 1015 indicating a user on an elevator may be placed in a building displayed in the virtual space 1013.
- the number and size of background objects such as roads, cafes, parks, pedestrian crossings, and buildings arranged in the virtual space 1013 may change according to the number of user objects 1015 arranged there. For example, in the illustrated example, when the number of user objects 1015 indicating users who are moving by car or bus exceeds a predetermined number, the road displayed in the virtual space 1013 may be widened. Further, for example, when the number of user objects 1015 indicating users riding an elevator reaches a predetermined number or more, another building may be added to the virtual space 1013.
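- such scaling rules can be sketched as simple threshold functions; the specific thresholds (5 riders to widen the road, 3 elevator users per extra building) are assumptions chosen only for illustration:

```python
# Sketch of scaling background objects with the number of user
# objects 1015 associated with them.
def road_width(n_vehicle_users, base=1, widen_above=5):
    """Widen the road once riders exceed the threshold."""
    return base + 1 if n_vehicle_users > widen_above else base

def building_count(n_elevator_users, per_building=3):
    """One building, plus one per full group of elevator users."""
    return 1 + n_elevator_users // per_building
```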
- FIG. 8 is a diagram for describing a first example of setting a virtual space in some embodiments of the present disclosure.
- a virtual space (“Bookstore” town view) 1013p that imitates a bookstore is set as a virtual space 1013 in which user objects 1015 indicating users classified into group A are arranged.
- this is based on the fact that the users classified into group A share the characteristic that the “Book Score” (set from 0 to 100 based on each user's book purchase history) in the profile information 1025a separately provided by each user is relatively high.
- FIG. 9 is a diagram for describing a second example of setting a virtual space in some embodiments of the present disclosure.
- a virtual space (“young & student” town view) 1013q that imitates a school road is set as a virtual space 1013 in which user objects 1015 indicating users classified into group B are arranged.
- this is based on the fact that each user classified into group B is in the late teens to early twenties and the occupation is university or high school student.
- FIG. 10 is a diagram for describing a third example of setting a virtual space in some embodiments of the present disclosure.
- as the virtual space 1013 in which user objects 1015 indicating the users classified into group C are arranged, the same virtual space imitating a bookstore (“Bookstore” town view) 1013p as in the first example is set. This may be based on the fact that the average behavior patterns extracted from the behavior logs share the characteristic that each user has a relatively high ratio of the action “reading”.
- as in the examples above, the virtual space 1013 that serves as the background of the user objects 1015 in the image is configured based on information (second information) different from the user's behavior information (first information) used to configure the user objects 1015.
- the virtual space 1013 can be configured based on a result obtained by combining the second information related to each user (for example, an average, a median, or various clustering results).
- the user object 1015 representing each user can be arranged in the common virtual space 1013.
- the image displayed as the virtual space 1013 may be selected from, for example, preset images, or may be newly generated by converting numerical values extracted from information about each user according to a predetermined rule.
- the virtual space 1013 is configured based on the profile information 1025 of each user.
- the profile information 1025 is an example of information indicating user attributes, and is generated based on, for example, the user's age, occupation, or product purchase history.
- the profile information may be generated based on information other than age, occupation, and product purchase history, and the virtual space 1013 may be configured based on information other than profile information.
- the virtual space 1013 is configured based on information indicating the behavior pattern of each user.
- Information indicating a behavior pattern is an example of information obtained by statistically processing user behavior information.
- the user behavior pattern is extracted by, for example, the application server 210 described in the first to third embodiments performing statistical processing on the user behavior information accumulated in the DB 220.
- the process for extracting the behavior pattern can use a known technique as described in, for example, Japanese Patent Application Laid-Open No. 2011-81431, and its detailed description is omitted here.
- a part or all of the virtual space 1013 set for the group C may be temporally changed to reflect an action pattern common to users classified into the group C.
- the normal virtual space 1013 may be changed to a virtual space 1013p imitating a bookstore only between 21:30 and 23:00.
- a display change may be added so that each user object 1015 possesses a book object only between 21:30 and 23:00.
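- the time-limited change above can be sketched as a window check over the current time of day. The 21:30-23:00 window and the view names follow the example in the text; everything else is an illustrative assumption:

```python
from datetime import time

# Sketch of a time-limited virtual-space change: only between 21:30
# and 23:00 does the group's space become the "Bookstore" town view.
def active_town_view(now, normal="normal", special="Bookstore",
                     start=time(21, 30), end=time(23, 0)):
    return special if start <= now <= end else normal
```

- the same check could equally gate a per-object change, such as each user object 1015 holding a book object only inside the window.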
- FIG. 11 is a diagram for describing an example of the operation of the virtual space according to some embodiments of the present disclosure.
- user objects 1015p and 1015q, indicating the users P and Q respectively, are displayed in the virtual space 1013.
- the user P has an action pattern of getting on a bus at 8:00 every day.
- the user Q has an action pattern of getting on a bus every day at 9:00.
- a bus 1027a and a bus 1027b are displayed at 8:00 and 9:00, respectively.
- the bus 1027 can be displayed in the same way as the container object 1016 in the example of FIG. 6.
- the user object 1015p that displays the user P is on the bus 1027a.
- the user object 1015q that displays the user Q is on the bus 1027b.
- the bus 1027a and the bus 1027b displayed in the virtual space 1013 may be displayed regardless of whether the users P and Q are actually on board. More specifically, in the virtual space 1013, the bus 1027a may be operated at 8:00 every day, and the bus 1027b at 9:00. In this case, for example, when the user P does not take the 8:00 bus because of a vacation or the like, the user object 1015p displaying the user P is displayed in a different place and pose according to the action status of the user P at that time. On the other hand, the bus 1027a operates as usual at 8:00, but since the user P is not on board, no user object rides on the bus 1027a.
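- this schedule-driven display can be sketched as follows; the schedule dictionary shape and hour-based timing are assumptions chosen only for illustration:

```python
# Sketch of operating buses in the virtual space 1013 on a fixed
# schedule regardless of whether the corresponding users are riding:
# a scheduled bus appears even when it carries no user object.
BUS_SCHEDULE = {"1027a": 8, "1027b": 9}  # bus id -> hour of operation

def buses_in_operation(hour, riders):
    """riders: {bus_id: users on board}; a scheduled bus may run empty."""
    return {bus: sorted(riders.get(bus, ()))
            for bus, h in BUS_SCHEDULE.items() if h == hour}
```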
- in this way, each user's regular behavior pattern is displayed regardless of temporary events, so the characteristics of the user's behavior can be recognized from the display of the virtual space 1013 without being affected by irregular behavior.
- such a display may also encourage a user to act on another user. For example, when users share an action pattern in the same time zone (the actions themselves may be the same or different), the display may prompt one of them to take the corresponding action. For example, if the user X jogs in the time zone in which the user Y usually rides a bicycle, the user object 1015 displaying the jogging user X in the virtual space 1013 shown on the terminal device of the user Y may prompt the user Y to ride a bicycle.
- FIG. 12A is a diagram for describing a first example of an operation on an object according to some embodiments of the present disclosure.
- by moving the user object 1015 displayed in a stationary state onto the car 1027c displayed in the virtual space 1013 with an operation such as drag and drop, the user's action status is corrected from “still” to “moving in a car”.
- the user object 1015 displayed in the car 1027c indicates that the user is moving in a car.
- the object corresponding to the action status of the user (in this case, the car 1027c) is displayed in the virtual space, whereby the action status correction operation can be executed by changing the arrangement of the user object 1015.
- the result of such a correction operation may be used for learning in action recognition, for example.
- FIG. 12B is a diagram for describing a second example of an operation on an object according to some embodiments of the present disclosure.
- in the illustrated example, a door 1027d appears when the user object 1015 is long-pressed. When the user object 1015 is then moved to the door 1027d by an operation such as drag and drop, the user displayed by the user object 1015 is deleted from the user group corresponding to the currently displayed virtual space 1013. In this way, an object (here, the door 1027d) that appears in response to a predetermined operation (here, a long press) may make an operation on the user object 1015, or an operation corresponding to the user object 1015, executable.
- FIGS. 13A and 13B are diagrams for describing a third example of an operation on an object in some embodiments of the present disclosure.
- in the illustrated example, a dialog box 1029 appears when contact is released after the user object 1015 is long-pressed (without moving on to an operation such as dragging).
- the dialog box 1029 displays operations that can be executed for the user according to the action status of the user corresponding to the user object 1015 at that time.
- for example, in the case of the stationary user object 1015 shown in FIG. 13A, “message”, “vibration”, and “voice” are displayed in the dialog box 1029a. When one of the operations displayed in the dialog box 1029a is selected, the corresponding operation for the target user is executed. For example, when “message” is selected, a transition is made to a message transmission screen for the target user. When “vibration” is selected, a vibration notification for the target user is executed. When “voice” is selected, a voice call with the target user is prepared.
- different types of actions for the user displayed by the user object 1015 may be performed depending on the type of operation for selecting the user object 1015. For example, when the user object 1015 is touched, the display transitions to the log display 1017. When the user object 1015 is double-touched, the message transmission screen is displayed. When the user object 1015 is flicked to the right, a voice call is prepared. When the terminal device is shaken while tapping the user object 1015, a vibration notification is executed to the target user.
- alternatively, the type of action for the user indicated by the user object 1015 may be selected automatically. For example, when a predetermined operation (for example, a double tap) is performed on the user object 1015, posting to social media is selected if the user indicated by the user object 1015 is the user himself or herself, and message transmission to the user is selected if the indicated user is a friend. In addition, when sending a message, if the target user is sitting, it can be estimated that the user can browse the message at leisure, so message transmission with an image may be selected. Further, if the target user is moving in a car, a vibration notification may be selected so as not to hinder driving.
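- this automatic selection can be sketched as a status-to-action mapping; the status strings and action labels are illustrative assumptions rather than the exact set used by the disclosure:

```python
# Sketch of automatically choosing the action type from the target
# user's current action status, as described above.
def choose_action(target, viewer, status):
    if target == viewer:
        return "post to social media"     # acting on oneself
    if status == "sitting":
        # the target can browse at leisure, so send a richer message
        return "message with image"
    if status == "moving in a car":
        # avoid hindering driving
        return "vibration notification"
    return "message"
```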
- a user object indicating a user suitable for communication may be highlighted on a screen that displays the action status of each user using the user objects 1015, as in the example of FIG. 6.
- for example, the user object 1015 of a user whose behavior information indicates that the terminal is being operated may be displayed blinking, or a mark may be displayed on the user object of a user who has posted to social media within the past five minutes.
- FIG. 14 is a diagram illustrating an overview of a user grouping operation according to some embodiments of the present disclosure.
- the user object 1023 indicating the action status of each user can be displayed in the list 1011 that displays users.
- in the illustrated example, users that can be grouped are displayed in the list 1011. When the user object 1023 of one of these users is long-pressed and the screen is then flicked to the right or left, the screen switches to the virtual space 1013 while the pressed user object remains. By dropping the user object in the virtual space 1013, the user object 1015 corresponding to this user is added to the virtual space 1013, and the user is classified into the group corresponding to the virtual space 1013.
- the user object 1031 displayed during the screen transition may be the same as the user object 1023, the same as the user object 1015, or different from either. Further, after the screen has switched to the virtual space 1013, flicking the screen to the right or left again may display another virtual space 1013, so that the selected user can be classified into a different group.
- the virtual space 1013 displayed on the UI 150 can be changed for each group into which users are classified. Further, the criteria for classifying the user's behavior and setting the action status may be changed for each virtual space 1013. For example, when the virtual space 1013 corresponding to a certain group is a virtual space imitating a bookstore (“Bookstore” town view), the action statuses related to books may be classified more finely than in other virtual spaces, and the other action statuses more coarsely (for example, traveling by train, bus, or car is summarized as “moving”).
- FIG. 15 is a diagram showing a first example of recommended group display in the user grouping operation shown in FIG.
- in the illustrated example, it is determined based on the profile information 1025 that it is appropriate for a certain user to be classified into the group corresponding to the virtual space imitating a bookstore (“Bookstore” town view).
- text 1033 (“Bookstore”) indicating a recommended group or virtual space may be displayed in the user information 1021 displayed in the list 1011.
- the user object 1023 corresponding to this user may be changed to the object 1035 corresponding to this group or virtual space.
- the virtual space 1013 is configured based on user profile information 1025 classified into groups, action patterns, or the like. Therefore, when a user object 1015 representing a user different from the users already classified into the group is added to the virtual space 1013, the features appearing in the profile information 1025 or the action pattern may change.
- FIG. 16 is a flowchart showing the processing of the example shown in FIG.
- the following process may be, for example, the process of the control unit 140 illustrated in FIGS.
- selection and registration of a group member is accepted (step S101).
- the registration here can be, for example, initial registration of group members. In this case, the characteristics of group members and the displayed virtual space 1013 are not yet set for each group.
- the profile of the registered member is acquired (step S103).
- the profile acquired here is information such as the profile information 1025 shown in the examples of FIGS. 8 and 9, for example, and may be registered by each user separately from the behavior information.
- common items of the profile are extracted (step S105).
- features such as a relatively high score value, low score, or medium score are extracted as common items among users. Whether the score is high or low may be determined based on, for example, a simple average value, or may be determined in consideration of variance.
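- the common-item extraction of step S105 can be sketched as follows. The 0-100 score scale follows the “Book Score” example above; the variance cutoff and the high/low/medium boundaries are illustrative assumptions:

```python
from statistics import mean, pstdev

# Sketch of step S105: decide whether a profile score is a common
# item among group members, and at what level, by requiring the
# scores to cluster together before comparing their average.
def common_level(scores, spread_limit=15.0):
    if len(scores) > 1 and pstdev(scores) > spread_limit:
        return None  # scores do not cluster: not a common item
    avg = mean(scores)
    if avg >= 70:
        return "high"
    if avg <= 30:
        return "low"
    return "medium"
```

- using the spread as well as the average corresponds to the remark above that the determination may take variance into account, not just a simple average.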
- a group town view is selected (step S107).
- the town view here is displayed as the virtual space 1013 in the above description, and is, for example, the “Bookstore” town view and the “young & student” town view shown in the examples of FIGS.
- the setting of the virtual space 1013 corresponding to the group into which the member selected and registered in step S101 is classified is completed.
- a recommended group is displayed (step S109).
- the recommended group can be displayed by, for example, the method shown in FIG.
- the recommended group display is not limited to one displayed automatically, such as the text 1033 or the object 1035. For example, the display may change in response to an operation, such as when the user object 1023 is moved toward the virtual space 1013.
- in step S111, additional selection and additional registration of group members are accepted.
- the additional selection and additional registration accepted here can be executed by the user who refers to the display of the recommended group in step S109. However, the user does not necessarily perform additional selection and additional registration according to the display of the recommended group, and may perform additional selection and additional registration ignoring this.
- after step S111, the same processing as steps S103 to S107 is executed again, so that an appropriate town view can be selected for the group after the additional selection and registration. If there are still users not registered in the group, the recommended-group display of step S109 can be executed again.
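The loop of steps S101 to S111 can be summarized in a short sketch. This is a non-authoritative outline under stated assumptions: the helper callables (`get_profile`, `extract_common`, `select_town_view`, `recommend`, `accept_selection`) are hypothetical names standing in for the operations the flowchart describes.

```python
def group_registration_flow(candidates, get_profile, extract_common,
                            select_town_view, recommend, accept_selection):
    """Sketch of the FIG. 16 flow: accept members, acquire profiles,
    extract common items, select a town view, then recommend groups to
    still-unregistered users and repeat until no one else is added."""
    members, town_view = [], None
    while True:
        selected = accept_selection(members, candidates)   # S101 / S111
        if not selected:
            break
        members.extend(selected)
        profiles = [get_profile(m) for m in members]       # S103
        common = extract_common(profiles)                  # S105
        town_view = select_town_view(common)               # S107
        unregistered = [c for c in candidates if c not in members]
        if not unregistered:
            break
        recommend(unregistered, town_view)                 # S109
    return members, town_view

# Minimal stubs to exercise the flow (all hypothetical):
candidates = ["A", "B", "C"]
profiles = {"A": {"books": 5}, "B": {"books": 4}, "C": {"books": 5}}
batches = [["A", "B"], ["C"], []]          # selections accepted per round

members, view = group_registration_flow(
    candidates,
    get_profile=profiles.get,
    extract_common=lambda ps: {"books": sum(p["books"] for p in ps) / len(ps)},
    select_town_view=lambda c: "Bookstore" if c["books"] >= 4 else "Default",
    recommend=lambda users, tv: None,
    accept_selection=lambda m, c: batches.pop(0),
)
print(members, view)
```

Note how the recommendation in S109 is advisory only: the next call to `accept_selection` is free to ignore it, matching the text's remark that the user may register members regardless of the displayed recommendation.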
- FIG. 17 is a diagram showing a second example of recommended group display in the user grouping operation shown in FIG.
- in the illustrated example, it has been determined, based on behavior patterns obtained from past behavior logs, that a certain user should be classified into the group corresponding to a virtual space imitating a bookstore (the “Bookstore” town view).
- text 1033 (“Bookstore”) indicating a recommended group or virtual space may be displayed in the user information 1021 displayed in the list 1011.
- the user object 1023 corresponding to this user may be changed to an object 1037 indicating the user's action corresponding to this group or virtual space.
- in this example, the recommended group may be expressed without changing the display of the virtual space 1013 itself.
- FIG. 18 is a flowchart showing the processing of the example shown in FIG.
- the following processing can be executed by, for example, the control unit 140 illustrated in FIGS.
- a group into which users are classified and a town view corresponding to the group are set by the same processing as steps S101 to S107 shown in FIG. 16 (step S201).
- a group member's action pattern is acquired (step S203).
- here, a behavior pattern generated in advance based on each member's behavior log may be acquired, or a new behavior pattern may be generated at this point from each member's behavior log.
- next, behavior having a high correlation among group members is extracted (step S205).
- the “behavior with high correlation” here may be extracted based on common criteria such as: “the same action in the same time zone” (for example, reading in the same time zone every night, as in the example of FIG. 10); “the same action on the same type of day (weekday/holiday)” (for example, a user who travels by car for a long time on weekdays is estimated to use a car for work, while a user who travels by car for a long time on holidays is estimated to use a car for leisure); or “the same action of a similar length” (for example, even among users who run every day, 15 minutes and 1 hour represent different levels). Alternatively, it may be extracted simply because a common action is included in the users' daily life patterns.
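The criteria above (same time zone, same day type, similar duration) could be sketched by bucketing each behavior record and counting how many members share a bucket. This is an illustrative sketch only; the bucket boundaries (night hours, 45-minute threshold, weekday index) are assumptions, not values from the disclosure.

```python
from collections import Counter

def behavior_key(record):
    """Bucket a behavior record by action, time zone, day type, and duration."""
    hour = record["hour"]
    zone = "night" if hour >= 21 or hour < 5 else ("morning" if hour < 12 else "day")
    day_type = "holiday" if record["weekday"] >= 5 else "weekday"   # 5,6 = weekend
    duration = "long" if record["minutes"] >= 45 else "short"
    return (record["action"], zone, day_type, duration)

def correlated_behaviors(member_logs, min_members=2):
    """Return behavior buckets shared by at least `min_members` members.

    member_logs: one list of behavior records per group member. Each member
    contributes at most one vote per bucket, so a single user repeating an
    action does not make it 'correlated' across the group.
    """
    counts = Counter()
    for log in member_logs:
        for key in {behavior_key(r) for r in log}:
            counts[key] += 1
    return [key for key, n in counts.items() if n >= min_members]

logs = [
    [{"action": "read", "hour": 22, "weekday": 1, "minutes": 30}],
    [{"action": "read", "hour": 23, "weekday": 3, "minutes": 20}],
    [{"action": "run", "hour": 7, "weekday": 6, "minutes": 60}],
]
print(correlated_behaviors(logs))
```

Here the two members who read late at night on weekdays produce a shared bucket, while the lone holiday runner does not, mirroring the "reading at the same time zone every night" example in the text.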
- the town view is updated according to the behavior extracted in step S205 (step S207).
- the town view may be updated by, for example, replacing it with a completely new one, or by adding or changing an object arranged in the virtual space 1013, like the bus 1027 shown in the example of FIG.
- recommended groups are displayed for unregistered users (step S109), and additional selection and additional registration of group members are accepted (step S111).
- FIG. 19 is a block diagram for explaining the hardware configuration of the information processing apparatus.
- the illustrated information processing apparatus 900 can realize, for example, the terminal device or server in the above-described embodiment.
- the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
- the information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is a device that can notify the user of the acquired information visually or audibly.
- the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Electro-Luminescence) display, an audio output device such as a speaker and headphones, and a printer device.
- the output device 917 outputs the result obtained by the processing of the information processing device 900 as video such as text or an image, or outputs it as audio such as voice or sound.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
- the drive 921 writes a record in the attached removable recording medium 927.
- the connection port 923 is a port for directly connecting a device to the information processing apparatus 900.
- the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
- the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (High-Definition Multimedia Interface) port, or the like.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
- the communication device 925 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network 931 connected to the communication device 925 is a wired or wireless network, such as the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
- the imaging device 933 is an apparatus that images real space and generates a captured image, using an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor together with various members such as a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 933 may capture a still image or may capture a moving image.
- the sensor 935 is various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, and an atmospheric pressure sensor.
- the sensor 935 acquires, for example, information about the state of the information processing apparatus 900 itself, such as its posture, and information about its surrounding environment, such as the brightness and noise around the information processing apparatus 900.
- the sensor 935 may include a GPS sensor that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the apparatus.
- Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
- Embodiments of the present disclosure may include, for example, an information processing apparatus (a terminal device or a server) as described above, a system, an information processing method executed by the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
- As example configurations:
- (1) An information processing apparatus including: a first information acquisition unit that acquires first information indicating behavior of one or more users; a second information acquisition unit that acquires second information regarding the one or more users, the second information being different from the first information; and a display control unit that causes a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
- (2) The information processing apparatus according to (1), wherein the display control unit combines the second information regarding each of the one or more users to set the virtual space common to the user objects respectively representing the one or more users.
- (8) The information processing apparatus according to any one of (1) to (7), wherein the display control unit configures the virtual space for each group into which the one or more users are classified, and places each user object in the virtual space corresponding to the group into which that user is classified.
- (9) The information processing apparatus according to (8), wherein, for a user not classified into a group, the display control unit causes the display unit to display text or an image indicating a group into which the user is recommended to be classified.
- (10) The information processing apparatus according to (8) or (9), wherein, for a user not classified into a group, the display control unit displays, as a preview, a change in the virtual space when the user is placed in the virtual space, thereby indicating whether the group corresponding to the virtual space is a recommended group for that user.
- (18) A system in which a terminal device and one or more server devices cooperate to provide a function of acquiring first information indicating behavior of one or more users, among other functions.
Abstract
Description
1. Examples of embodiments
1-1. First embodiment
1-2. Second embodiment
1-3. Third embodiment
2. Sharing of behavior information
3. Behavior status display
4. Setting of the virtual space
5. Operations on objects
6. Grouping of users
7. Hardware configuration
8. Supplement
First, examples of embodiments of the present disclosure will be described with reference to FIGS. 1 to 3.
FIG. 1 is a block diagram schematically showing the first embodiment of the present disclosure. Referring to FIG. 1, a system 10 according to this embodiment includes a terminal device 100 and a server 200. The terminal device 100 is a device carried by a user, such as a smartphone, tablet, portable game console, or portable media player, and can be realized by, for example, the hardware configuration of the information processing apparatus described later. The server 200 communicates with one or more terminal devices 100 via various wired or wireless networks and provides various services to the terminal devices 100. The server 200 is realized by a single server device, or by a plurality of server devices connected to one another via a network and cooperating; each server device can be realized by, for example, the hardware configuration of the information processing apparatus described later.
The terminal device 100 includes, as functional components, a sensor 110, a behavior recognition unit 120, a communication unit 130, a control unit 140, and a UI (user interface) 150.
The server 200 includes, as functional components, an application server 210 and a DB (database) 220.
FIG. 2 is a block diagram schematically showing the second embodiment of the present disclosure. Referring to FIG. 2, a system 30 according to this embodiment includes a terminal device 300 and a server 400. The terminal device 300 is a device carried by a user, such as a smartphone, tablet, portable game console, or portable media player, and can be realized by, for example, the hardware configuration of the information processing apparatus described later. The server 400 communicates with one or more terminal devices 300 via various wired or wireless networks and provides various services to the terminal devices 300. The server 400 is realized by a single server device, or by a plurality of server devices connected to one another via a network and cooperating; each server device can be realized by, for example, the hardware configuration of the information processing apparatus described later.
FIG. 3 is a block diagram schematically showing the third embodiment of the present disclosure. Referring to FIG. 3, a system 50 according to this embodiment includes a terminal device 500 and a server 600. The terminal device 500 is a device carried by a user, such as a smartphone, tablet, portable game console, or portable media player, and can be realized by, for example, the hardware configuration of the information processing apparatus described later. The server 600 communicates with one or more terminal devices 500 via various wired or wireless networks and provides various services to the terminal devices 500. The server 600 is realized by a single server device, or by a plurality of server devices connected to one another via a network and cooperating; each server device can be realized by, for example, the hardware configuration of the information processing apparatus described later.
Next, sharing of behavior information in some embodiments of the present disclosure will be described with reference to FIGS. 4 and 5. For example, in each of the embodiments described above, the behavior information of each user is shared via the user objects displayed by the control unit 140. Below, such sharing of behavior information is described more concretely, including examples of the referenced data and the displayed screens.
Next, behavior status display in some embodiments of the present disclosure will be described with reference to FIGS. 6 and 7. In the sharing of behavior information described above, the behavior status of each user is shared by displaying, in the virtual space 1013, a user object 1015 indicating that user's behavior status. Below, this behavior status display is described more concretely.
As in the illustrated example, each user object 1015 may express the behavior status of the corresponding user by its pose (shape) or movement. For example, the user object 1015a is displayed in a sitting pose, indicating that the user's behavior status is "taking a break". The user objects 1015c and 1015e are displayed in a running pose and actually run around, indicating that the users' behavior status is "running". The user object 1015f is displayed in a pose holding a book, indicating that the user's behavior status is "reading". The user object 1015g is displayed in a standing-still pose, indicating that the user's behavior status is "stationary".
A user object 1015 may also express the behavior status of the corresponding user by another object (a container object) displayed as its container. For example, the user object 1015b is displayed inside a container object 1016a of a car running on a road, indicating that the user's behavior status is "traveling by car". The user object 1015d is displayed inside a container object 1016b of a bus running on a road, indicating that the user's behavior status is "traveling by bus".
The placement of the user objects 1015 in the virtual space 1013 is determined according to the behavior status expressed by each user object 1015. In the illustrated example, the user objects 1015b and 1015d, corresponding to users traveling by car or bus, are placed on a road drawn in the virtual space 1013, and the user object 1015a, corresponding to a user taking a break, is placed at a café drawn in the virtual space 1013. In the illustrated example, the placement of each user object can be determined independently of, for example, the user's actual position information. Accordingly, a plurality of users riding buses in different places can be represented as a plurality of user objects 1015 riding the same bus.
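The mapping from a recognized behavior status to a pose or container object described above can be sketched as a simple lookup. This is a hypothetical illustration: the status names and object identifiers are assumptions, with only "1016a"/"1016b" echoing the reference numerals in the text.

```python
# Hypothetical mapping from recognized behavior status to how the
# user object 1015 is rendered (pose, or container object plus pose).
POSE = {
    "resting": "sitting",        # user object 1015a
    "running": "running",        # user objects 1015c, 1015e
    "reading": "holding_book",   # user object 1015f
    "still": "standing",         # user object 1015g
}
CONTAINER = {
    "in_car": "car_1016a",       # user object 1015b
    "on_bus": "bus_1016b",       # user object 1015d
}

def render_user_object(status):
    """Choose how a user object expresses a behavior status: a container
    object for vehicle statuses, otherwise a pose (default: standing)."""
    if status in CONTAINER:
        return {"container": CONTAINER[status], "pose": "sitting"}
    return {"pose": POSE.get(status, "standing")}

print(render_user_object("on_bus"))
print(render_user_object("reading"))
```

Because the placement is driven by status rather than real position, several users with the status `on_bus` can all be rendered inside the same bus container, as the text notes.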
Next, the setting of the virtual space in some embodiments of the present disclosure will be described with reference to FIGS. 8 to 11. In the sharing of behavior information described above, the behavior status of each user is shared by displaying, in the virtual space 1013, user objects 1015 that reflect each user's behavior status. Below, the setting of the virtual space 1013 in this case is described more concretely.
Next, operations on objects in some embodiments of the present disclosure will be described with reference to FIGS. 12 and 13. In the sharing of behavior information described above, by applying operations on the user interface to a user object 1015 displayed in the virtual space 1013, it is possible to execute operations on each user's information or on each real user. Below, such operations on objects are described more concretely.
Next, grouping of users in some embodiments of the present disclosure will be described with reference to FIGS. 14 to 18. In the sharing of behavior information described above, it is possible to display a virtual space 1013 for each group of users and to place, in each virtual space 1013, the user objects 1015 representing the users classified into that group. Below, an example in which the grouping of users is executed on a user interface including the virtual space 1013 is described more concretely.
Next, the hardware configuration of an information processing apparatus according to an embodiment of the present disclosure will be described with reference to FIG. 19. FIG. 19 is a block diagram for explaining the hardware configuration of the information processing apparatus. The illustrated information processing apparatus 900 can realize, for example, the terminal device or the server in the embodiments described above.
Embodiments of the present disclosure may include, for example, the information processing apparatus (a terminal device or a server) described above, a system, an information processing method executed by the information processing apparatus or the system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
(1) An information processing apparatus including:
a first information acquisition unit configured to acquire first information indicating behavior of one or more users;
a second information acquisition unit configured to acquire second information regarding the one or more users, the second information being different from the first information; and
a display control unit configured to cause a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
(2) The information processing apparatus according to (1), wherein the display control unit combines the second information regarding each of the one or more users to set the virtual space common to the user objects respectively representing the one or more users.
(3) The information processing apparatus according to (2), wherein the second information acquisition unit acquires the second information obtained by statistically processing information indicating the behavior of the one or more users.
(4) The information processing apparatus according to (3), wherein the second information acquisition unit acquires the second information indicating a behavior pattern of the one or more users.
(5) The information processing apparatus according to (4), wherein the display control unit temporally changes part or all of the virtual space based on the behavior pattern.
(6) The information processing apparatus according to any one of (2) to (5), wherein the second information acquisition unit acquires the second information indicating an attribute of the one or more users.
(7) The information processing apparatus according to (6), wherein the second information acquisition unit acquires the second information generated based on an age, an occupation, or a product purchase history of the one or more users.
(8) The information processing apparatus according to any one of (1) to (7), wherein the display control unit configures the virtual space for each group into which the one or more users are classified, and places each user object in the virtual space corresponding to the group into which that user is classified.
(9) The information processing apparatus according to (8), wherein, for a user not classified into a group, the display control unit causes the display unit to display text or an image indicating a group into which the user is recommended to be classified.
(10) The information processing apparatus according to (8) or (9), wherein, for a user not classified into a group, the display control unit displays, as a preview, a change in the virtual space when the user is placed in the virtual space, thereby indicating whether the group corresponding to the virtual space is a group into which that user is recommended to be classified.
(11) The information processing apparatus according to any one of (1) to (10), wherein the display control unit indicates, by the display of the user objects, a behavior status obtained by classifying each user's behavior according to a predetermined criterion.
(12) The information processing apparatus according to (11), wherein the display control unit indicates each user's behavior status by the shape or movement of the user object.
(13) The information processing apparatus according to (11) or (12), wherein the display control unit indicates each user's behavior status by placing a container object corresponding to the behavior status in the virtual space and displaying the user object inside the container object.
(14) The information processing apparatus according to (13), wherein the display control unit displays the container object based on the behavior pattern of the one or more users.
(15) The information processing apparatus according to (14), wherein the display control unit displays the container object regardless of whether the one or more users are acting according to the behavior pattern.
(16) The information processing apparatus according to any one of (13) to (15), wherein the display control unit changes the number or size of the container objects according to the number of users having the behavior status corresponding to the container object.
(17) The information processing apparatus according to any one of (11) to (16), wherein the display control unit changes the criterion for classifying the behavior status according to the virtual space.
(18) A system including a terminal device and one or more server devices that provide services to the terminal device, wherein the terminal device and the one or more server devices cooperate to provide:
a function of acquiring first information indicating behavior of one or more users;
a function of acquiring second information regarding the one or more users, the second information being different from the first information; and
a function of causing a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
(19) An information processing method including:
acquiring first information indicating behavior of one or more users;
acquiring second information regarding the one or more users, the second information being different from the first information; and
causing a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
(20) A program for causing a computer to realize:
a function of acquiring first information indicating behavior of one or more users;
a function of acquiring second information regarding the one or more users, the second information being different from the first information; and
a function of causing a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
100, 300, 500  terminal device
200, 400, 600  server
110  sensor
120  behavior recognition unit
130, 530  communication unit
140  control unit
150  UI
Claims (20)
- An information processing apparatus comprising: a first information acquisition unit configured to acquire first information indicating behavior of one or more users; a second information acquisition unit configured to acquire second information regarding the one or more users, the second information being different from the first information; and a display control unit configured to cause a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
- The information processing apparatus according to claim 1, wherein the display control unit combines the second information regarding each of the one or more users to set the virtual space common to the user objects respectively representing the one or more users.
- The information processing apparatus according to claim 2, wherein the second information acquisition unit acquires the second information obtained by statistically processing information indicating the behavior of the one or more users.
- The information processing apparatus according to claim 3, wherein the second information acquisition unit acquires the second information indicating a behavior pattern of the one or more users.
- The information processing apparatus according to claim 4, wherein the display control unit temporally changes part or all of the virtual space based on the behavior pattern.
- The information processing apparatus according to claim 2, wherein the second information acquisition unit acquires the second information indicating an attribute of the one or more users.
- The information processing apparatus according to claim 6, wherein the second information acquisition unit acquires the second information generated based on an age, an occupation, or a product purchase history of the one or more users.
- The information processing apparatus according to claim 1, wherein the display control unit configures the virtual space for each group into which the one or more users are classified, and places each user object in the virtual space corresponding to the group into which that user is classified.
- The information processing apparatus according to claim 8, wherein, for a user not classified into a group, the display control unit causes the display unit to display text or an image indicating a group into which the user is recommended to be classified.
- The information processing apparatus according to claim 8, wherein, for a user not classified into a group, the display control unit displays, as a preview, a change in the virtual space when the user is placed in the virtual space, thereby indicating whether the group corresponding to the virtual space is a group into which that user is recommended to be classified.
- The information processing apparatus according to claim 1, wherein the display control unit indicates, by the display of the user objects, a behavior status obtained by classifying each user's behavior according to a predetermined criterion.
- The information processing apparatus according to claim 11, wherein the display control unit indicates each user's behavior status by the shape or movement of the user object.
- The information processing apparatus according to claim 11, wherein the display control unit indicates each user's behavior status by placing a container object corresponding to the behavior status in the virtual space and displaying the user object inside the container object.
- The information processing apparatus according to claim 13, wherein the display control unit displays the container object based on the behavior pattern of the one or more users.
- The information processing apparatus according to claim 14, wherein the display control unit displays the container object regardless of whether the one or more users are acting according to the behavior pattern.
- The information processing apparatus according to claim 13, wherein the display control unit changes the number or size of the container objects according to the number of users having the behavior status corresponding to the container object.
- The information processing apparatus according to claim 11, wherein the display control unit changes the criterion for classifying each user's behavior according to the virtual space.
- A system comprising a terminal device and one or more server devices that provide services to the terminal device, wherein the terminal device and the one or more server devices cooperate to provide: a function of acquiring first information indicating behavior of one or more users; a function of acquiring second information regarding the one or more users, the second information being different from the first information; and a function of causing a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
- An information processing method comprising: acquiring first information indicating behavior of one or more users; acquiring second information regarding the one or more users, the second information being different from the first information; and causing a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
- A program for causing a computer to realize: a function of acquiring first information indicating behavior of one or more users; a function of acquiring second information regarding the one or more users, the second information being different from the first information; and a function of causing a display unit to display user objects configured based on the first information and each representing the one or more users, and a virtual space configured based on the second information, in which the user objects are placed.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201480011894.2A CN105190513B (zh) | 2013-03-08 | 2014-01-08 | 信息处理设备、系统、信息处理方法和程序 |
US14/763,603 US10969924B2 (en) | 2013-03-08 | 2014-01-08 | Information processing apparatus, method, and non-transitory computer readable medium that controls a representation of a user object in a virtual space |
JP2015504190A JP6254577B2 (ja) | 2013-03-08 | 2014-01-08 | 情報処理装置、システム、情報処理方法およびプログラム |
EP14760654.5A EP2966557A4 (en) | 2013-03-08 | 2014-01-08 | INFORMATION PROCESSING DEVICE, SYSTEM, INFORMATION PROCESSING AND PROGRAM |
US17/198,277 US20210200423A1 (en) | 2013-03-08 | 2021-03-11 | Information processing apparatus, method, and non-transitory computer readable medium that controls a representation of a user object in a virtual space |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-047040 | 2013-03-08 | ||
JP2013047040 | 2013-03-08 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/763,603 A-371-Of-International US10969924B2 (en) | 2013-03-08 | 2014-01-08 | Information processing apparatus, method, and non-transitory computer readable medium that controls a representation of a user object in a virtual space |
US17/198,277 Continuation US20210200423A1 (en) | 2013-03-08 | 2021-03-11 | Information processing apparatus, method, and non-transitory computer readable medium that controls a representation of a user object in a virtual space |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014136466A1 true WO2014136466A1 (ja) | 2014-09-12 |
Family
ID=51490998
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/050108 WO2014136466A1 (ja) | 2013-03-08 | 2014-01-08 | 情報処理装置、システム、情報処理方法およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (2) | US10969924B2 (ja) |
EP (1) | EP2966557A4 (ja) |
JP (1) | JP6254577B2 (ja) |
CN (1) | CN105190513B (ja) |
WO (1) | WO2014136466A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150091891A1 (en) * | 2013-09-30 | 2015-04-02 | Dumedia, Inc. | System and method for non-holographic teleportation |
EP3633497A4 (en) * | 2017-05-24 | 2020-04-08 | Sony Corporation | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING PROCESS AND PROGRAM |
US11803764B2 (en) * | 2017-09-29 | 2023-10-31 | Sony Interactive Entertainment Inc. | Mobile and autonomous personal companion based on an artificial intelligence (AI) model for a user |
KR20190106950A (ko) * | 2019-08-31 | 2019-09-18 | 엘지전자 주식회사 | 지능형 디바이스 및 그 제어 방법 |
US10987592B1 (en) | 2020-06-05 | 2021-04-27 | 12traits, Inc. | Systems and methods to correlate user behavior patterns within an online game with psychological attributes of users |
US11206263B1 (en) | 2021-01-25 | 2021-12-21 | 12traits, Inc. | Systems and methods to determine content to present based on interaction information of a given user |
US11616701B2 (en) * | 2021-02-22 | 2023-03-28 | Cisco Technology, Inc. | Virtual proximity radius based web conferencing |
US11727424B2 (en) | 2021-06-04 | 2023-08-15 | Solsten, Inc. | Systems and methods to correlate user behavior patterns within digital application environments with psychological attributes of users to determine adaptations to the digital application environments |
US11654371B2 (en) * | 2021-07-30 | 2023-05-23 | Sony Interactive Entertainment LLC | Classification of gaming styles |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006345269A (ja) * | 2005-06-09 | 2006-12-21 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP2010134802A (ja) | 2008-12-05 | 2010-06-17 | Sony Corp | 情報処理装置、及び情報処理方法 |
JP2011081431A (ja) | 2009-10-02 | 2011-04-21 | Sony Corp | 行動パターン解析システム、携帯端末、行動パターン解析方法、及びプログラム |
JP2013008232A (ja) * | 2011-06-24 | 2013-01-10 | Sony Corp | 情報処理装置とサーバと情報処理システムおよび情報処理方法とプログラム |
JP2013020587A (ja) * | 2011-07-14 | 2013-01-31 | Nec Corp | 情報処理システム、ユーザの行動促進方法、情報処理装置及びその制御方法と制御プログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090106672A1 (en) * | 2007-10-18 | 2009-04-23 | Sony Ericsson Mobile Communications Ab | Virtual world avatar activity governed by person's real life activity |
US8531447B2 (en) * | 2008-04-03 | 2013-09-10 | Cisco Technology, Inc. | Reactive virtual environment |
US8639636B2 (en) * | 2008-08-15 | 2014-01-28 | At&T Intellectual Property I, L.P. | System and method for user behavior modeling |
US9024977B2 (en) * | 2010-08-02 | 2015-05-05 | International Business Machines Corporation | Resizing objects in regions of virtual universes |
US20120253489A1 (en) | 2011-03-28 | 2012-10-04 | Dugan Brian M | Systems and methods for fitness and video games |
-
2014
- 2014-01-08 EP EP14760654.5A patent/EP2966557A4/en not_active Ceased
- 2014-01-08 CN CN201480011894.2A patent/CN105190513B/zh active Active
- 2014-01-08 US US14/763,603 patent/US10969924B2/en active Active
- 2014-01-08 JP JP2015504190A patent/JP6254577B2/ja active Active
- 2014-01-08 WO PCT/JP2014/050108 patent/WO2014136466A1/ja active Application Filing
-
2021
- 2021-03-11 US US17/198,277 patent/US20210200423A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP2966557A4 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016082482A (ja) * | 2014-10-20 | 2016-05-16 | シャープ株式会社 | 画像記録装置 |
WO2021186755A1 (ja) | 2020-03-17 | 2021-09-23 | ソニーグループ株式会社 | 情報処理装置および情報処理方法、並びにプログラム |
KR20220154103A (ko) | 2020-03-17 | 2022-11-21 | 소니그룹주식회사 | 정보 처리 장치 및 정보 처리 방법, 그리고 프로그램 |
JP7505113B2 (ja) | 2021-03-29 | 2024-06-24 | 京セラ株式会社 | ウェアラブル端末装置、プログラムおよび表示方法 |
WO2023218859A1 (ja) * | 2022-05-09 | 2023-11-16 | ユニ・チャーム株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
Also Published As
Publication number | Publication date |
---|---|
CN105190513B (zh) | 2019-03-01 |
EP2966557A4 (en) | 2016-10-19 |
CN105190513A (zh) | 2015-12-23 |
JP6254577B2 (ja) | 2017-12-27 |
US20210200423A1 (en) | 2021-07-01 |
JPWO2014136466A1 (ja) | 2017-02-09 |
US10969924B2 (en) | 2021-04-06 |
EP2966557A1 (en) | 2016-01-13 |
US20150365449A1 (en) | 2015-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6254577B2 (ja) | 情報処理装置、システム、情報処理方法およびプログラム | |
US10990613B2 (en) | Information processing apparatus and information processing method | |
US10972562B2 (en) | Information processing apparatus, information processing method, and program | |
JP2013200793A (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP6443340B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP2014222439A (ja) | 情報処理装置、パーツ生成利用方法及びプログラム | |
CN104115180A (zh) | 信息处理设备、信息处理方法和程序 | |
JP2014153818A (ja) | 情報処理装置、情報処理方法、およびプログラム | |
Paterson et al. | The world through Glass: developing novel methods with wearable computing for urban videographic research | |
WO2015190141A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
WO2016158003A1 (ja) | 情報処理装置、情報処理方法及びコンピュータプログラム | |
JP6233412B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
US20150121307A1 (en) | Information processing device, information processing method, and program | |
JP7336780B1 (ja) | プログラム、方法、情報処理装置、システム | |
JP6096341B1 (ja) | 表示制御方法、端末、プログラム、及び情報処理装置 | |
WO2016147693A1 (ja) | 情報処理装置、情報処理方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480011894.2 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14760654 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015504190 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14763603 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014760654 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |