WO2024101001A1 - Information processing system, information processing method, and program for communication points regarding events - Google Patents


Info

Publication number
WO2024101001A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
communication
virtual space
room
information
Prior art date
Application number
PCT/JP2023/034046
Other languages
French (fr)
Inventor
Ryo Fukazawa
Takeshi Onodera
Yukio YAKUSHIJIN
Masayuki Inoue
Kenichiro SHIROTA
Motohiro ENDO
Wataru Yoshida
Masaki OSHIRO
Chihiro Tanaka
Hibiki SUDO
Kanta NAKANO
Original Assignee
Sony Group Corporation
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation filed Critical Sony Group Corporation
Publication of WO2024101001A1 publication Critical patent/WO2024101001A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H04N7/157 Conference systems defining a virtual conference space and using avatars or agents
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 Game security or game management aspects
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/795 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85 Providing additional services to players
    • A63F13/87 Communicating with other players during game play, e.g. by e-mail or chat
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/02 Details
    • H04L12/16 Arrangements for providing special services to substations
    • H04L12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1822 Conducting the conference, e.g. admission, detection, selection or grouping of participants, correlating users to one or more conference sessions, prioritising transmission
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/02 Details
    • H04L12/16 Arrangements for providing special services to substations
    • H04L12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1827 Network arrangements for conference optimisation or adaptation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40 Support for services or applications
    • H04L65/403 Arrangements for multi-party communication, e.g. for conferences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41415 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Definitions

  • the present disclosure relates to an information processing system, an information processing method, and a program.
  • a system capable of performing communication through an avatar operated by each user in a virtual space in which 3D models are disposed has become widespread.
  • a video from any viewpoint (free viewpoint) according to a user operation is generated and provided to the user as the video of the virtual space.
  • the video of the virtual space is provided using a display device such as a head mounted display (HMD) that covers the entire field of view of the user, a smartphone, a tablet terminal, or a personal computer (PC).
  • PTL 1 discloses a technology for establishing communication between communication terminals used by respective users in a case where scheduled users gather at a predetermined place and a predetermined time in the virtual space.
  • the present disclosure proposes an information processing system, an information processing method, and a program capable of further enhancing convenience of communication between some users at a predetermined place in a virtual space.
  • an information processing apparatus including: circuitry configured to: provide information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and provide information regarding the communication based on the avatar entering a second area located outside the first area, the event being a subject of conversation in the communication room.
  • an information processing method including: providing information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and providing information regarding the communication based on the avatar entering a second area located outside the first area, the event being a subject of conversation in the communication room.
  • an information processing apparatus including: circuitry configured to: provide, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, information regarding selection of entry to a communication room being provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed, wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and the event being a subject of conversation in the communication room.
  • an information processing method including: providing, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, information regarding selection of entry to a communication room being provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed, information regarding the communication being provided based on the avatar entering a second area located outside the first area, and the event being a subject of conversation in the communication room.
  • Fig. 1 is a diagram illustrating an overall configuration of a virtual space provision system according to an embodiment of the present disclosure.
  • Fig. 2 is a diagram for explaining each space included in a virtual space S.
  • Fig. 3 is a diagram for explaining a space by genre included in a sport space S3 according to an embodiment of the present disclosure.
  • Fig. 4 is a block diagram illustrating an example of a configuration of a virtual space management server 200 according to an embodiment of the present disclosure.
  • Fig. 5 is a diagram illustrating a specific example of service servers included in a backend system 30 according to an embodiment of the present disclosure.
  • Fig. 6 is a block diagram showing an example of a configuration of a client terminal 10 according to an embodiment of the present disclosure.
  • Fig. 7 is a sequence diagram illustrating an example of a flow of connection processing to a virtual space according to an embodiment of the present disclosure.
  • Fig. 8 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure.
  • Fig. 9 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure.
  • Fig. 10 is a diagram illustrating an example of a simple map of an IP content specific area according to an embodiment of the present disclosure.
  • Fig. 11 is a diagram illustrating a mechanism of game reproduction using bone data of a player according to an embodiment of the present disclosure.
  • Fig. 12 is a diagram illustrating a display screen inside a bar on which a talking table 600 is disposed according to an embodiment of the present disclosure.
  • Fig. 13 is a diagram for explaining an example of the shape of the talking table 600.
  • Fig. 14 is a diagram for describing a case where a state in a talking room is disclosed according to a distance to the talking table 600 according to an embodiment of the present disclosure.
  • Fig. 15 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar in a case where the position in the virtual space of the user avatar operated by the user is included in the area E2.
  • Fig. 16 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar in a case where the position in the virtual space of the user avatar operated by the user is included in the area E1.
  • Fig. 17 is a diagram illustrating an example of effect display of the talking table 600 according to the excitement in the talking room according to an embodiment of the present disclosure.
  • Fig. 18 is a sequence diagram illustrating an example of operation processing from generation of the talking table 600 to entry according to an embodiment of the present disclosure.
  • Fig. 19 is a diagram illustrating details of communication connection between a client terminal 10 and each server at the time of entering the talking room according to an embodiment of the present disclosure.
  • Fig. 20 is a view illustrating an example of a talking room screen according to an embodiment of the present disclosure.
  • Fig. 21 is a diagram illustrating another example of the talking room screen according to an embodiment of the present disclosure.
  • Fig. 22 is a diagram for explaining a mechanism of event coordination according to an embodiment of the present disclosure.
  • Fig. 23 is a diagram illustrating an example of a display screen in a live stadium according to an embodiment of the present disclosure.
  • Fig. 24 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
  • Fig. 25 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
  • Fig. 26 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
  • Fig. 27 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
  • Fig. 28 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
  • Fig. 29 is a diagram illustrating an example of a camera position according to an embodiment of the present disclosure.
  • Fig. 30 is a diagram illustrating a game reproduction UI in a case of a camera position 810: Basic illustrated in Fig. 29.
  • Fig. 31 is a diagram illustrating a game reproduction UI in a case of a camera position 811: Bird's eye illustrated in Fig. 29.
  • Fig. 32 is a diagram for describing a screen outline of a game reproduction UI according to an embodiment of the present disclosure.
  • Fig. 33 is a diagram illustrating a display example of a chapter list according to an embodiment of the present disclosure.
  • Fig. 34 is a diagram illustrating a display example of a game reproduction UI in the case of an avatar mode according to an embodiment of the present disclosure.
  • Fig. 35 is a diagram for explaining a highlight playback of a game according to an embodiment of the present disclosure.
  • Fig. 1 is a diagram illustrating an overall configuration of a virtual space provision system according to an embodiment of the present disclosure.
  • the virtual space provision system (an example of an information processing system) according to an embodiment of the present disclosure is an information processing system including one or more virtual space management servers 200 (an example of an information processing device), a client terminal 10 (an example of a terminal) used by each user, and a backend system 30.
  • the virtual space management server 200 constructs (generates) and manages a virtual space (VR) in which a virtual object is disposed, and provides virtual space information to one or more client terminals 10 communicably connected via a network 40.
  • the virtual space information includes, for example, information regarding a virtual object disposed in the virtual space, map information regarding the virtual space, information regarding an event (hereinafter, also referred to as a virtual event) taking place in the virtual space, positional information regarding each user in the virtual space, and the like.
  • the virtual space can be reproduced (copied) by each client terminal 10 on the basis of these pieces of information.
  • each client terminal 10 arranges the virtual object on the basis of the map information regarding the virtual space, and draws the virtual space video from the user viewpoint (user avatar viewpoint).
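The client-side reproduction described above can be illustrated with a minimal sketch. This is a hypothetical example, not the disclosed implementation: the `reproduce_scene` function, the map-information format, and the use of a view radius for culling are all assumptions made for illustration.

```python
import math

def reproduce_scene(map_info, avatar_pos, view_radius=50.0):
    """Hypothetical client-side step: arrange objects per the map information
    received from the virtual space management server, then select what is
    drawable from the user avatar's viewpoint.

    map_info: {object_id: (x, y, z)} positions in virtual space coordinates.
    Returns the object ids the client would draw, nearest first.
    """
    visible = []
    for obj_id, (x, y, z) in map_info.items():
        dist = math.dist((x, y, z), avatar_pos)
        if dist <= view_radius:            # simple distance-based culling
            visible.append((dist, obj_id))
    return [obj_id for _, obj_id in sorted(visible)]
```

A client would run such a step each frame, feeding the surviving object ids to its renderer.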
  • the virtual space management server 200 may generate a video from any viewpoint (for example, a user viewpoint) in the virtual space and transmit the video to the client terminal 10.
  • in the virtual space, for example, a virtual object formed by a 3DCG model is disposed.
  • the virtual space may be a space in which an action of the user is limited to a behavior scenario prepared in advance like a game, or may be a space in which the user can freely act without being limited to the behavior scenario.
  • the virtual space may be a space in which an interaction with another user can be performed by bidirectionally transmitting and receiving an audio, a character, or the like.
  • the virtual space may be a space designed according to the original world view, or may be a space called a so-called mirror world that faithfully reproduces buildings, facilities, roads, cities, streets, or the like existing in the real space.
  • in the virtual space, interaction with other users, education, social activities such as work, and economic activities such as the sale or purchase of products can also be performed.
  • the virtual space may be a space called a metaverse.
  • in the virtual space, there may be a user avatar that is a virtual self of the user.
  • the user avatar is generated by, for example, a character 3D model. Each user can arbitrarily customize the appearance of his/her character 3D model.
  • the user avatar may be operated by the user and moved in the virtual space.
  • a gesture or a facial expression of the user avatar may also be operated by the user.
  • the positional information regarding the user in the virtual space may be information indicating the position of the user avatar.
  • the user viewpoint in the virtual space may be a viewpoint of the user avatar or a viewpoint including the user avatar within the view angle. The user viewpoint can be arbitrarily switched by a user operation.
  • the user avatar is one of the virtual objects disposed in the virtual space.
  • the information regarding the user avatar of each user is also shared as virtual space information by a large number of client terminals 10 communicably connected to the virtual space management server 200.
  • each user can communicate by an audio, a character, or an image in the virtual space.
  • the communication may be performed via the virtual space management server 200 or via the backend system 30.
  • the backend system 30 includes servers (service servers 31, 32, 33, ...) that provide various services to users who use the virtual space. Examples thereof include a server that provides audio communication, a server that provides character communication, and a server that provides image (moving image and still image) communication.
  • the client terminal 10 acquires communication channel information (an example of connection information) from the virtual space management server 200, and is communicably connected to a predetermined service server. Then, the client terminal 10 acquires communication channel information via the service server and can communicate with another client terminal 10 communicably connected to a predetermined service server.
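The two-step connection described above (obtain channel information from the management server, then connect to the designated service server) can be sketched as follows. The class names, the `ChannelInfo` fields, and the method signatures are illustrative assumptions; the disclosure does not define a concrete API.

```python
from dataclasses import dataclass

@dataclass
class ChannelInfo:
    """Hypothetical connection information returned by the management server."""
    service: str     # e.g. "voice", "text", or "image" communication
    server_url: str  # address of the service server in the backend system 30
    channel_id: str  # channel to join on that service server

class ClientTerminal:
    def __init__(self, management_server, backend):
        self.management_server = management_server
        self.backend = backend

    def join_communication(self, room_id: str):
        # 1. Acquire communication channel information (an example of
        #    connection information) from the virtual space management server.
        info: ChannelInfo = self.management_server.get_channel_info(room_id)
        # 2. Connect to the designated service server; other client terminals
        #    joined to the same channel become reachable over this connection.
        return self.backend.connect(info.server_url, info.channel_id)
```

In this sketch the management server only brokers connection information; the actual audio, character, or image traffic flows through the service server.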
  • a virtual space system 20 includes a plurality of virtual space management servers 200 (200a, 200b, 200c, ...).
  • virtual space management servers 200, each connectable to 100 terminals, are prepared, and each virtual space management server 200 provides the same virtual space (the same virtual objects and the same map information).
  • Each virtual space management server 200 manages positional information (the positional information regarding those 100 persons) regarding the users corresponding to the 100 terminals connected to it for communication. As a result, the same virtual space can be provided to a total of 1000 users, in groups of 100 users. In each virtual space, coarse elements such as the flow of time and the execution of virtual events are synchronized, but interaction between users is performed separately within each virtual space.
  • server increase/decrease management may be performed by a service provided by the backend system 30.
  • the backend system 30 may construct a new virtual space management server 200 that provides the same virtual space as the one virtual space management server 200 (the same virtual object and the same map information). Note that a predetermined number of virtual space management servers 200 may be constructed in advance.
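The per-server capacity and on-demand construction described above can be sketched as a simple server pool. This is a hypothetical illustration: the `ServerPool` class and its assignment policy are assumptions; the disclosure only states that each server hosts the same virtual space for a limited number of terminals (100 in the example) and that the backend system may construct new servers.

```python
CAPACITY = 100  # terminals per virtual space management server, per the example

class ServerPool:
    """Hypothetical sketch of server increase/decrease management by the
    backend system 30: reuse a server with spare capacity, or construct a
    new one providing the same virtual space when all are full."""

    def __init__(self):
        self.servers = []  # each entry: list of user ids on that server

    def assign(self, user_id: str) -> int:
        # Reuse the first management server that still has room.
        for index, users in enumerate(self.servers):
            if len(users) < CAPACITY:
                users.append(user_id)
                return index
        # All servers full: construct a new virtual space management server
        # instance (same virtual objects, same map information).
        self.servers.append([user_id])
        return len(self.servers) - 1
```

With this policy, 1000 connecting users end up spread across ten server instances of the same virtual space, matching the example above.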
  • the server environment is not particularly limited.
  • the virtual space management server 200 may be a physical server or a virtual server executed on the physical server.
  • the physical server or the virtual server described here may be a server provided by a hosting service, or may be an own server prepared by a business operator who provides a service for providing a virtual space.
  • the function of one virtual space management server 200 may be realized by a plurality of physical servers or a plurality of virtual servers.
  • Each virtual space management server 200 manages the positional information regarding the users corresponding to the plurality of client terminals 10 connected for communication.
  • the client terminal 10 is a terminal used by the user.
  • the client terminal 10 is realized by a smartphone, a tablet terminal, a personal computer (PC), a head mounted display (HMD) covering the entire field of view, a glasses-type device, a projector, a console game machine, or the like.
  • the client terminal 10 reproduces the virtual space on the basis of the information regarding the virtual space received from the virtual space management server 200.
  • the client terminal 10 generates a video from the user viewpoint in the virtual space, and displays and outputs the video.
  • the client terminal 10 also outputs an audio in the virtual space as appropriate.
  • the client terminal 10 may receive the video from the user viewpoint generated by the virtual space management server 200 from the virtual space management server 200 and display and output the video.
  • the client terminal 10 can control the virtual object disposed in the virtual space according to the user operation. For example, the client terminal 10 controls the position, the gesture, the facial expression, and the like of the user avatar according to the user operation. In addition, the client terminal 10 transmits information (for example, the position, the gesture, the facial expression, and the like of the user avatar) regarding the virtual object changed according to the user operation to the virtual space management server 200 in real time.
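The real-time update described above (position, gesture, and facial expression sent to the virtual space management server) might be serialized as in the following sketch. The field names and the JSON encoding are illustrative assumptions; the disclosure does not define a message format.

```python
import json
import time

def make_avatar_update(user_id, position, gesture, expression):
    """Serialize one avatar state change for transmission to the virtual
    space management server in real time (hypothetical message layout)."""
    return json.dumps({
        "user": user_id,
        "position": list(position),  # (x, y, z) coordinates in the virtual space
        "gesture": gesture,          # e.g. "wave"
        "expression": expression,    # e.g. "smile"
        "timestamp": time.time(),    # lets receivers order updates
    })
```

The server would relay such messages to the other client terminals sharing the space so that every reproduced copy shows the same avatar state.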
  • the client terminal 10 may transmit various types of information to the virtual space management server 200, another client terminal 10, or the backend system 30 according to a user operation.
  • the client terminal 10 may transmit login information to the virtual space management server 200 or the backend system 30 to request user authentication.
  • the client terminal 10 may transmit the audio of the user, the input character information, the selected image, and the like as the communication information to the virtual space management server 200, another client terminal 10, or the backend system 30.
  • As an example of the virtual space S, in an embodiment of the present disclosure, there is a space for users to interact with each other centering on various kinds of content (including IP content to be described later) such as movies, music, sports, or animation.
  • Fig. 2 is a diagram for explaining each space included in a virtual space S.
  • the virtual space S provided in the virtual space system 20 includes a user personal space S1 which is a virtual space for an individual user, a public space S2 which is a virtual space in which users interact with each other regardless of the content, and each virtual space in which users interact with each other regarding each content.
  • examples of the virtual spaces in which users interact with each other regarding each content include a sport space S3 that is a virtual space in which users interact with each other regarding sports, a music space S4 that is a virtual space in which users interact with each other regarding music, a movie space S5 that is a virtual space in which users interact with each other regarding movies, and an animation space S6 that is a virtual space in which users interact with each other regarding animation.
  • the content mentioned here is an example, and the content according to the present disclosure is not limited thereto.
  • each virtual space in which users interact with each other regarding each content may be a virtual space for each genre that is further hierarchized.
  • Fig. 3 is a diagram for explaining a space by genre included in the sport space S3 according to an embodiment of the present disclosure. As illustrated in Fig. 3, examples thereof include a soccer space S3-1 in which users particularly interact with each other regarding soccer, a basketball space S3-2 in which users particularly interact with each other regarding basketball, and a tennis space S3-3 in which users particularly interact with each other regarding tennis. Furthermore, an example of a virtual space in a lower hierarchy of the space by genre of each sport illustrated in Fig. 3 may be a virtual space in which users (fans) of each sport team interact with each other.
  • the virtual space is not limited to being hierarchized by genre.
  • the virtual space may be a space in which each space corresponding to various IP content exists in parallel.
  • the virtual space according to an embodiment of the present disclosure may be a virtual space in which a space corresponding to each of a certain baseball team, a certain soccer team, and a certain animation exists in parallel.
  • as the virtual space by content, a virtual space for users who are interested in the content to interact with each other can be considered.
  • the user can have a conversation about the content with another user having the same preference and the same interest and enjoy the content more.
  • the virtual space system 20 may provide all or some of the virtual spaces as illustrated in Figs. 2 and 3.
  • the virtual space system 20 may prepare the user personal space S1, the public space S2, and a virtual space for interaction between specific sport team fans.
  • the virtual space system 20 provides a virtual space for interaction between fans of a specific soccer team.
  • Fig. 4 is a block diagram illustrating an example of a configuration of a virtual space management server 200 according to an embodiment of the present disclosure. As illustrated in Fig. 4, the virtual space management server 200 includes a communication unit 210, a control unit 220, and a storage unit 230.
  • the communication unit 210 transmits and receives data to and from an external device in a wired or wireless manner.
  • the communication unit 210 is communicably connected to each of the client terminal 10 and the backend system 30 by using, for example, a wired/wireless local area network (LAN), a Wi-Fi (registered trademark), a Bluetooth (registered trademark), a mobile communication network (long term evolution (LTE), fourth generation mobile communication system (4G), fifth generation mobile communication system (5G)), or the like.
  • The control unit 220 functions as an arithmetic processing device and a control device, and controls the overall operation in the virtual space management server 200 according to various programs.
  • the control unit 220 is realized by an electronic circuit such as a central processing unit (CPU) and a microprocessor, for example.
  • the control unit 220 may include a read only memory (ROM) that stores programs, operation parameters, and the like to be used, and a random access memory (RAM) that temporarily stores parameters and the like that change appropriately.
  • the control unit 220 appropriately performs processing on the basis of data received from an external device, and controls storage in the storage unit 230 and transmission of data to the external device.
  • the control unit 220 also functions as a virtual space management unit 221, a content management unit 222, and a user management unit 223.
  • the virtual space management unit 221 manages the virtual space. Specifically, the virtual space management unit 221 performs generation, acquisition, addition, update, and the like of various types of information for constructing (generating) the virtual space.
  • the various types of information are virtual space information such as, for example, information regarding a 3DCG model which is a virtual object disposed in the virtual space, color settings, an image, map information, virtual event information, sound effects, and background music (BGM).
  • the virtual object includes a user-operable object, a user-inoperable object, a background image, a staging effect, and the like.
  • the virtual object also includes information regarding each user avatar disposed in the virtual space. Examples of the information regarding each user avatar include appearance information regarding the user avatar, real time user positional information, real time facial expression or gesture information regarding the user avatar, and a profile of the corresponding user.
  • the information regarding each user avatar is managed by the user management unit 223 described later.
  • the virtual event information includes information regarding time, progress, performance, and the like of the virtual event.
  • the virtual space management unit 221 may notify the client terminal 10 that a predetermined virtual event is performed at a preset time, or may notify the client terminal 10 that a virtual event is performed in a case where a predetermined trigger is detected in the virtual space.
  • the virtual event information includes information necessary for executing the virtual event.
  • the virtual event may be a content.
  • the notification of the virtual event to the client terminal 10 may include information instructing to acquire predetermined content from the backend system 30 and execute the content at a predetermined place in the virtual space.
  • the virtual space management unit 221 appropriately transmits the virtual space information to the client terminal 10. Furthermore, the virtual space management unit 221 can also realize communication between users sharing the virtual space by transmitting communication information such as a character, an audio, or an image received from the client terminal 10 to another client terminal 10.
  • the virtual space management unit 221 receives the information regarding the user operation from the client terminal 10 to transmit the information as the virtual space information to the other client terminals 10 in real time. In addition, the virtual space management unit 221 performs the processing of calculating the collision detection according to the information regarding the user operation received from the client terminal 10 to transmit the calculation result to each client terminal 10 at appropriate times. As a result, the statuses of the virtual spaces reproduced by the respective client terminals 10 are synchronized.
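The relaying of user operations described above can be illustrated by a minimal sketch. All class and method names here (VirtualSpaceServer, Operation, submit) are illustrative assumptions, not part of the disclosed system; the sketch only shows the broadcast pattern in which an operation received from one client is forwarded to every other client so that the locally reproduced virtual spaces stay synchronized.

```python
from dataclasses import dataclass

@dataclass
class Operation:
    user_id: str
    kind: str      # e.g. "move", "jump", "emote", "stamp"
    payload: dict

class VirtualSpaceServer:
    def __init__(self):
        # user_id -> inbox of operations waiting to be delivered
        self.clients: dict[str, list[Operation]] = {}

    def connect(self, user_id: str) -> None:
        self.clients[user_id] = []

    def submit(self, op: Operation) -> None:
        # Relay the operation to every client except the sender so that
        # each terminal can reflect it in its local copy of the space.
        for user_id, inbox in self.clients.items():
            if user_id != op.user_id:
                inbox.append(op)

server = VirtualSpaceServer()
server.connect("alice")
server.connect("bob")
server.submit(Operation("alice", "jump", {}))
print(len(server.clients["bob"]))    # bob received alice's jump
print(len(server.clients["alice"]))  # the sender does not receive its own operation
```

In a real deployment the inboxes would be network sessions and collision-detection results would be merged into the relayed state, but the fan-out structure is the same.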
  • the information regarding the user operation may include operations related to a user avatar movement, such as moving and jumping of the user avatar, and operations related to an emotional expression of the user avatar, such as emotes that cause the user avatar to perform a pose, a gesture, or the like, and a stamp displayed above the head of the user avatar.
  • the content management unit 222 manages information regarding the content shared in the virtual space. For example, the content management unit 222 manages information indicating the content currently shared in a content sharing area provided in the virtual space.
  • the content sharing area may be, for example, a virtual object of one or more large displays disposed in a virtual space, or may be a stage, a forum, a field, or the like on which a 3D content or the like is reproduced.
  • the content itself may be transmitted from the backend system 30 to the client terminal 10.
  • Such content management may be managed by the virtual space management unit 221 as one of the virtual event information described above.
  • the user management unit 223 manages information regarding a user who uses the virtual space provided by the virtual space management server 200. More specifically, the user management unit 223 manages information regarding a user corresponding to the client terminal 10 to be connected for communication.
  • the information regarding the user includes, for example, profile of the user (user name, user ID, icon image, and the like), appearance information regarding the user avatar (image, character type, component, or the like), status information regarding the user avatar, information regarding a virtual item owned by the user, information regarding virtual currency or points available in the virtual space by the user, real time user positional information in the virtual space (that is, positional information regarding the user avatar), and motion information about a real time gesture, a facial expression, a posture, or a motion state of the user avatar.
  • the information regarding the user is transmitted from the client terminal 10. Furthermore, part of the information regarding the user may be transmitted from the backend system 30.
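The per-user record managed by the user management unit 223 can be sketched as a simple data structure. The field names below are assumptions chosen for illustration; the actual record would follow whatever schema the user information management server defines.

```python
from dataclasses import dataclass, field

@dataclass
class UserInfo:
    user_id: str
    user_name: str
    avatar_appearance: dict = field(default_factory=dict)  # image, character type, components
    avatar_status: str = "idle"
    owned_items: list = field(default_factory=list)
    points: int = 0
    # real-time position of the user avatar in the virtual space
    position: tuple = (0.0, 0.0, 0.0)

u = UserInfo(user_id="u-001", user_name="alice")
u.position = (1.0, 0.0, 2.5)  # updated continuously from client operation info
print(u.position)
```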
  • the storage unit 230 is realized by a ROM that stores programs, operation parameters, and the like used for processing of the control unit 220, and a RAM that temporarily stores parameters and the like that change as appropriate.
  • the storage unit 230 stores virtual space information.
  • the backend system 30 includes servers that provide various services. Each server may be provided by a different business operator. In addition, the service provided by each server can be appropriately customized and used by a business operator providing the virtual space.
  • Each server of the backend system 30 may be communicably connected to the virtual space management server 200 to transmit and receive information, or may communicate with the client terminal 10 to transmit and receive information.
  • the virtual space management server 200 can provide various services to the user who uses the virtual space in cooperation with each server.
  • Fig. 5 is a diagram illustrating a specific example of service servers included in the backend system 30 according to an embodiment of the present disclosure.
  • the service servers illustrated in Fig. 5 are an example, and the present disclosure is not limited thereto. Each service server will be described below.
  • a user information management server 301 manages information regarding a user who uses the virtual space.
  • the user information management server 301 stores account information regarding the user, a profile of the user, a user characteristics (age, gender, country, language used, history, liking/preference, viewing history of the content, participation history in virtual events, interaction history in the virtual space, and the like), item information possessed by the user, appearance information regarding the user avatar, status information regarding the user avatar, and the like in a data store (storage unit).
  • the user information management server 301 can also perform authentication of the user using the account information regarding the user.
  • the user information management server 301 performs user authentication in response to a request from the client terminal 10, and in a case where the authentication is successful, calls user information such as a profile of the user and appearance information regarding the user avatar from the data store to transmit the user information to the client terminal 10.
  • a hosting server 302 performs server multiplexing according to the number of terminals connected to the virtual space system 20, that is, server increase/decrease management. For example, in a case where the number of terminals connected to one virtual space management server 200 exceeds a predetermined number (for example, 100), the hosting server 302 constructs a new virtual space management server 200 that provides the same virtual space as the one virtual space management server 200 (the same virtual object and the same map information).
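The server increase/decrease management performed by the hosting server 302 can be sketched as follows. The limit of 100 connections comes from the example in the text; the class and method names are illustrative assumptions. When every existing server instance is full, a new instance providing the same virtual space is constructed.

```python
class HostingServer:
    LIMIT = 100  # predetermined number of connections per server (from the example)

    def __init__(self):
        self.servers: list[int] = [0]  # connection count per server instance

    def assign(self) -> int:
        """Return the index of a server with free space, scaling out if all are full."""
        for i, count in enumerate(self.servers):
            if count < self.LIMIT:
                self.servers[i] += 1
                return i
        # Every server reached the limit: construct a new one for the same space.
        self.servers.append(1)
        return len(self.servers) - 1

host = HostingServer()
for _ in range(100):
    host.assign()
print(len(host.servers))  # still 1: the first server just reached its limit
print(host.assign())      # the 101st connection triggers a new server -> index 1
```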
  • a matching server 303 determines to which virtual space management server 200 the client terminal 10 that has made the connection request after the user authentication is completed is connected. As described above, for example, in a case where the number of connections is limited for each virtual space management server 200, the matching server 303 acquires information regarding the current number of connections of each virtual space management server 200 from the virtual space system 20, and instructs the client terminal 10 to be communicably connected to the virtual space management server 200 having a free space. Furthermore, the matching server 303 may determine the virtual space management server 200 to be connected according to the user information in addition to the number of connections. For example, the matching server 303 may instruct users having similar user characteristics to communicably connect to the same virtual space management server 200.
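The matching logic above can be sketched in a few lines. The similarity measure used here (a shared language among connected users) is only one illustrative interpretation of "similar user characteristics"; the function name and data layout are likewise assumptions.

```python
def match_server(servers, user_lang, limit=100):
    """servers: list of dicts like {"connections": int, "langs": set[str]}.
    Returns the index of the chosen server, or None if every server is full."""
    candidates = [i for i, s in enumerate(servers) if s["connections"] < limit]
    if not candidates:
        return None
    # Prefer a server where the user's characteristic is already represented.
    for i in candidates:
        if user_lang in servers[i]["langs"]:
            return i
    # Otherwise fall back to any server with free space.
    return candidates[0]

servers = [
    {"connections": 100, "langs": {"en"}},  # full
    {"connections": 40, "langs": {"ja"}},
    {"connections": 10, "langs": {"en"}},
]
print(match_server(servers, "en"))  # 2: free slot and matching characteristic
print(match_server(servers, "fr"))  # 1: first server with a free slot
```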
  • a text chat server 304 is a server that provides a mechanism in which users using a virtual space have a conversation using text (character information) as a communication means. Specifically, the text chat server 304 creates a virtual chat room (text channel) and performs control so that character information can be exchanged between users who are in the chat room. The chat room is generated in response to a request from the virtual space management server 200 or the client terminal 10. Furthermore, the text chat server 304 returns session information (connection information) for connecting to the generated chat room to the virtual space management server 200 or the client terminal 10. The client terminal 10 is communicably connected to the text chat server 304 on the basis of the session information and executes text chat.
  • a voice chat server 305 is a server that provides a mechanism in which users using a virtual space have a conversation using a voice (audio information) as a communication means. Specifically, the voice chat server 305 creates a virtual chat room (voice channel) and performs control so that audio information can be exchanged between users who are in the chat room. The chat room is generated in response to a request from the virtual space management server 200 or the client terminal 10. In addition, the voice chat server 305 returns session information (connection information) for connecting to the generated chat room to the virtual space management server 200 or the client terminal 10. The client terminal 10 is communicably connected to the voice chat server 305 on the basis of the session information and executes voice chat.
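The chat-room lifecycle shared by the text chat server and the voice chat server (create a room on request, return session information, let clients join with that information) can be sketched as follows. The class names and the use of a random hex token as session information are illustrative assumptions.

```python
import uuid

class ChatServer:
    def __init__(self):
        self.rooms: dict[str, set[str]] = {}  # session id -> user ids in the room

    def create_room(self) -> str:
        """Create a chat room and return its session (connection) information."""
        session_id = uuid.uuid4().hex
        self.rooms[session_id] = set()
        return session_id

    def join(self, session_id: str, user_id: str) -> bool:
        # A client connects on the basis of the session information it received.
        if session_id not in self.rooms:
            return False
        self.rooms[session_id].add(user_id)
        return True

chat = ChatServer()
session = chat.create_room()
print(chat.join(session, "alice"))  # True: valid session information
print(chat.join("unknown", "bob"))  # False: no such room
```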
  • the communication means between the users using the virtual space is not limited to the above-described character information and audio information; for example, a moving image (video/streaming video) may also be used.
  • a server that provides a mechanism for performing conversation using a moving image may be included in the backend system 30.
  • a content distribution server 306 controls distribution of the content viewed in the virtual space.
  • the content distribution server 306 can acquire and distribute the content from various content servers (servers that store content).
  • a virtual space is reproduced (virtual objects are disposed on the basis of map information) by an application (also referred to as a client application) operating in each client terminal 10, and a virtual space video from a user viewpoint is generated.
  • the content distribution server 306 is communicably connected to the client terminal 10 to transmit the content incorporated in the virtual space reproduced by the client terminal 10 to the client terminal 10.
  • the content incorporated in the virtual space is, for example, a moving image displayed on a large display (virtual object) disposed in the virtual space.
  • the content incorporated in the virtual space is, for example, a 3D video displayed in a predetermined field in the virtual space.
  • a talking table server 307 manages a talking room (communication room) associated with a talking table (virtual object) disposed at any place in the virtual space.
  • each user can have a conversation using one or more different types of communication means (for example, text chat and voice chat).
  • the generation of the talking room is performed in response to a request from the virtual space management server 200 or the client terminal 10.
  • the talking table server 307 returns session information (connection information) for connecting to the generated talking room to the virtual space management server 200 or the client terminal 10. Details of the talking room will be described below with reference to Figs. 12 to 19.
  • the talking table is an example of a virtual object disposed at a communication point set at any place in the virtual space as a mark of a place where a plurality of users gathers for an interaction in the virtual space.
  • Any virtual object serving as a mark may be disposed at the communication point, and may be, for example, a chair, a pole, a parasol, a stage, or the like in addition to the table.
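One plausible way to associate a talking room with a communication point is to tie the room's session information to the marker object's position and offer the room to users within a given radius. The radius check and every name below are illustrative assumptions, not part of the disclosure.

```python
import math

class CommunicationPoint:
    def __init__(self, position, session_id, radius=3.0):
        self.position = position      # (x, y) of the marker object (table, chair, pole, ...)
        self.session_id = session_id  # talking-room session information
        self.radius = radius          # assumed interaction radius

    def offers_room_to(self, user_position) -> bool:
        # Offer the talking room only to avatars near the marker object.
        dx = user_position[0] - self.position[0]
        dy = user_position[1] - self.position[1]
        return math.hypot(dx, dy) <= self.radius

point = CommunicationPoint((10.0, 10.0), "room-42")
print(point.offers_room_to((11.0, 11.0)))  # True: within 3.0 of the table
print(point.offers_room_to((20.0, 10.0)))  # False: too far away
```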
  • an event coordination server 308 controls performance in the virtual space according to development of an event taking place in the real space.
  • the performance in the virtual space may include a virtual event.
  • the event coordination server 308 can coordinate the performance in the virtual space with the event in the real space by executing the corresponding performance in real time in the virtual space according to the development of the game performed in the real space. More specifically, the event coordination server 308 instructs the client terminal 10 to execute performance.
  • the user can enjoy an experience of enjoying excitement together in the virtual space with other users viewing the same game while viewing the game (an example of an event in the real space) performed in the real space, for example.
  • the user may view the game in the real space on, for example, a television, the Internet, a radio, or the like, or may view the game in the virtual space. Details of event coordination are described below with reference to Figs. 20 to 26.
  • a trading management server 309 realizes the trade of items, products, tickets, and the like in the virtual space, and stores data regarding the trade. For trade in the virtual space, for example, virtual currency can be used.
  • the trading management server 309 may be a server that provides a charging system.
  • a data analysis server 310 analyzes various pieces of data such as the behavior of each user avatar in the virtual space, the user characteristics of the user avatar performing a specific action, and the like. The analysis result may be presented to a business operator or the like that provides the virtual space.
  • Fig. 6 is a block diagram showing an example of a configuration of the client terminal 10 according to an embodiment of the present disclosure. As illustrated in Fig. 6, the client terminal 10 includes a communication unit 110, a control unit 120, an operation input unit 130, a sensor 140, a display unit 150, an audio output unit 160, and a storage unit 170.
  • the communication unit 110 is communicably connected to the virtual space management server 200 in a wired or wireless manner to transmit and receive data.
  • the communication unit 110 can perform communication using, for example, a wired/wireless LAN, Wi-Fi (registered trademark), Bluetooth (registered trademark), infrared communication, a mobile communication network (fourth-generation mobile communication system (4G), fifth-generation mobile communication system (5G)), or the like.
  • Control unit 120 functions as an arithmetic processing device and a control device, and controls an overall operation in the client terminal 10 in accordance with various programs.
  • the control unit 120 is realized by, for example, an electronic circuit such as a CPU or a microprocessor.
  • the control unit 120 may include a ROM that stores programs to be used, operation parameters, and the like, and a RAM that temporarily stores parameters and the like that change as appropriate.
  • the control unit 120 also functions as a virtual space processing unit 121 and a display control unit 122.
  • the virtual space processing unit 121 performs various processes for providing the user with the virtual space experience in appropriate cooperation with the virtual space management server 200 and the backend system 30. Such processing may be performed by a client application downloaded in advance by the client terminal 10.
  • the virtual space processing unit 121 performs, for example, user registration processing and login processing for using the virtual space. In the user registration processing and the login processing, for example, data is appropriately transmitted and received to and from the user information management server 301 via the communication unit 110.
  • the virtual space processing unit 121 generates the virtual space on the basis of the virtual space information received from the virtual space management server 200. Specifically, the virtual space processing unit 121 arranges the virtual objects on the basis of the map information received from the virtual space management server 200 and reproduces the virtual space.
  • the virtual space processing unit 121 may incorporate various pieces of information acquired from the backend system 30 into the virtual space.
  • the virtual space processing unit 121 may display the video distributed from the content distribution server 306 on a large display disposed in the virtual space.
  • the virtual space processing unit 121 controls a position, a facial expression of the face, a gesture (emotional expression by the hand or the entire body), and the motion state (sitting, standing, jumping, running, etc.) of the user avatar disposed in the virtual space according to the user operation.
  • the virtual space processing unit 121 continuously transmits the user operation information to the virtual space management server 200 in real time.
  • the virtual space processing unit 121 reflects the information regarding another user avatar continuously transmitted from the virtual space management server 200 on the corresponding user avatar disposed in the virtual space in real time.
  • the virtual space processing unit 121 may realize the audio conversation between the users by transmitting the audio of the user acquired by the sensor 140 to the virtual space management server 200 and outputting the audio of another user received from the virtual space management server 200 from the audio output unit 160.
  • the virtual space processing unit 121 generates a video (display screen) from the user viewpoint in the virtual space. Furthermore, the virtual space processing unit 121 may generate a display screen in which a user avatar operation button, a setting screen, a notification screen, a menu screen, and the like are superimposed on the video from the user viewpoint. Furthermore, the virtual space processing unit 121 may display various pieces of information acquired from the backend system 30 on the display screen. For example, the virtual space processing unit 121 may display a screen of a text chat between users performed via the text chat server 304 on the basis of the information received from the text chat server 304. The text chat screen may be updated in real time.
  • the virtual space processing unit 121 may realize an audio conversation between the users via the voice chat server 305 by transmitting the audio of the user acquired by the sensor 140 to the voice chat server 305 and outputting the audio of another user received from the voice chat server 305 from the audio output unit 160.
  • the video from the user viewpoint in the virtual space may be generated and transmitted (streamed) by the virtual space management server 200, and the virtual space processing unit 121 may display the video received from the virtual space management server 200 on the display unit 150, so that the user may be provided with the virtual space experience.
  • the display control unit 122 performs control to display an image on the display unit 150.
  • the display control unit 122 performs control to display the display screen generated by the virtual space processing unit 121 on the display unit 150.
  • the operation input unit 130 receives an operation instruction by the user to output the content of the operation to the control unit 120.
  • the operation input unit 130 may be, for example, a touch sensor, a pressure sensor, or a proximity sensor.
  • the operation input unit 130 may have a physical configuration such as a button, a switch, and a lever. The user can operate the user avatar in the virtual space using the operation input unit 130.
  • the sensor 140 has a function of detecting (acquiring) various types of information regarding the user or around the user.
  • the sensor 140 shown in Fig. 6 may include a number of sensors.
  • the sensor 140 may be a microphone that collects sound.
  • the sensor 140 may be a camera that images the user or the periphery of the user.
  • the sensor 140 may be a positional information measurement unit that measures the position (absolute position or relative position) of the user.
  • the sensor 140 may be various sensors (camera, acceleration sensor, angular velocity sensor, geomagnetic sensor, infrared sensor, depth sensor, and biometric sensor) that detect a facial expression, an emotion, a line-of-sight direction, a posture, a limb movement, a head direction, biometric information, and the like of the user. Furthermore, the sensor 140 may include a sensor capable of detecting a total of nine axes including a three-axis gyro sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor.
  • the information detected by the sensor 140 may be used as information regarding a user operation in the virtual space processing unit 121.
  • the virtual space processing unit 121 may control the user avatar in the virtual space according to the information detected by the sensor 140.
  • the display unit 150 has a function of displaying an image under the control of the display control unit 122.
  • the display unit 150 may be a display panel such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display.
  • the display unit 150 may be realized by an HMD that covers the entire field of view of the user.
  • the display unit 150 displays the left-eye image and the right-eye image on the left and right screens respectively fixed to the left and right eyes of the user.
  • the screen of the display unit 150 includes, for example, a display panel such as a liquid crystal display (LCD) or an organic EL display, or a laser scanning display such as a retina direct drawing display.
  • the display unit 150 may include an imaging optical system that enlarges and projects the display screen to form an enlarged virtual image having a predetermined view angle on the user's pupil.
  • the audio output unit 160 outputs audio information under the control of the control unit 120.
  • the audio output unit 160 may be configured as a headphone worn on the head of the user, or may be realized by an earphone or a bone conduction speaker.
  • the storage unit 170 is realized by a ROM that stores programs, operation parameters, and the like used for processing of the control unit 120, and a RAM that temporarily stores parameters and the like that change as appropriate.
  • the storage unit 170 stores, for example, a client application that executes various processes related to the virtual space.
  • the storage unit 170 may store user information such as a user profile and user avatar information, and virtual space information received from the virtual space management server 200.
  • Although the configuration of the client terminal 10 has been specifically described above, the configuration of the client terminal 10 according to the present disclosure is not limited to the example illustrated in Fig. 6.
  • the client terminal 10 may be realized by a plurality of devices.
  • Each configuration included in the client terminal 10 is not limited to being integrally provided in one housing, and they may be communicably connected by wire or wirelessly.
  • the client terminal 10 may be realized by a system configuration including an output device (corresponding to at least the display unit 150 and the audio output unit 160) realized by an HMD or the like and a processing device (corresponding to at least the control unit 120) realized by a smartphone, a tablet terminal, a PC, or the like.
  • the client terminal 10 may be a non-wearable device such as a smartphone, a tablet terminal, or a PC.
  • the client terminal 10 may use, as the information regarding the user operation, information received from an external device such as a controller held by the user, a sensor worn by the user, or a sensor disposed around (on the environment of) the user.
  • the client terminal 10 may be communicably connected to an external display device such as a projector, a TV device, or a display, and display the video of the virtual space by the display control unit 122.
  • Fig. 7 is a sequence diagram illustrating an example of a flow of connection processing to the virtual space according to an embodiment of the present disclosure.
  • the client terminal 10 activates a client application (step S103). Operation processing of the client terminal 10 described below is executed by a client application.
  • the function of the client application includes the function of the virtual space processing unit 121 described above.
  • the client terminal 10 requests the user information management server 301 for user authentication (step S106). Specifically, the client terminal 10 transmits the user ID, the authentication information, and the like to the user information management server 301.
  • the user information management server 301 performs user authentication in response to a request from the client terminal 10 (step S109). In a case where the user authentication is successful, the operation proceeds to the next process. In a case where the user authentication fails, error processing is performed.
  • the user information management server 301 transmits the user information stored in the data store to the client terminal 10 (step S112).
  • the user information includes a profile of the user, information regarding the user avatar (appearance information, status information), information regarding an item owned by the user, and the like.
  • the client terminal 10 sets the received user information in the client application (step S115).
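The authentication sequence (steps S106 to S115) can be condensed into a sketch: the client sends a user ID and credential, the server verifies them against its data store and, on success, returns the stored user information; on failure, error processing runs instead. All data, field names, and the credential format below are illustrative assumptions.

```python
# Assumed in-memory stand-in for the user information management server's data store.
USER_STORE = {
    "u-001": {"secret": "s3cret", "profile": {"name": "alice", "avatar": "cat"}},
}

def authenticate(user_id: str, secret: str):
    """Return the stored user information on success, or None on failure."""
    record = USER_STORE.get(user_id)
    if record is None or record["secret"] != secret:
        return None  # the caller would perform error processing here
    # Authentication succeeded: call the user information from the data store.
    return record["profile"]

print(authenticate("u-001", "s3cret"))  # {'name': 'alice', 'avatar': 'cat'}
print(authenticate("u-001", "wrong"))   # None
```

The returned profile is what the client application would then set locally (step S115), so the same user environment is reproduced on any terminal.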
  • the user environment can be obtained regardless of which client terminal 10 the user uses.
  • the latest user environment can be reflected in the use terminal regardless of whether the user accesses from the smartphone or the home PC.
  • the client terminal 10 inquires of the matching server 303 about the connection destination server (step S118).
  • the matching server 303 performs a matching process between the client terminal 10 and the connection destination server (virtual space management server 200) (step S121). Specifically, the matching server 303 determines the virtual space management server 200 having a free space as the connection destination server according to the current number of terminals connected to each virtual space management server 200. In addition, the matching server 303 may determine the virtual space management server 200 to which the client terminal 10 corresponding to the user ID has been previously connected as the connection destination server according to the user ID.
  • the matching server 303 transmits data (for example, an internet protocol (IP) address) indicating the connection destination server to the client terminal 10 (step S124).
  • the client terminal 10 makes a connection request to the matched virtual space management server 200 (step S127), establishes communication with the virtual space management server 200, and starts bidirectional communication (step S130).
  • the client terminal 10 can receive the virtual space information (map information, virtual objects, information regarding other user avatars, and the like) from the virtual space management server 200 and reproduce the virtual space locally.
  • the reproduction of the virtual space can be performed by disposing the virtual object on the basis of the map information.
  • the client terminal 10 transmits user information, user operation information, and the like to the virtual space management server 200.
  • the establishment of communication with the virtual space management server 200 may be a login to the virtual space.
  • the client terminal 10 can store data of the connection destination server and log in to the same virtual space again, for example, even when temporarily logging off from the virtual space.
  • the user avatar is disposed at a predetermined position in the virtual space (for example, the start area 410 illustrated in Fig. 10).
  • the virtual space processing unit 121 of the client terminal 10 generates a video including the user avatar in the view angle as the video from the user viewpoint, and displays the video on the display unit 150.
  • the virtual space processing unit 121 may include, in the display screen, a setting screen of the user avatar, various operation buttons for operating the user avatar, a display button of a text chat being performed in the vicinity, a follow and follower display button, a notification display button to the user, a map display button, an owned item display button, and the like.
  • Fig. 8 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure. As illustrated in Fig. 8, a video 511 of the virtual space and an operation screen 512 are displayed on the display screen 510. An image of the user avatar, buttons for editing the user avatar and the user profile, an area-specific menu display button 513, and an area common menu button 514 are displayed on the operation screen 512.
  • the area-specific menu display button 513 is a button for displaying a menu specific to an area (virtual space) where the user avatar currently exists.
  • For example, an area specialized for specific content is assumed, and an experience menu related to the corresponding content is assumed as the area-specific menu.
  • the content here may be an intellectual property (IP) content.
  • an area of a specific soccer team is assumed as an example of such an area.
  • an interaction between fans of a specific soccer team is performed.
  • a logo of the soccer team, a goods store, a virtual stadium in which a stadium existing in the real space is reproduced, and the like are disposed. The user can also move to an area specialized for another content.
  • the client terminal 10 may newly acquire data of the connection destination server (which provides an area (virtual space) specialized for another IP content) from the matching server 303. Further, one virtual space management server 200 may provide a plurality of different IP content areas. In this case, the client terminal 10 can move the area without switching the connection destination server.
  • the area common menu button 514 is a menu button that can be used in any IP content area.
  • the avatar matching button is a button for matching with another user avatar in the area. Another user avatar with similar user characteristics may be presented as a recommendation for the interaction partner.
  • also displayed are a setting button for an emote, which is an action performed by the avatar to further enrich its emotional expression, and a setting button for a stamp, which is a small image temporarily displayed above the head of the avatar to express the avatar's feeling.
  • Fig. 9 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure.
  • a display screen 520 displays a video of the virtual space and various operation buttons.
  • a user avatar U1 is in front of the stage where a large display 529 is disposed.
  • a real space relay video is displayed on the large display 529.
  • the content distribution server 306 acquires the relay video from an external server (relay video distribution server), and distributes the relay video to each client terminal 10 in real time.
  • a controller 521 for controlling the movement of the user avatar in the virtual space (area) is displayed at the left end of the display screen 520, and a jump icon 522 is displayed at the right end of the display screen 520.
  • when the user slides the controller 521, the virtual space processing unit 121 causes the user avatar to walk forward, backward, left, or right in accordance with the sliding direction. Note that, in a case where the slide amount or the slide speed is equal to or more than a setting value, the virtual space processing unit 121 causes the user avatar to run forward, backward, left, or right.
  • when the jump icon 522 is tapped, the user avatar jumps.
  • Such user operation information regarding the motion such as the movement or the jump of the user avatar is transmitted to the client terminal 10 of each of the other users via the virtual space management server 200 in real time, and is reflected in the user avatar disposed in the virtual space reproduced in the client terminal 10 of each of the other users. As a result, the motion of the user avatar is shared with the other users.
  • the user can move the user's line-of-sight in the virtual space by dragging any place (other than an operation button) on the display screen 520 up, down, left, or right with one finger or the like. Furthermore, the user can reduce or enlarge the video of the virtual space by pinching in or pinching out at any place (other than an operation button) on the display screen 520 with two fingers.
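The touch operations above can be sketched as a simple event dispatcher. This is a minimal illustration only, assuming a hypothetical `avatar_action` helper and an assumed threshold value; the patent does not specify the implementation.

```python
RUN_THRESHOLD = 0.6  # assumed setting value for the slide amount or speed

def avatar_action(event: dict) -> str:
    """Map a touch event to an avatar action, mirroring the described UI.

    - controller slide below the threshold -> walk, at or above -> run
    - tap on the jump icon -> jump
    - one-finger drag -> move the line of sight
    - two-finger pinch -> reduce or enlarge the virtual-space video
    """
    if event["type"] == "slide":
        return "run" if event["amount"] >= RUN_THRESHOLD else "walk"
    if event["type"] == "tap" and event.get("target") == "jump_icon":
        return "jump"
    if event["type"] == "drag" and event.get("fingers") == 1:
        return "look"
    if event["type"] == "pinch" and event.get("fingers") == 2:
        return "zoom"
    return "none"
```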
  • a microphone icon 523 is a button for switching input ON and OFF of the audio of the user according to a tap operation.
  • the audio of the user is transmitted to another user via the virtual space management server 200 or the voice chat server 305.
  • the voice chat may be performed with another user avatar located around the user avatar, may be performed with a specific another user avatar permitted by the user, or may be performed between participating users of the voice chat group.
  • a text input icon 524 is a button for accepting input of a text chat.
  • the input text is displayed as a balloon image T1 above the head of the user avatar U1 as illustrated in Fig. 9, for example.
  • a balloon image T2 is also displayed above the head of another user avatar U2 near the user avatar U1.
  • a 3D representation may be used to display the balloon images of other user avatars U2 and U3 located at both sides of the user avatar U1.
  • the virtual space processing unit 121 may blank the balloon images of other user avatars U4 and U5 located at positions distant from the user avatar U1 so as not to be visually recognized by the user.
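The distance-dependent balloon rendering described above can be sketched as follows. The distance thresholds and the function name are assumptions for illustration; the patent only states that nearby balloons may use a 3D representation and distant ones may be blanked.

```python
def balloon_style(distance: float, near: float = 5.0, far: float = 15.0) -> str:
    """Choose how to render another avatar's balloon by its distance.

    Within `near`: show the balloon normally, facing the viewer.
    Between `near` and `far` (avatars at the viewer's sides): use a 3D
    representation so the balloon stays legible at an angle.
    Beyond `far`: blank the balloon so it is not visually recognized.
    """
    if distance <= near:
        return "flat"
    if distance <= far:
        return "3d"
    return "blank"
```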
  • the text chat is not limited to a method via a balloon image displayed above the head of the user avatar.
  • the virtual space processing unit 121 may display the surrounding chat window in response to a tap operation of a surrounding chat window display switching button 527.
  • the surrounding chat window is a screen that displays a text chat between user avatars around the user avatar (for example, in a section where the user avatar is located). As a result, the user can perform text chat with another user in the vicinity.
  • the text chat can be performed via the virtual space management server 200 or the text chat server 304.
  • a stamp icon 525 is a button for displaying a stamp menu screen. The user can select any stamp from the stamp menu screen displayed by tapping the stamp icon 525 and display the stamp above the head of the user avatar U1.
  • An emote icon 526 is a button for displaying an emote menu screen. The user can select any emote from the emote menu screen displayed by tapping the emote icon 526, and can operate a facial expression, a pose, a gesture, and the like of the user avatar U1.
  • Examples of the virtual space according to an embodiment of the present disclosure include an IP content area, that is, an area specialized for the IP content.
  • the area may be an area with an interaction between fans of a specific soccer team as a concept.
  • a logo of the soccer team, a goods store, a virtual stadium reproducing a stadium existing in the real space, and the like are disposed at various places in the area.
  • Fig. 10 is a diagram illustrating an example of a simple map of an IP content specific area according to an embodiment of the present disclosure.
  • As shown in Fig. 10, an area 400 includes a start area 410 in which a user avatar is first placed when entering the area, a goods store 420, a stadium 430, and a bar 440.
  • the area 400 may be a mirror world in which respective virtual facilities (the goods store 420, the stadium 430, and the like) to be disposed are similar to those of the real space. The user can enjoy an atmosphere around the actual stadium by moving in the area 400 with the user avatar.
  • a game of a soccer team is played back as an example of the content distribution in the virtual space.
  • the game video may be displayed on a large display disposed in the stadium 430 (streaming distribution of 2D moving images), or a soccer player and a ball may be reproduced by a 3DCG, and each soccer player and the ball may be moved on the basis of motion data in an actual game to reproduce the game.
  • Fig. 11 is a diagram for describing a mechanism of game reproduction using bone data of a player according to an embodiment of the present disclosure.
  • the specification information regarding the distribution content is transmitted from a virtual space management server 200a to each client terminal 10.
  • the distribution content specification information is information indicating the game content reproduced in the stadium 430.
  • Each client terminal 10 requests the content distribution server 306 of the backend system 30 to distribute the game content on the basis of the specification information received from the virtual space management server 200a.
  • the content distribution server 306 requests a game data server 50, which is an external device, to distribute the game content.
  • the game data server 50 is a device that stores tracking data of each player and the ball collected by a tracking system 51 in a game performed in the real space.
  • the tracking data of each player is, for example, bone data.
  • the game data server 50 may store the 3DCG of each player.
  • the game data server 50 generates game content data after adjusting the format of the tracking data acquired from the tracking system 51 and appropriately deleting unnecessary data.
  • the game data server 50 transmits the game content data to the content distribution server 306 in response to a request from the content distribution server 306.
  • the content distribution server 306 transmits the game content data to each client terminal 10.
  • the game content data includes a 3DCG (also referred to as a player object) of each player, bone data of each player in the game, a 3DCG of the ball, tracking data of the ball in the game, and the like.
  • the client terminal 10 moves the player object in the stadium 430 on the basis of the bone data, and can more realistically reproduce the game in 3D.
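Moving a player object from recorded tracking data amounts to sampling (and interpolating) the bone data at the current playback time. The sketch below is an assumed illustration of that step; the data layout and function name are hypothetical, not the patent's format.

```python
def pose_at(bone_frames: list, t: float) -> dict:
    """Return the interpolated bone pose at playback time t (seconds).

    bone_frames: list of (timestamp, {bone_name: (x, y, z)}) samples
    collected by the tracking system, sorted by timestamp. The client
    applies the returned pose to the 3DCG player object each frame.
    """
    # clamp to the recorded range
    if t <= bone_frames[0][0]:
        return bone_frames[0][1]
    if t >= bone_frames[-1][0]:
        return bone_frames[-1][1]
    # linearly interpolate between the two surrounding samples
    for (t0, p0), (t1, p1) in zip(bone_frames, bone_frames[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return {b: tuple(p0[b][i] + a * (p1[b][i] - p0[b][i])
                             for i in range(3))
                    for b in p0}
```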
  • the user can operate the user avatar to enter the stadium 430 and enjoy the game in the stadium 430. Furthermore, the user can move the user avatar into the field (here, the soccer pitch) and watch the movement of each player from various angles nearby.
  • the content distribution server 306 can also synchronize the timing of the game content data reproduced in each client terminal 10 (the playback time of the game reproduced in the stadium 430).
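One simple way to realize the synchronization above is to derive every client's playback position from a shared clock and a common start time, rather than from each terminal's own timeline. The function below is a hedged sketch of that idea; the names and the clamping behavior are assumptions.

```python
def synced_position(server_clock: float, start_at: float, duration: float) -> float:
    """Compute the playback position every client should show.

    server_clock and start_at are times on the clock shared via the
    content distribution server; duration is the content length. Each
    client showing the same position keeps the reproduced game in sync.
    """
    elapsed = server_clock - start_at
    if elapsed < 0:
        return 0.0                   # content has not started yet
    return min(elapsed, duration)    # clamp at the end of the content
```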
  • a UI when reproducing a game in the virtual space using the tracking data of the game performed in the real space will be described later with reference to Figs. 27 to 32.
  • a communication point associated with a communication room in which communication between users is performed is set.
  • the communication point may be set by the virtual space management server 200 or by the user.
  • in the communication room, communication by one or more different communication means can be performed.
  • the one or more different communication means are, for example, communication means using information such as an audio, a character, or an image.
  • the communication room may be provided by the backend system 30 (for example, the talking table server 307).
  • the user can enter the communication room (participate in communication) by moving the user avatar to the communication point where the talking table as the virtual object is displayed.
  • the display screen transitions to a screen of the communication room.
  • communication performed in the communication room can be performed by corresponding communication servers (for example, the text chat server 304 and the voice chat server 305).
  • a virtual object serving as a mark may be disposed at the communication point.
  • a table is used as an example of a virtual object serving as a mark. Such a table is also referred to as a talking table in the present specification.
  • the talking table is disposed at any place (communication point setting location) in the virtual space by an input operation to the client terminal 10 operated by the user or by the virtual space management server 200. As an example, it may be disposed in the bar 440 described above with reference to Fig. 10.
  • Fig. 12 is a diagram illustrating a display screen inside a bar at which a talking table 600 according to an embodiment of the present disclosure is disposed.
  • a display screen 530 illustrated in Fig. 12 is an example of a video from the user's line-of-sight when the user avatar moves to the inside of the bar 440.
  • the bar 440 is provided as a place where users (here, fans of a soccer team) interact with each other.
  • a plurality of talking tables 600 is provided inside the bar 440.
  • a talking room is associated with each of the talking tables 600.
  • the talking room is provided by the talking table server 307.
  • the shape of the talking table 600 is an example.
  • information regarding selection of entry to the talking room associated with the talking table 600 is transmitted from the virtual space management server 200 to the client terminal 10.
  • the user can cause the user avatar to enter the talking room (communication room), in other words, participate in the talking table, by performing the operation of the room entry request on the client terminal 10.
  • the talking table 600 is displayed at a communication point associated with a communication room in which communication between users is performed.
  • the communication point is set in a specific region on the virtual space, and in a case where the avatar is included in the communication point, information regarding the room entry selection is transmitted.
  • Fig. 13 is a diagram for explaining an example of the shape of the talking table 600.
  • the talking table 600 includes a table 601 and a display 602.
  • the display 602 is an example of a display object that displays “information regarding an event taking place in a real space or a virtual space” that is a subject of conversation in a talking room (in a communication room) associated with the talking table 600.
  • the subject of conversation in the talking room may be preset in the virtual space management server 200, or may be a current topic in the talking room or a video currently viewed in the talking room (for example, a topic window 542 illustrated in Fig. 20).
  • the display content of the display 602 is information regarding the subject of conversation in the talking room, and may appropriately change according to a change in the subject of conversation in the talking room.
  • the display 602 and the table 601 are disposed in the virtual space as virtual objects.
  • the 2D moving image displayed on the display 602 may be a moving image streamed from the content distribution server 306.
  • the specification of the 2D moving image displayed on the display 602 can be performed by the virtual space management server 200.
  • the virtual space management server 200 may specify the 2D moving image to be displayed on the display 602 according to the IP content associated with the area. The same 2D moving image may be played back at the plurality of talking tables 600 disposed inside the bar 440.
  • the talking table 600 may further include a display 603 that displays information regarding the talking table 600.
  • the display 603 displays the identification number of the talking table 600 and the number of people currently entering the talking room. By visually recognizing the display 603, the user can check how many people can enter the talking room and how many people are currently in the talking room.
  • the “participation as a speaker” is participation in a state in which speech (for example, voice chat or text chat) can be made in the talking room.
  • the participation as the audience is participation in a state in which speech in the talking room is not allowed but viewing in the talking room is allowed.
  • the virtual space management server 200 transmits, to the client terminal 10, information regarding room entry selection for the user to select whether or not to enter the talking room associated with the talking table 600.
  • the information regarding the room entry selection is information indicating that the user can enter (participate in) the room as either a speaker or an audience.
  • when the user operates the room entry (participation) request in the client terminal 10, the user can select whether to participate as a speaker or as an audience.
  • the virtual space processing unit 121 displays a selection screen for allowing the user to select whether to enter the talking room as a speaker or as an audience.
  • the participation/non-participation as the audience may be set at the time of creating the talking room.
  • each session information (connection information) corresponding to the talking table 600 may be included in advance in the information regarding the room entry selection.
  • the client terminal 10 issues a room entry request to the virtual space management server 200 in response to an operation of room entry selection by the user, and can establish communication connection with the talking table server 307, the text chat server 304, the voice chat server 305, and the like on the basis of each session information.
  • the user may not know which of the talking tables 600 the user can participate in to perform desirable communication. Therefore, for example, the following method can be considered.
  • Fig. 14 is a diagram for describing a case where the state in the talking room is disclosed according to the distance to the talking table 600 according to an embodiment of the present disclosure.
  • in a case where the virtual space management server 200 determines that the user avatar is located within an area E1 (second area) within a first distance (within the first distance centered on the communication point) from the position of the talking table 600, the virtual space management server 200 transmits information regarding communication of the talking table 600 to the client terminal 10.
  • as the information regarding communication of the talking table 600, there is, for example, audio information in the communication room associated with the talking table 600.
  • the user can grasp the amount of conversation, atmosphere, and the like in the talking room without entering the talking room.
  • the users of the user avatars U4 and U5 are in a state of hearing a conversation in the talking room.
  • the virtual space management server 200 may instruct the client terminal 10 to increase the volume so that the conversation in the talking room can be heard as the user avatar approaches the center (the talking table 600).
  • in a case where the virtual space management server 200 determines that the user avatar is located in an area E2 (first area) within a second distance shorter than the first distance from the position of the talking table 600, the virtual space management server 200 transmits, to the client terminal 10, information regarding selection of entry to the talking room associated with the talking table 600.
  • the user can perform an operation of the room entry request on the client terminal 10.
  • the user can select whether to participate in (enter) the talking room as either a speaker or an audience (this can also be said to be a user input for the information regarding room entry selection).
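The two-zone disclosure described above can be sketched as a single decision based on the avatar-to-table distance. The distances, the linear volume curve, and the function name are illustrative assumptions; the patent specifies only that audio is disclosed in area E1, gets louder toward the center, and that the room-entry selection is sent in area E2.

```python
def disclosure(distance: float, d1: float = 10.0, d2: float = 3.0) -> dict:
    """Decide what the server sends for a given avatar-to-table distance.

    Area E1 (within d1): stream room audio, louder as the avatar
    approaches the communication point.
    Area E2 (within d2, inside E1): additionally send the room-entry
    selection so the user can choose to join as a speaker or an audience.
    """
    info = {"audio": False, "volume": 0.0, "entry_selection": False}
    if distance <= d1:
        info["audio"] = True
        # volume rises linearly toward the talking table (assumed curve)
        info["volume"] = round(1.0 - distance / d1, 3)
    if distance <= d2:
        info["entry_selection"] = True
    return info
```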
  • the virtual space management server 200 may transmit, to the client terminal 10, a conversation (for example, voice chat information and text chat information) in the talking room, a video (for example, a state of the user avatar illustrated in Fig. 20) in the talking room, a video shared (viewed) in the talking room, or an agenda in the talking room.
  • the agenda may be extracted from an analysis of a voice chat or a text chat in a talking room. Such extraction of the agenda may be performed by the backend system 30.
  • a category of the topic window 542 picked up in a talking room to be described later may be extracted as the agenda.
  • the conversation in the talking room, the video in the talking room (for example, the state of the user avatar illustrated in Fig. 20), the video shared (viewed) in the talking room, or the agenda in the talking room may be transmitted to the client terminal 10 corresponding to the user avatar located in the area E1.
  • the user avatar U1 and the user avatar U2 enter the talking room as speakers, and the user avatar U3 enters the talking room as an audience.
  • the user avatar U4 and the user avatar U5 located in the area E1 can recognize the topic of the conversation being held in the talking room by visually recognizing the display content of the display 602.
  • information regarding communication in the talking room may be transmitted from the virtual space management server 200 to the client terminal 10 of each of users who operates the user avatar U4 and the user avatar U5.
  • a selection screen as to whether or not to enter the talking room is presented.
  • the virtual space management server 200 arranges, in the virtual space, the virtual object (the talking table 600) including the display object (the display 602) that displays “information regarding the event taking place in the real space or the virtual space” that is the subject of conversation in the talking room (in the communication room).
  • the user who operates the user avatar existing around the virtual object can recognize the topic of the conversation being held in the talking room by visually recognizing the display content of the display object.
  • Fig. 15 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar. In a case where the position in the virtual space of the user avatar operated by the user is included in the area E2 (first area) of a communication point associated with a communication room in which communication between the users is performed, the virtual space management server 200 transmits information regarding selection of entry to the communication room to the terminal (client terminal 10) used by the user to display a selection screen SC on the display screen.
  • Fig. 16 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar. In a case where the position of the user avatar is included in the area E1 (second area) located outside the area E2 (first area), the virtual space management server 200 transmits information regarding communication to the terminal (client terminal 10) used by the user who operates the user avatar.
  • audio information AI in the communication room associated with the talking table 600 may be output by the client terminal 10.
  • the user can determine the relationship with respect to the talking room by the avatar operation, such as grasping the topic in the talking room by visually recognizing the display object (display 602), checking the audio in the talking room according to the distance between the user avatar operated by the user and the virtual object (the talking table 600), or participating in the talking room by bringing the user avatar closer to the virtual object by a certain amount or more.
  • the virtual space processing unit 121 may change the disclosure level of the state in the talking room according to the liking/preference information regarding the user. For example, in a case where the conversation (agenda) in the talking room matches the user's liking/preference, the virtual space processing unit 121 may play back the conversation in the talking room with a large sound or display a more detailed agenda on a tab display 605. On the other hand, in a case where the conversation (agenda) in the talking room does not match the user's liking/preference, the virtual space processing unit 121 may play back the conversation in the talking room with a small sound or may not play back the conversation, or may not display the agenda on the tab display 605.
  • the virtual space processing unit 121 may highlight the talking table 600 of the agenda matching the user's liking/preference. Furthermore, the virtual space processing unit 121 may highlight the talking table 600 of the agenda that the user has mentioned in the conversation so far.
  • the virtual space processing unit 121 may highlight the talking table 600 in which another user estimated to have liking/preference matching with the user participates.
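The preference-dependent disclosure described in the preceding points can be sketched as a lookup from a tag match. The tag representation and the concrete volume values are assumptions for illustration; the patent states only that matching rooms may be played louder, shown with a more detailed agenda, and highlighted.

```python
def disclosure_level(user_tags: set, agenda_tags: set) -> dict:
    """Adjust how much of a talking room is disclosed to a passing user.

    If the room's agenda matches the user's liking/preference tags, the
    conversation is played louder, a detailed agenda is shown on the tab
    display, and the talking table is highlighted; otherwise the room
    stays quiet and unhighlighted.
    """
    match = bool(user_tags & agenda_tags)
    return {
        "volume": 1.0 if match else 0.2,   # assumed loud/quiet levels
        "detailed_agenda": match,
        "highlight_table": match,
    }
```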
  • the virtual space processing unit 121 may present an effect below the table according to the excitement in the talking room.
  • Fig. 17 is a diagram illustrating an example of effect display of the talking table 600 according to the excitement in the talking room according to an embodiment of the present disclosure. As illustrated in Fig. 17, the virtual space processing unit 121 may present a light effect below the table 601 included in the talking table 600. The virtual space processing unit 121 may control the color, blinking, density, and the like of the effect according to the excitement in the talking room. As a result, the user can grasp the excitement in the talking room.
  • the virtual space processing unit 121 may change the shape of the talking table 600 according to the number of participants in the talking table 600 (the number of people entering the talking room). For example, the table may be changed to a larger table as the number of participants increases.
  • the host of the talking room may not be particularly set, and the user who enters the room first may be used as the host.
  • the creator may be a host.
  • the host may restrict entry to the talking room, or the host may permit entry to the talking room.
  • the setting of the host can be performed by the talking table server 307 that manages the talking room.
  • only the host may be given the authority to cause another participant to leave the talking room.
  • the authority of the host may be arbitrarily transferable to another participant.
  • the virtual space management server 200 may display a specific scene of a game taking place in the real space or the virtual space on the display 602 as information regarding the event in the real space or the virtual space.
  • the information regarding the event taking place in the real space is not limited to a specific scene of a game, and may be, for example, a specific scene of an event such as a music live show performed in the real space.
  • the information may also be a specific scene of an event such as a music live show or an e-sports tournament performed in the virtual space.
  • the specific scene of the event may be, for example, a highlighted video of a game in real space.
  • the virtual space management server 200 may perform control to display the highlighted video of the game taking place in the real space on the display 602 (display object) as the information regarding the event in the real space.
  • as the highlighted video of the game, for example, a goal scene, a foul scene, or the like is assumed.
  • the specific scene of the event may be a highlighted video of the event taking place in the virtual space.
  • the virtual space management server 200 may use only the information regarding the team that the specific user group supports in the event as the “information regarding the event in the real space” to be displayed on the display 602 (display object). For example, it may be a highlighted video of a game by a specific team. As a result, the user can grasp about which team communication is performed in the talking room associated with the talking table 600 including the display 602 before entering the room. In the present system, for example, only information regarding a specific team is displayed on the display 602, so that fans of the specific team can gather in the talking room.
  • the virtual space management server 200 may transmit the information regarding the room entry selection to the talking room to only users belonging to a specific user group in the event among the users who operate the user avatars approaching the talking table 600. As a result, in the present system, it is possible to limit the user group who can enter the talking room.
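The group-limited entry above is, in essence, a filter over the users approaching the talking table. The sketch below is an assumed illustration; the field names and the `entry_candidates` helper are hypothetical.

```python
def entry_candidates(approaching_users: list, allowed_group: str) -> list:
    """Return the user IDs that should receive the room-entry selection.

    Only users belonging to the specific user group in the event (for
    example, fans of one team) are offered entry to the group-limited
    talking room; others approaching the table receive nothing.
    """
    return [u["id"] for u in approaching_users
            if u.get("group") == allowed_group]
```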
  • Fig. 18 is a sequence diagram illustrating an example of operation processing from generation of the talking table 600 to entry according to an embodiment of the present disclosure.
  • the talking table server 307 generates a talking room in response to a request for generating a talking room (step S203) from the virtual space management server 200 (step S206). Then, the talking table server 307 transmits the generated session information to the talking room to the virtual space management server 200 (step S209).
  • the virtual space management server 200 associates the session information with the communication point set in the virtual space. Note that, here, the generation request from the virtual space management server 200 will be described as an example, but the present disclosure is not limited thereto, and the client terminal 10 may make a generation request.
  • the talking table server 307 transmits the generated session information to the talking room to the client terminal 10.
  • the client terminal 10 associates the session information with the communication point set in the virtual space. Furthermore, the client terminal 10 transmits the communication point and the session information set in the virtual space to the virtual space management server 200 and shares the communication point and the session information with other users.
  • the virtual space management server 200 transmits a text chat room generation request to the text chat server 304 (step S212).
  • the text chat server 304 generates a text chat room (step S215), and transmits session information to the text chat room to the virtual space management server 200 (step S218).
  • the virtual space management server 200 transmits a voice chat room generation request to the voice chat server 305 (step S221).
  • the voice chat server 305 generates a voice chat room (step S224), and transmits session information to the voice chat room to the virtual space management server 200 (step S227).
  • the virtual space management server 200 stores each session information (step S230).
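The generation sequence of steps S203 to S230 can be sketched as three room-creation requests whose returned session information is collected and stored. This is a hedged illustration only: the nested helpers stand in for requests to the talking table server 307, the text chat server 304, and the voice chat server 305, and the session contents are invented placeholders.

```python
def create_communication_rooms() -> dict:
    """Sketch of steps S203-S230: create the talking room and its chat
    rooms, then store all session information on the management server."""

    def talking_table_server():     # stands in for S206/S209
        return {"room_id": "talk-1", "endpoint": "talking"}

    def text_chat_server():         # stands in for S215/S218
        return {"room_id": "text-1", "endpoint": "text"}

    def voice_chat_server():        # stands in for S224/S227
        return {"room_id": "voice-1", "endpoint": "voice"}

    sessions = {
        "talking": talking_table_server(),
        "text_chat": text_chat_server(),
        "voice_chat": voice_chat_server(),
    }
    # S230: stored and associated with the communication point
    return sessions
```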
  • the client terminal 10 transmits an entry request to the virtual space management server 200 (step S233).
  • the client terminal 10 may display the room entry selection screen on the basis of the information regarding the room entry selection.
  • the virtual space management server 200 recognizes the intention of the user to enter the room by receiving the room entry request from the client terminal 10 (step S236), and transmits each session information (connection information) corresponding to the talking table 600 to the client terminal 10 (step S239). Furthermore, the virtual space management server 200 may cause the client terminal 10 to transition to a talking room screen (an example of a communication room screen) when the user enters the talking room.
  • an entry request is transmitted to the virtual space management server 200 in a case where the user selects entry, but the present disclosure is not limited thereto.
  • the client terminal 10 may request the virtual space management server 200 to transmit each session information before obtaining the intention of the user to enter the room.
  • the client terminal 10 since the client terminal 10 continuously transmits the position of the user (the position of the user avatar) to the virtual space management server 200, in a case where the user avatar is positioned within a certain distance from the talking table 600 (communication point), the virtual space management server 200 may recognize that there is an intention to enter the room to transmit each session information to the client terminal 10. Alternatively, the virtual space management server 200 may include each session information corresponding to the talking table 600 in the information regarding the selection of entry to be transmitted to the client terminal 10. The client terminal 10 may cause the user to select whether or not to enter the room after receiving each session information from the virtual space management server 200. As a result, the client terminal 10 can perform control to disclose the state in the talking room according to the distance between the talking table 600 and the user avatar before obtaining an intention to enter the talking room from the user.
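The two triggers above (an explicit room-entry request, or the continuously reported avatar position coming within a certain distance of the communication point) can be sketched as one predicate. The threshold and the function name are assumptions for illustration.

```python
def should_send_sessions(distance: float, explicit_request: bool,
                         near_threshold: float = 3.0) -> bool:
    """Decide when the management server sends session information.

    Sessions are sent on an explicit room-entry request, or pre-emptively
    when the avatar comes within `near_threshold` of the talking table,
    so the state in the talking room can be disclosed before the user
    confirms an intention to enter.
    """
    return explicit_request or distance <= near_threshold
```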
  • the client terminal 10 displays the talking room screen, and starts communication connection with each server on the basis of each session information (step S242). Details of the communication connection with each server will be described next with reference to Fig. 19.
  • Fig. 19 is a diagram illustrating details of communication connection between the client terminal 10 and each server at the time of entering the talking room according to an embodiment of the present disclosure.
  • each client terminal 10 that has entered the talking room performs bidirectional communication with the talking table server 307 to acquire room entry user information.
  • the room entry user information includes the user ID of the users entering the room, the user name, information regarding the user avatar, and the like.
  • Each client terminal 10 performs bidirectional communication with the text chat server 304 to transmit and receive text chat. This allows a text chat to be performed in the talking room.
  • Each client terminal 10 performs bidirectional communication with the voice chat server 305 to transmit and receive a voice chat. As a result, a voice chat can be performed in the talking room.
  • Each client terminal 10 performs bidirectional communication with the virtual space management server 200a to transmit and receive an emote, a stamp, and the like in real time. This controls the emote and the stamp of each user avatar displayed in the talking room in real time.
  • Each client terminal 10 is communicably connected to the content distribution server 306, and receives streaming distribution of the content (moving image or the like). As a result, the content (moving image or the like) can be viewed in the talking room.
  • the content specification information can be received from the talking table server 307 or the virtual space management server 200a.
  • the content may be set by the talking table server 307 or the virtual space management server 200a, or may be set by the users entering the room.
  • FIG. 20 is a diagram illustrating an example of a talking room screen according to an embodiment of the present disclosure.
  • the virtual space processing unit 121 of the client terminal 10 causes a screen to transition from a display screen as illustrated in Fig. 14 to a talking room screen 540 as illustrated in Fig. 20.
  • the user avatars entering the room are displayed side by side.
  • Various operation buttons are displayed, such as a stamp icon 546, an emote icon 547, a stamp icon 548, a text input button 549, a microphone ON-OFF button 550, and a menu button 541.
  • the topic window 542 is displayed.
  • A balloon image 544 displaying input text and a stamp image 545 selected by the user may be displayed.
  • Note that the virtual space processing unit 121 may display the balloon image 544 and the stamp image 545 only for several seconds so that no chat log remains visible.
  • a facial expression, a pose, a gesture, or the like is controlled according to the emote selected by the user.
  • the topic window 542 is a virtual object that displays the content viewed in the talking room. There may be a plurality of topic windows 542. Each user may slide a plurality of topic windows 542a to 542c to select the content to pick up. Within the talking room, playback of the picked-up topic window 542 (for example, the centrally located window) begins. Note that the right to pick up (channel right) can be given to all users entering the room.
  • The category of the content displayed in the topic window 542 is assumed to be, for example, news related to IP content (here, a specific soccer team) (top 5 page views (PV) in the last 3 days, etc.), a video that resonates with users among moving images related to IP content (a video with a large number of views, a video that excites users, etc.), a highlighted video of the latest game, a match history, and the like. Furthermore, in the topic window 542, it is also possible for the users entering the room to display private photographs and videos or to share a user's local screen. Furthermore, the category of the content displayed in the topic window 542 may be determined by the virtual space management server 200 or the talking table server 307 according to the place where the talking table 600 is installed or the current time.
  • Fig. 21 is a diagram illustrating another example of the talking room screen according to an embodiment of the present disclosure.
  • On a talking room screen 560 of Fig. 21, only a picked-up topic window 561 may be displayed in a large size, and a text chat window 562 may be displayed.
  • the display ON-OFF switching of the text chat window 562 can be performed by a tap operation on a display switching button 564.
  • ON-OFF switching of the microphone in the talking room can be performed by a tap operation on a voice switching button 565.
  • the talking room screen is not limited to the example illustrated in the drawings.
  • Event coordination: Next, the control of a performance (including a virtual event) in the virtual space according to the development of an event taking place in the real space, where the performance is carried out in cooperation with the event coordination server 308, will be described.
  • the distribution video of the event taking place in the real space may be disposed in the virtual space.
  • the user may view the distribution video of the event taking place in the real space outside the virtual space, such as a television or a window different from a window in which the virtual space is displayed.
  • the performance in the virtual space is determined according to the development of the event taking place in the real space and the reactions of the plurality of users.
  • Fig. 22 is a diagram illustrating a mechanism of event coordination according to an embodiment of the present disclosure.
  • a soccer game performed in real time in real space is assumed as the predetermined event.
  • The event coordination server 308 of the backend system 30 acquires real-time game data at appropriate times and analyzes the game development. In addition, the event coordination server 308 identifies whether the game is in the pre-game, in-game, or post-game phase as the game development. For example, the event coordination server 308 may acquire the text of the game development as game data from a service that livestreams the game development as text, and automatically extract a specific word. Examples of such a service include one that distributes the commentator's audio commentary of the game broadcast as text.
  • Examples of the specific word to be automatically extracted include “free kick”, “foul”, “missed shot”, “shot stopped”, “corner”, “free kick by the enemy”, “goal by *** (enemy team)”, “start of half time”, “player change”, “start of second half”, “yellow card”, “offside”, “goal by *** (ally team)”, and “end of game”.
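The automatic extraction of such specific words from a line of livestreamed text can be sketched as follows. This is a hedged illustration: the keyword list is a subset of the examples above, and the function name is hypothetical.

```python
# Illustrative sketch of extracting the specific words above from a line of
# livestreamed text commentary; the keyword list is a subset of the examples.
KEYWORDS = [
    "free kick", "foul", "corner", "yellow card", "offside",
    "start of half time", "start of second half", "end of game", "goal",
]

def extract_events(commentary_line):
    """Return every keyword that appears in the (lower-cased) commentary line."""
    line = commentary_line.lower()
    return [kw for kw in KEYWORDS if kw in line]
```

A production system would likely need language-specific tokenization and team-name matching rather than plain substring search, but the principle of mapping livestreamed text to discrete game events is the same.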
  • the text of the game development may be distributed independently on the official site of each team.
  • the event coordination server 308 may perform image recognition on the relay video of the game in real time and determine the play.
  • The virtual space management server 200a specifies a corresponding predetermined performance according to the analysis result of the game development acquired from the event coordination server 308 at appropriate times and the reactions of the plurality of users, and instructs each client terminal 10 to immediately carry out the performance in the virtual space.
  • performance according to the development of a game performed in the real space can be carried out in real time in the stadium 430.
  • the users who gather in the stadium 430 existing in the virtual space can enjoy watching the game in the virtual space by experiencing performance in the stadium 430 with other users according to the game development and reactions of a plurality of users.
  • The virtual space management server 200a may acquire the distribution video of the game taking place in the real space from the backend system 30 (for example, the content distribution server 306), transmit the distribution video to each client terminal 10, and instruct each terminal to arrange the distribution video in the virtual space (specifically, in the stadium 430).
  • each client terminal 10 may acquire the distribution video of the game specified by the virtual space management server 200a from the content distribution server 306.
  • the user may view the performance in the stadium 430 from the user viewpoint in the virtual space by the client application activated on the client terminal 10 such as the smartphone or the tablet terminal while viewing the live broadcast of the game on the external device such as the television device or the PC.
  • the client terminal 10 may display the live broadcast of the game in one window displayed on the display unit 150, and may display the video of the virtual space (the state of performance in the stadium 430) in another window displayed on the display unit 150.
  • the virtual space management server 200a acquires the reaction of the user from each client terminal 10, estimates the game development on the basis of the content, and determines the corresponding performance.
  • Examples of the user's reaction include user operation information (a stamp, the number and type of emotes performed, and the like).
  • the user's reaction may be information such as the number of voice chat transmissions, a content, a frequency, and an audio volume.
  • the user's reaction may be information such as the number of text chat transmissions, a content, and a frequency.
  • the virtual space management server 200a can also estimate the game development from the number of transmissions of voice chat or text chat, or the like. Note that the estimation of the game development based on the reaction of the user may be performed by the event coordination server 308.
  • Examples of the performance corresponding to the game development include a performance in which a DJ appears on a stage disposed in the field to excite the venue with music before the game, display of a video of the team practicing or traveling before the game on a large display disposed in the field, and display of a relay video of the players entering the game venue.
  • the starting member announcement performance may be carried out using a large display disposed in the field.
  • Examples thereof can include a kickoff performance at the start of the game and, during the game, a foul performance of an own-team player or a counterpart-team player, a lost-score performance (a performance in which the avatar of the team character appears to encourage the players), a score performance (a performance of joy in which the team character's avatar celebrates while music plays, fireworks are set off, and confetti and balloons appear), a lottery event performance at halftime (presents are dropped), a profile display performance of a star player (displayed on a large display disposed in the field), a yellow-card performance of an own-team player or a counterpart-team player, and an offside performance of an own-team player or a counterpart-team player.
  • examples thereof can include a game end performance at the end of the game, and a victory performance (a performance in which a DJ appears to excite a venue with music) or a defeat performance after the game.
  • The virtual space management server 200a may execute the corresponding performance according to the development of the game and the reactions of the plurality of users. For example, when many users (more than a certain number or a certain percentage of users) press the “GO” stamp, the virtual space management server 200a executes, as a support performance, a performance for causing a non-player character (NPC) disposed in the field to perform a cheering emote or a performance for playing back cheering music. Furthermore, the virtual space management server 200a may treat the number of user reactions as excitement, and increase the size and the number of balloons released onto the field according to the magnitude of the excitement.
  • the virtual space management server 200a may increase the volume of the music for support played back in the field according to the excitement of the user. Furthermore, in a case where the excitement of the user continues, the virtual space management server 200a may extend the length of the support performance from the predetermined time. Furthermore, the virtual space management server 200a may determine the performance on the basis of the reaction having the largest amount among the reactions of the user.
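The excitement-scaled performance described above can be sketched as follows. The `Performance` structure, the majority threshold, and the balloon and volume formulas are illustrative assumptions, not values given in the disclosure.

```python
from dataclasses import dataclass

# Illustrative sketch of scaling a support performance by user reactions as
# described above; the fields, thresholds, and formulas are assumptions.
@dataclass
class Performance:
    play_cheer_music: bool
    balloon_count: int
    music_volume: float  # 0.0 (silent) to 1.0 (maximum)

def support_performance(go_stamp_users, total_users, excitement):
    # Trigger the support performance when more than half of the users press "GO".
    triggered = total_users > 0 and go_stamp_users / total_users > 0.5
    return Performance(
        play_cheer_music=triggered,
        balloon_count=excitement // 10,                 # more balloons as excitement grows
        music_volume=min(1.0, 0.3 + excitement / 200),  # louder music with excitement
    )
```

Extending the duration of the performance while excitement continues, as mentioned above, would amount to re-evaluating this function periodically and postponing the end time while the trigger condition holds.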
  • the virtual space management server 200a may acquire the reaction of the user who operates the user avatar existing in the virtual space managed by the virtual space management server 200a as the reaction of the user.
  • The user's reaction is, for example, user operation information (a stamp, the number and type of emotes performed, and the like).
  • the user's reaction may be information such as the number of voice chat transmissions, a content, a frequency, and an audio volume.
  • the user's reaction may be information such as the number of text chat transmissions, a content, and a frequency.
  • the virtual space management server 200a may further acquire, as the user's reaction, a reaction of the user who operates the user avatar existing in another virtual space managed by another virtual space management server that carries out performance according to the game development targeting a game same as a game targeted by the virtual space management server 200a.
  • a user's reaction may also be user operation information, voice chat information, or text chat information.
  • the virtual space management server 200a may further acquire a reaction of the user who operates the user avatar existing in another virtual space managed by another virtual space management server transmitting the same distribution video.
  • the reaction of the user is not limited to the reaction of the user operating the user avatar, and may be a reaction of an audience watching a game on site in real space or a reaction of a viewer watching a game on a television or the like.
  • a reaction may be, for example, a facial expression, a motion, a cheer, or the like.
  • a reaction is sensed by various sensors such as a camera, an acceleration sensor, and a microphone.
  • the virtual space management server 200a can appropriately acquire information regarding a reaction of the audience or a reaction of the viewer in real time.
  • The virtual space management server 200a may determine the emote of the avatar existing in the virtual space as the performance to be determined according to the development of the game and the reactions of the plurality of users. For example, in a case where the virtual space management server 200a identifies that the game is being played as the game development, and further identifies that many users are performing support to encourage the team, such as selecting the “GO” stamp, an emote in which the plurality of avatars existing in the virtual space wave a towel to support the team, or the like, is reflected. At this time, the virtual space management server 200a may reflect the emote on the avatars in the non-operation state in the virtual space, may reflect the emote on the NPCs, or may reflect the emote on all the avatars. Furthermore, the virtual space management server 200a may reflect the emote on all the avatars in the non-operation state and all the NPCs, or may reflect the emote on some of the avatars in the non-operation state and some of the NPCs.
  • the game development largely includes before the game, during the game (first half), half time, during the game (second half), and after the game.
  • A performance event that is carried out in the stadium 430 in accordance with the real-time development of a game in the real space (in an embodiment of the present disclosure, also referred to as a “live stadium”) is notified to each user avatar.
  • the user avatar in the area moves to the stadium 430.
  • the user enters the stadium 430, joins another user avatar (another user registered as a friend), and waits for the start of the game while talking with a friend in a group chat, for example.
  • FIG. 23 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in Fig. 23, on a display screen 710, the starting member information, the video of the state of the team before the game, and the like are displayed on a large display 711 disposed in the field (pitch) in the stadium in the virtual space.
  • A starting member announcement performance and a performance of making avatars of the starting member players appear on the stage may be carried out.
  • the users gathering in the field are further excited.
  • A relay of the players entering the game venue is displayed on the large display 711, and the excitement of the users gathering in the field reaches its peak.
  • Fig. 24 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
  • display windows 721 and 722 of the large display 711 and a text chat window 723 are displayed on the front.
  • Such display switching can be appropriately performed by a user operation.
  • the user can use the text chat window 723 to text chat with other users in the area.
  • the user can also perform voice chat with a member in a group generated with other users registered as friends. These text chats and voice chats may be done using the backend system 30.
  • an avatar video of a team character who appears on a stage disposed in a field of the virtual space and excites the venue is displayed.
  • The same support stamp as the support stamp selected by the majority of users in the field is displayed, or a support stamp prompting the user to make a selection is displayed. With such a performance, the user can have a sense of unity, and the venue can be excited.
  • the virtual space processing unit 121 may appropriately play back music or audio in accordance with a performance instruction from the virtual space management server 200.
  • the support stamp includes, for example, NICE, BOO, GOAL, and the like.
  • Fig. 25 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
  • a display screen 730 illustrated in Fig. 25 illustrates an example of the performance for lost score.
  • In a case where the enemy team has scored a goal, a performance in which the team character takes a depressed pose is carried out.
  • the support stamp button 450 of “OMG” is displayed above the head of the user avatar U.
  • Furthermore, a performance in which the large display 731 displays the OMG! support stamp is carried out.
  • Fig. 26 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in Fig. 26, when a support button 742 of synchronization (SNYC) illustrated on a display screen 740 is selected, the virtual space processing unit 121 causes the user avatar U to perform the same cheering emote as a team character 741. As a result, the user in the venue can perform the cheering emote together with the team character and the other users.
  • Fig. 27 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in a display screen 750 of Fig. 27, for example, a large number of boxes fall from the sky, and when the user operates the user avatar U to pick up a box, a present such as a signed ball or a uniform may be won by lottery.
  • a score performance is carried out.
  • In the score performance, for example, music is played, fireworks are set off, confetti and balloons fall, and a team character dances.
  • Fig. 28 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in a display screen 760 of Fig. 28, on the field, a performance in which fireworks are set off and confetti and balloons fall is carried out. Furthermore, on a large display 761, information regarding a player who has scored a goal, information regarding a player of the opposing team, scores, a support stamp, a video of a team character, and the like are displayed. Furthermore, a performance in which a huge avatar 762 of the player who has scored a goal appears on the stage can also be carried out. In this way, a performance that excites the venue is carried out when the team scores a goal.
  • NPCs may be disposed in the spectator seats around the field and on the field, and the performance emote may be performed by the virtual space processing unit 121 as appropriate.
  • For example, the virtual space processing unit 121 causes the NPC to perform a cheering emote when an own-team player is on the offensive, shooting, or taking a free kick.
  • the virtual space processing unit 121 causes the NPC to perform a joy emote.
  • the virtual space processing unit 121 causes the NPC to perform a booing emote.
  • the virtual space processing unit 121 causes the NPC to perform a despair emote.
  • the virtual space management server 200 may instruct each client terminal 10 to cause the user avatar in the non-operation state to automatically carry out a performance such as an emote or a stamp in accordance with the overall performance.
  • an end-time performance is carried out.
  • In a case where the team wins the game, the venue may be further energized by a winning performance (such as the reappearance of a DJ), or in a case where the team loses the game, a performance honoring the players may be carried out.
  • an interview video of the manager and the player after the game may be displayed on the large display of the field.
  • a team character may appear on a stage to perform a winning emote or to play a team's cheer song.
  • Users may move to a place where fans can interact with each other, such as the bar 440 described with reference to Figs. 10 and 12, and talk about that day's game using the talking table 600.
  • The video of that day's game may be shared.
  • the performance according to the present disclosure is not limited to the above-described specific example, and various performances can be considered. Furthermore, here, the performance is carried out according to the game development as an example, but the present disclosure is not limited thereto, and for example, the performance may be carried out according to the development of a music live concert, a lecture, a recital, or the like performed in the real space.
  • the client terminal 10 reproduces the game by disposing each player object and the ball in the stadium 430 (see Fig. 10) in the virtual space on the basis of the game content data received from the content distribution server 306, moving each player object according to the bone data of each player tracked during the game, and moving the virtual ball on the basis of the tracking data of the ball.
  • The client terminal 10 can display a video from any camera position among a large number of camera positions.
  • the camera position may be selected by the user or may be automatically selected by the client terminal 10.
  • Fig. 29 is a diagram illustrating an example of a camera position according to an embodiment of the present disclosure.
  • Fig. 29 illustrates, for example, a camera position 810, a camera position 811, a camera position 812, and a camera position 813.
  • In the camera position 810: Basic, a virtual camera C is disposed on an extension line connecting the goal and the ball, and the camera follows as the ball moves.
  • the virtual camera C is disposed at a position where a wide view angle can be obtained behind the ball in the sky.
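Placing the virtual camera C on the line connecting the goal and the ball can be sketched as follows. The coordinate convention, the distance value, and the function name are illustrative assumptions.

```python
# Illustrative sketch of the Basic camera position 810: the virtual camera C is
# placed on the extension line connecting the goal and the ball, behind the ball.
def basic_camera_position(goal, ball, distance=8.0):
    """Return a camera position `distance` past the ball along the goal-to-ball line."""
    direction = tuple(b - g for g, b in zip(goal, ball))
    length = sum(c * c for c in direction) ** 0.5 or 1.0
    unit = tuple(c / length for c in direction)
    return tuple(b + u * distance for b, u in zip(ball, unit))
```

Re-evaluating this position every frame as the tracked ball moves would make the camera follow the ball, as described above; the Bird's eye position 811 would instead add a fixed height offset behind the ball.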
  • Each camera position described above is a position at which the player is viewed from above, but the present disclosure is not limited thereto, and the client terminal 10 may set the position (viewpoint) of a specific player as the camera position and provide the subjective video of the player.
  • the viewpoint of the goalkeeper is set as the camera position, and a scene seen by the goalkeeper is provided.
  • In the camera position 813: Shooter, the viewpoint of the player who shoots at the end of each scene (also referred to as a chapter) is set as the camera position, and a scene seen by the player is provided.
  • Fig. 30 is a diagram illustrating a game reproduction UI in the case of the camera position 810: Basic illustrated in Fig. 29. As illustrated in Fig. 30, a game reproduction UI 820 displays the video acquired by the virtual camera C disposed on the extension line connecting the goal and the ball. Note that a screen outline of the game reproduction UI 820 illustrated in Fig. 30 will be described later.
  • Fig. 31 is a diagram illustrating the game reproduction UI in the case of the camera position 811: Bird's eye illustrated in Fig. 29.
  • In the game reproduction UI 830, the video acquired by the virtual camera C disposed at a position where a wide view angle can be obtained behind the ball in the sky is displayed. Note that the screen outline of the game reproduction UI 830 illustrated in Fig. 31 is common to the game reproduction UI 820 illustrated in Fig. 30.
  • A screen outline of the game reproduction UI 820 will be described with reference to Fig. 30.
  • the game reproduction UI 820 displays, for example, an exit icon 821, a notification icon 822, an angle changeable label 823, a camera position switching icon 824, an avatar mode switching icon 825, a highlight label 826, a game progress information display 827, and a display switching icon 828.
  • the arrangement and shape of each display are not limited to the example illustrated in Fig. 30.
  • the exit icon 821 is an operation button for ending the game reproduction mode by the display control of the player object based on the tracking data (specifically, bone data) or the like of each player according to an embodiment of the present disclosure.
  • the game reproduction mode according to an embodiment of the present disclosure may be highlight reproduction in which reproduction is performed using tracking data of one or more specific scenes in tracking data of one game. Such a game reproduction mode by highlight reproduction performed in the stadium 430 in the virtual space is also referred to as a “highlight stadium” in the present specification.
  • the notification icon 822 indicates the presence or absence of notification to the user. For example, when a display indicating the number of notifications is displayed on the notification icon 822 and the notification icon 822 is tapped, a notification list is popped up.
  • the angle changeable label 823 is a display indicating whether or not the user can arbitrarily operate the camera angle.
  • In the camera position 812: GK and the camera position 813: Shooter illustrated in Fig. 29, the camera angle (that is, the line-of-sight direction) is set to be inoperable by the user.
  • The user can change the camera angle in the up, down, left, and right directions by dragging or flicking any place on the screen up, down, left, and right with one finger or the like.
  • the camera position switching icon 824 is an operation button for switching a large number of camera positions described with reference to Fig. 29.
  • The user can switch the video of the game reproduction UI 820 to the video of the camera position corresponding to the tapped icon by tapping any icon among the Basic, Bird's eye, GK, and Shooter icons included in the camera position switching icon 824.
  • the avatar mode switching icon 825 is an operation button for changing the viewing mode of the game reproduction to the avatar mode.
  • the viewing mode includes, for example, a View mode and an avatar mode.
  • The avatar mode switching icon is displayed in the View mode, which is the default.
  • the user avatar is not displayed as illustrated in Figs. 28 and 29.
  • the avatar mode a user avatar is displayed within the field, allowing the user to manipulate the user avatar to view the game from a free perspective.
  • a display example of the avatar mode will be described later with reference to Fig. 34.
  • the highlight label 826 is a display indicating that the function currently used by the user in the stadium 430 (see Fig. 10) in the virtual space is the highlight stadium (game reproduction mode of highlight reproduction). Note that, in the stadium 430 in the virtual space according to an embodiment of the present disclosure, the function of the staging event “live stadium” performed in accordance with the development of the real time game in the real space described with reference to Figs. 20 to 26 can also be used. In this case, for example, a label indicating “LIVE VIEW” is displayed on the screen.
  • the game progress information display 827 is a display indicating the score of the game being reproduced, the opponent, the elapsed time, and the like.
  • the display switching icon 828 is an operation button for switching the playback indicator display. Each time the display switching icon 828 is tapped, the display of the playback indicator is switched between ON and OFF. A display example of the playback indicator will be described with reference to Fig. 32.
  • Fig. 32 is a diagram illustrating a display example of the playback indicator according to an embodiment of the present disclosure.
  • a playback indicator 840 is displayed at the lower portion of the screen in response to tapping of the display switching icon 828.
  • the display position of the playback indicator 840 is not particularly limited.
  • The playback indicator 840 includes a seek bar 841, a five second return button 842, a previous chapter jump button 843, a playback and stop button 844, a slow playback button 845, a next chapter jump button 846, a five second forward button 847, and a chapter list display button 848.
  • the seek bar 841 indicates the playback position of the reproduced game.
  • the game reproduced in an embodiment of the present disclosure includes an aggregate of one or more specific scenes (hereinafter, also referred to as a chapter). Therefore, the seek bar 841 may have a break for respective chapters as illustrated in Fig. 32.
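Mapping per-chapter breaks onto the seek bar 841 can be sketched as follows. The `(in_point, out_point)` representation in seconds and the function name are assumptions for illustration.

```python
# Illustrative sketch of placing per-chapter breaks on the seek bar 841: each
# chapter's duration is mapped to a normalized break position between 0.0 and 1.0.
def seek_bar_breaks(chapters):
    """chapters: list of (in_point, out_point) pairs; returns break fractions."""
    total = sum(out - in_ for in_, out in chapters)
    breaks, elapsed = [], 0.0
    for in_, out in chapters[:-1]:  # no break after the final chapter
        elapsed += out - in_
        breaks.append(elapsed / total)
    return breaks
```

Each returned fraction marks where one chapter ends relative to the total highlighted duration, which is where a break would be drawn on the seek bar.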
  • Fig. 33 is a diagram illustrating a display example of a chapter list according to an embodiment of the present disclosure.
  • When the chapter list display button 848 is tapped, a chapter list 852 is displayed.
  • the chapter list 852 includes a list of one or more specific scenes (chapter) in the game reproduced in an embodiment of the present disclosure.
  • By selecting a chapter from the chapter list 852, the user can jump the playback time of the game video displayed on the game reproduction UI 850 to the playback start time of the selected chapter.
  • Fig. 34 is a diagram illustrating a display example of the game reproduction UI in the case of the avatar mode according to an embodiment of the present disclosure.
  • a user avatar U is displayed on the game reproduction UI 860 in the avatar mode.
  • the user may operate the controller 521 or the jump icon 522 to move the user avatar U within the field.
  • Since the camera position (that is, the user viewpoint) can be freely operated in the avatar mode, the display mode of the angle changeable label 823 changes as illustrated in Fig. 34.
  • the collision detection between the user avatar U and the virtual object in the field such as the player object and the ball is not performed.
  • the client terminal 10 may not be able to perform scene control such as pause or fast forward.
  • Fig. 35 is a diagram for explaining a highlight playback of a game according to an embodiment of the present disclosure.
  • the client terminal 10 first acquires In and Out point information regarding each chapter in one game.
  • the In and Out point information is included in the game content data distributed from the content distribution server 306.
  • In and out point information regarding each scene may be automatically set from the game situation in the game data server 50 in advance.
  • the client terminal 10 searches for the In point of the scene 1 among the tracking data of the players and the ball included in the game content data and starts playback (game reproduction by moving the player objects), and after performing playback up to the Out point of the scene 1, searches for the In point of the next scene (scene 2) and starts playback of the next scene. Then, the client terminal 10 repeats the above steps until all the scenes (scenes 1 to 4 in the example illustrated in Fig. 35) set in one game are played back.
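The playback loop just described can be sketched as follows. The `seek` and `play_until` callbacks are hypothetical stand-ins for the client terminal's playback control over the tracking data.

```python
# Illustrative sketch of the highlight playback loop described above: each scene
# is played from its In point to its Out point, in order, until all scenes are done.
def play_highlights(chapters, seek, play_until):
    for in_point, out_point in chapters:
        seek(in_point)         # search for the In point of the scene
        play_until(out_point)  # reproduce the game up to the Out point
```

In the Fig. 35 example this loop would run over scenes 1 to 4, seeking each In point in the tracking data and reproducing the player objects up to the corresponding Out point.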
  • An information processing apparatus including: circuitry configured to: provide information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and provide information regarding the communication based on the avatar entering a second area located outside the first area, wherein the event is a subject of conversation in the communication room.
  • the display object includes information regarding the event taking place in the real space or the virtual space.
  • the information regarding the event includes information regarding a team that a specific user group supports in the event.
  • the display object includes a specific scene of the event.
  • the specific scene of the event includes a highlighted video of the event.
  • the circuitry is further configured to: receive a selection of the information regarding selection of entry to the communication room from the user to enter the communication room; and provide, based on the selection from the user to enter the communication room, connection information corresponding to the communication room to the user.
  • the information regarding selection of entry to the communication room includes an option to enter the communication room as a speaker and an option to enter the communication room as an audience.
  • the information regarding the communication includes audio information in the communication room, and wherein the circuitry is further configured to increase a volume of the audio information based on a decrease in distance between the avatar and a center of the virtual object.
  • the circuitry is further configured to generate the communication room including content viewed in the communication room by the user.
  • the circuitry is further configured to perform bidirectional communication with the user based on the user entering the communication room.
  • the circuitry is further configured to generate the communication room including an emote icon for displaying an emote menu screen; and initiate display of an emote in the communication room based on a selection from the emote menu screen by the user.
  • the circuitry is further configured to generate the communication room including a stamp icon for displaying a stamp menu screen; and initiate display of a stamp in the communication room based on a selection from the stamp menu screen by the user.
  • An information processing method including: providing information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and providing information regarding the communication based on the avatar entering a second area located outside the first area, wherein the event is a subject of conversation in the communication room.
  • An information processing apparatus including: circuitry configured to: provide, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, wherein information regarding selection of entry to a communication room is provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed, wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and wherein the event is a subject of conversation in the communication room.
  • An information processing method including: providing, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, wherein information regarding selection of entry to a communication room is provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed, wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and wherein the event is a subject of conversation in the communication room.
  • (B-1) An information processing system including a control unit that performs control to dispose, in a virtual space, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, the event being a subject of conversation in a communication room, in a case where a position of an avatar operated by a user in the virtual space is included in a first area that is a communication point associated with the communication room in which communication between users is performed, control to transmit information regarding selection of entry to the communication room to a terminal used by the user, and in a case where a position of the avatar in the virtual space is included in a second area located outside the first area, control to transmit information regarding the communication to the terminal.
  • (B-2) The information processing system according to Item (B-1), in which the display object includes a display object that displays a specific scene of an event taking place in a real space.
  • (B-3) The information processing system according to Item (B-2), in which the display object includes a display object that displays a highlighted video of a game played in a real space.
  • (B-4) The information processing system according to Item (B-1), in which the display object includes a display object that displays information regarding a team that a specific user group supports in the event.
  • (B-5) The information processing system according to Item (B-4), in which the display object includes a display object related to a team that a specific user group supports in the event, and the information regarding selection of entry is transmitted to a terminal used by a user belonging to the specific user group.
  • (B-6) The information processing system according to any one of Items (B-1) to (B-5), in which the control unit transmits connection information for acquiring information by one or more different communication means used in the communication room to a terminal used by the user in a case where a request for entry of a user into the communication room based on a user input to the information regarding selection of entry is received from the terminal.
  • (B-7) The information processing system according to Item (B-6), in which the information by the communication means includes information regarding an audio, a character, or an image transmitted from a terminal of each of users who are in the communication room.
  • (B-8) The information processing system according to any one of Items (B-1) to (B-7), in which the information regarding selection of entry to the communication room includes information for connecting to a server that generates the communication room.
  • (B-9) The information processing system according to Item (B-8), in which the server performs control to transmit information regarding users who are in the communication room to a terminal of each user.
  • (B-10) The information processing system according to any one of Items (B-1) to (B-9), in which the control unit performs control to transmit information regarding a distribution content viewed in the communication room to the terminal of the user.
  • (B-11) The information processing system according to Item (B-10), in which the information regarding the distribution content includes information specifying the distribution content and information for connecting to a content distribution server to which the distribution content is distributed.
  • (B-12) The information processing system according to any one of Items (B-1) to (B-11), in which the control unit receives information regarding display of a user avatar displayed on a communication room screen displayed on the terminal of the user from the terminal of the user to transmit the information in real time to a terminal of another user who is in the communication room.
  • (B-13) The information processing system according to Item (B-12), in which the information regarding the display of the user avatar includes emote information for moving the user avatar or stamp information that is an image displayed near the user avatar and indicating emotional expression.
  • (B-14) The information processing system according to any one of Items (B-1) to (B-13), in which, in a case where a user avatar corresponding to the user moves within a predetermined distance from the communication point, the control unit recognizes an intention to enter the communication room, and performs control to transmit connection information for entering the communication room to the terminal of the user.
  • (B-15) The information processing system according to any one of Items (B-1) to (B-14), in which the control unit performs control to instruct the terminal of the user to transition to a communication room screen when the user enters the communication room.
  • (B-16) The information processing system according to Item (B-15), in which the communication room screen displays a user avatar of a user who is in the communication room.
  • (B-17) The information processing system according to Item (B-16), in which a character or a stamp image input by a user corresponding to the user avatar is displayed near the user avatar.
  • (B-18) The information processing system according to any one of Items (B-15) to (B-17), in which the communication room screen displays a content.
  • (B-19) An information processing method including, by a processor: disposing, in a virtual space, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, the event being a subject of conversation in a communication room; in a case where a position of an avatar operated by a user in the virtual space is included in a first area that is a communication point associated with the communication room in which communication between users is performed, transmitting information regarding selection of entry to the communication room to a terminal used by the user; and in a case where a position of the avatar in the virtual space is included in a second area located outside the first area, transmitting information regarding the communication to the terminal.
  • (B-20) A program causing a computer to function as a control unit that performs control to dispose, in a virtual space, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, the event being a subject of conversation in a communication room, in a case where a position of an avatar operated by a user in the virtual space is included in a first area that is a communication point associated with the communication room in which communication between users is performed, control to transmit information regarding selection of entry to the communication room to a terminal used by the user, and in a case where a position of the avatar in the virtual space is included in a second area located outside the first area, control to transmit information regarding the communication to the terminal.
  • (B-21) An information processing system including a control unit that disposes a distribution video of an event taking place in a real space in a virtual space in which a user can view the distribution video of the event, and determines event coordinated performance experienced by a plurality of users in the virtual space according to development of the event taking place in the real space and a reaction of the plurality of users.
  • (B-23) The information processing system according to any one of Items (B-21) to (B-22), in which the performance is determined on the basis of a reaction with a largest reaction amount among reactions of the plurality of users.
  • (B-24) The information processing system according to Item (B-21), in which the performance is an emote of an avatar, and the emote is reflected in one or more avatars existing in the virtual space.
  • (B-25) The information processing system according to Item (B-24), in which the emote of the avatar is reflected in an avatar in a non-operated state existing in the virtual space.
  • (B-26) The information processing system according to Item (B-24), in which the emote of the avatar is reflected in an NPC existing in the virtual space.
  • 20 Virtual space system
  • 200 (200a, 200b, 200c, ...) Virtual space management server
  • 210 Communication unit
  • 220 Control unit
  • 221 Virtual space management unit
  • 222 Content management unit
  • 223 User management unit
  • 230 Storage unit
  • 10 Client terminal
  • 110 Communication unit
  • 120 Control unit
  • 121 Virtual space processing unit
  • 122 Display control unit
  • 130 Operation input unit
  • 140 Sensor
  • 150 Display unit
  • 160 Audio output unit
  • 170 Storage unit
  • 30 Backend system
  • 301 User information management server
  • 302 Hosting server
  • 303 Matching server
  • 304 Text chat server
  • 305 Voice chat server
  • 306 Content distribution server
  • 307 Talking table server
  • 308 Event coordination server
  • 309 Trading management server
  • 310 Data analysis server

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

There is provided an information processing apparatus including circuitry configured to provide information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, and provide information regarding the communication based on the avatar entering a second area located outside the first area, the event being a subject of conversation in the communication room.

Description

INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM FOR COMMUNICATION POINTS REGARDING EVENTS

CROSS REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of Japanese Priority Patent Application JP 2022-181346 filed on November 11, 2022, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an information processing system, an information processing method, and a program.
In recent years, as one form of real-time communication using an image, an audio, or a character, a system capable of performing communication through an avatar operated by each user in a virtual space in which 3D models are disposed has become widespread. In such a system, for example, a video from an arbitrary viewpoint (free viewpoint) according to a user operation is generated and provided to the user as the video of the virtual space. The video of the virtual space is provided using a display device such as a head mounted display (HMD) that covers the entire field of view of the user, a smartphone, a tablet terminal, or a personal computer (PC).
Regarding communication between users performed in a virtual space, for example, PTL 1 below discloses a technology for establishing communication between communication terminals used by respective users in a case where scheduled users gather in a predetermined space and at a predetermined time in the virtual space.
PTL 1: WO 2013/054748
Summary
However, in the above-described technology, it is necessary to determine the participants, the space, and the time of communication in advance. Further improvement in the convenience of communication between some users in the virtual space may be desired.
Therefore, the present disclosure proposes an information processing system, an information processing method, and a program capable of further enhancing convenience of communication between some users at a predetermined place in a virtual space.
According to an aspect of the present disclosure, there is provided an information processing apparatus including: circuitry configured to: provide information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and provide information regarding the communication based on the avatar entering a second area located outside the first area, the event being a subject of conversation in the communication room.
Further, according to another aspect of the present disclosure there is provided an information processing method including: providing information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and providing information regarding the communication based on the avatar entering a second area located outside the first area, the event being a subject of conversation in the communication room.
Further, according to another aspect of the present disclosure, there is provided an information processing apparatus including: circuitry configured to: provide, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, wherein information regarding selection of entry to the communication room is provided to a user based on an avatar operated by the user entering a first area that is the communication point, wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and wherein the event is a subject of conversation in the communication room.
Further, according to another aspect of the present disclosure, there is provided an information processing method including: providing, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, information regarding selection of entry to a communication room being provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed, information regarding the communication being provided based on the avatar entering a second area located outside the first area, and the event being a subject of conversation in the communication room.
Fig. 1 is a diagram illustrating an overall configuration of a virtual space provision system according to an embodiment of the present disclosure.
Fig. 2 is a diagram for explaining each space included in a virtual space S.
Fig. 3 is a diagram for explaining a space by genre included in a sport space S3 according to an embodiment of the present disclosure.
Fig. 4 is a block diagram illustrating an example of a configuration of a virtual space management server 200 according to an embodiment of the present disclosure.
Fig. 5 is a diagram illustrating a specific example of service servers included in a backend system 30 according to an embodiment of the present disclosure.
Fig. 6 is a block diagram showing an example of a configuration of a client terminal 10 according to an embodiment of the present disclosure.
Fig. 7 is a sequence diagram illustrating an example of a flow of connection processing to a virtual space according to an embodiment of the present disclosure.
Fig. 8 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure.
Fig. 9 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure.
Fig. 10 is a diagram illustrating an example of a simple map of an IP content specific area according to an embodiment of the present disclosure.
Fig. 11 is a diagram illustrating a mechanism of game reproduction using bone data of a player according to an embodiment of the present disclosure.
Fig. 12 is a diagram illustrating a display screen inside a bar on which a talking table 600 is disposed according to an embodiment of the present disclosure.
Fig. 13 is a diagram for explaining an example of the shape of the talking table 600.
Fig. 14 is a diagram for describing a case where a state in a talking room is disclosed according to a distance to the talking table 600 according to an embodiment of the present disclosure.
Fig. 15 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar in a case where the position in the virtual space of the user avatar operated by the user is included in the area E2.
Fig. 16 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar in a case where the position in the virtual space of the user avatar operated by the user is included in the area E1.
Fig. 17 is a diagram illustrating an example of effect display of the talking table 600 according to the excitement in the talking room according to an embodiment of the present disclosure.
Fig. 18 is a sequence diagram illustrating an example of operation processing from generation of the talking table 600 to entry according to an embodiment of the present disclosure.
Fig. 19 is a diagram illustrating details of communication connection between a client terminal 10 and each server at the time of entering the talking room according to an embodiment of the present disclosure.
Fig. 20 is a view illustrating an example of a talking room screen according to an embodiment of the present disclosure.
Fig. 21 is a diagram illustrating another example of the talking room screen according to an embodiment of the present disclosure.
Fig. 22 is a diagram for explaining a mechanism of event coordination according to an embodiment of the present disclosure.
Fig. 23 is a diagram illustrating an example of a display screen in a live stadium according to an embodiment of the present disclosure.
Fig. 24 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
Fig. 25 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
Fig. 26 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
Fig. 27 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
Fig. 28 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure.
Fig. 29 is a diagram illustrating an example of a camera position according to an embodiment of the present disclosure.
Fig. 30 is a diagram illustrating a game reproduction UI in a case of a camera position 810: Basic illustrated in Fig. 29.
Fig. 31 is a diagram illustrating a game reproduction UI in a case of a camera position 811: Bird's eye illustrated in Fig. 29.
Fig. 32 is a diagram for describing a screen outline of a game reproduction UI according to an embodiment of the present disclosure.
Fig. 33 is a diagram illustrating a display example of a chapter list according to an embodiment of the present disclosure.
Fig. 34 is a diagram illustrating a display example of a game reproduction UI in the case of an avatar mode according to an embodiment of the present disclosure.
Fig. 35 is a diagram for explaining a highlight playback of a game according to an embodiment of the present disclosure.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in the present specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant explanations are omitted.
Furthermore, the description is given in the following order.
1. Overall configuration of virtual space provision system according to an embodiment of the present disclosure
2. Configuration example
2-1. Configuration example of virtual space management server 200
2-2. Configuration example of backend system 30
2-3. Configuration example of client terminal 10
3. Connection to virtual space
4. Display screen example
5. Specific example of virtual space
5-1. IP content area
5-2. Talking table
5-3. Event coordination
5-4. Game reproduction UI
6. Supplement
<<1. Overall configuration of virtual space provision system according to one embodiment of the present disclosure>>
Fig. 1 is a diagram illustrating an overall configuration of a virtual space provision system according to an embodiment of the present disclosure. The virtual space provision system (an example of an information processing system) according to an embodiment of the present disclosure is an information processing system including one or more virtual space management servers 200 (an example of an information processing device), a client terminal 10 (an example of a terminal) used by each user, and a backend system 30.
The virtual space management server 200 constructs (generates) and manages a virtual space (VR) in which virtual objects are disposed, and provides virtual space information to one or more client terminals 10 communicably connected via a network 40. The virtual space information includes, for example, information regarding virtual objects disposed in the virtual space, map information regarding the virtual space, information regarding an event (hereinafter, also referred to as a virtual event) taking place in the virtual space, positional information regarding each user in the virtual space, and the like. The virtual space can be reproduced (copied) by each client terminal 10 on the basis of these pieces of information. Specifically, each client terminal 10 arranges the virtual objects on the basis of the map information regarding the virtual space, and draws the virtual space video from the user viewpoint (user avatar viewpoint). Note that the virtual space management server 200 may generate a video from an arbitrary viewpoint (for example, a user viewpoint) in the virtual space and transmit the video to the client terminal 10.
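The reproduction step described above can be sketched roughly: the client arranges objects from the received map information and then selects what to draw from the user (avatar) viewpoint. The function names, record shape, and distance-based visibility rule below are assumptions for illustration only; the disclosure does not specify a wire format or culling method.

```python
def reproduce_space(map_info):
    """Arrange virtual objects locally from server-sent map information
    (here assumed to be a list of {object_id, position} records)."""
    return {entry["object_id"]: tuple(entry["position"]) for entry in map_info}

def visible_objects(scene, avatar_pos, view_radius):
    """Choose what to draw from the user (avatar) viewpoint: here,
    simply the objects within view_radius of the avatar position."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sorted(oid for oid, pos in scene.items()
                  if dist(pos, avatar_pos) <= view_radius)

# A client receiving two objects reproduces them and draws only the near one:
scene = reproduce_space([
    {"object_id": "table", "position": [0, 0, 0]},
    {"object_id": "stage", "position": [50, 0, 0]},
])
```

A real client would of course perform full 3D rendering rather than radius filtering; the point is only that the terminal rebuilds the scene locally from the shared map information.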
In the virtual space, for example, a virtual object formed by a 3DCG model is disposed. The virtual space may be a space in which an action of the user is limited to a behavior scenario prepared in advance, as in a game, or may be a space in which the user can act freely without being limited to a behavior scenario. Furthermore, the virtual space may be a space in which an interaction with another user can be performed by bidirectionally transmitting and receiving an audio, a character, or the like. Furthermore, the virtual space may be a space designed according to an original world view, or may be a space called a so-called mirror world that faithfully reproduces buildings, facilities, roads, cities, streets, or the like existing in the real space. Furthermore, in the virtual space, interaction with other users, educational exchange, social activities such as work, and economic activities such as the sale or purchase of products can also be performed. The virtual space may be a space called a metaverse.
In the virtual space, there may be a user avatar that is a virtual self of the user. The user avatar is generated by, for example, a character 3D model. Each user can arbitrarily customize the appearance of his/her character 3D model. The user avatar may be operated by the user and moved in the virtual space. In addition, a gesture or a facial expression of the user avatar may also be operated by the user. In the present specification, the positional information regarding the user in the virtual space may be information indicating the position of the user avatar. Furthermore, the user viewpoint in the virtual space may be a viewpoint of the user avatar or a viewpoint including the user avatar within the view angle. The user viewpoint can be arbitrarily switched by a user operation.
The user avatar is one of the virtual objects disposed in the virtual space. The information regarding the user avatar of each user is also shared as virtual space information by the many client terminals 10 communicably connected to the virtual space management server 200. Furthermore, each user can communicate by an audio, a character, or an image in the virtual space. The communication may be performed via the virtual space management server 200 or via the backend system 30. The backend system 30 includes servers (service servers 31, 32, 33, ...) that provide various services to users who use the virtual space. Examples thereof include a server that provides audio communication, a server that provides character communication, and a server that provides image (moving image and still image) communication. The client terminal 10 acquires communication channel information (an example of connection information) from the virtual space management server 200 and is communicably connected to a predetermined service server. Then, the client terminal 10 can communicate, via the service server, with other client terminals 10 communicably connected to the same service server.
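The two-step connection flow above (resolve channel information from the management server, then join the indicated service server) can be sketched as follows. The class names, message shapes, and the "voice" service are invented for illustration and do not appear in the disclosure.

```python
class VirtualSpaceManagementServer:
    """Hands out communication channel information (an example of
    connection information) for each backend service."""

    def __init__(self, channels):
        self._channels = channels  # service name -> channel info

    def get_channel_info(self, service):
        return self._channels[service]

class ServiceServer:
    """A backend service server (e.g. voice chat) that groups the
    connected clients by channel."""

    def __init__(self):
        self._clients = {}  # channel_id -> set of connected client ids

    def connect(self, client_id, channel_id):
        self._clients.setdefault(channel_id, set()).add(client_id)
        return sorted(self._clients[channel_id])  # peers on this channel

# A client terminal first resolves the channel from the management server,
# then connects to the service server using that channel:
mgmt = VirtualSpaceManagementServer({"voice": {"channel_id": "room-1"}})
voice = ServiceServer()
info = mgmt.get_channel_info("voice")
peers = voice.connect("user-a", info["channel_id"])
```

A second client joining the same channel would then see both client ids on the channel, which is the condition for the two terminals to exchange audio, characters, or images with each other.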
Furthermore, in the virtual space provision system according to an embodiment of the present disclosure, in order to distribute the processing load of the virtual space management server 200, the servers may be multiplexed according to the number of users of the virtual space, that is, the number of terminals connected to the virtual space management server 200. In the example illustrated in Fig. 1, a virtual space system 20 includes a plurality of virtual space management servers 200 (200a, 200b, 200c, ...). For example, in a case where it is desired to allow 1000 people to use a certain virtual space, 10 virtual space management servers 200 each connectable to 100 terminals are prepared, and each virtual space management server 200 provides the same virtual space (the same virtual objects and the same map information). Each virtual space management server 200 manages the positional information regarding the users corresponding to the 100 terminals connected to it for communication (positional information regarding those 100 persons). As a result, it is possible to provide the same virtual space to a total of 1000 users, 100 users per server. Across the instances of the virtual space, coarse elements such as the flow of time and the execution of a virtual event are synchronized, but interactions between users take place within each instance.
Multiplexing of servers according to the number of terminals connected, that is, server increase/decrease management may be performed by a service provided by the backend system 30. Specifically, in a case where the number of terminals connected to one virtual space management server 200 exceeds a predetermined number (for example, 100), the backend system 30 may construct a new virtual space management server 200 that provides the same virtual space as the one virtual space management server 200 (the same virtual object and the same map information). Note that a predetermined number of virtual space management servers 200 may be constructed in advance.
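As an illustrative sketch (not part of the disclosed embodiments), the server increase/decrease management described above can be expressed as follows. The `ServerPool` class name and the in-memory connection counters are assumptions for illustration; an actual implementation would provision physical or virtual servers via the hosting service of the backend system 30.

```python
# Illustrative sketch of server increase/decrease management.
# Each virtual space management server accepts up to `capacity` terminals;
# when every existing server is full, a new server providing the same
# virtual space (same virtual object, same map information) is constructed.

class ServerPool:
    def __init__(self, capacity=100):
        self.capacity = capacity
        self.servers = []  # connection count per virtual space management server

    def connect_terminal(self):
        """Assign a terminal to a server with free capacity, constructing
        a new server when all existing servers have reached the limit."""
        for i, count in enumerate(self.servers):
            if count < self.capacity:
                self.servers[i] += 1
                return i  # index of the server the terminal connects to
        self.servers.append(1)  # construct a new server for the same space
        return len(self.servers) - 1
```

With 1000 connecting terminals, the pool above grows to 10 servers of 100 terminals each, matching the numeric example described with reference to Fig. 1.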
The server environment is not particularly limited. For example, the virtual space management server 200 may be a physical server or a virtual server executed on a physical server. The physical server or the virtual server described here may be a server provided by a hosting service, or may be a server prepared by the business operator itself that provides the service of providing the virtual space. Furthermore, the function of one virtual space management server 200 may be realized by a plurality of physical servers or a plurality of virtual servers. Each virtual space management server 200 manages the positional information regarding the users corresponding to the plurality of client terminals 10 connected for communication.
The client terminal 10 is a terminal used by the user. For example, the client terminal 10 is realized by a smartphone, a tablet terminal, a personal computer (PC), a head mounted display (HMD) covering the entire field of view, a glasses-type device, a projector, a console game machine, or the like. The client terminal 10 reproduces the virtual space on the basis of the information regarding the virtual space received from the virtual space management server 200. Furthermore, the client terminal 10 generates a video from the user viewpoint in the virtual space, and displays and outputs the video. In addition, the client terminal 10 also outputs an audio in the virtual space as appropriate. Note that the client terminal 10 may receive the video from the user viewpoint generated by the virtual space management server 200 from the virtual space management server 200 and display and output the video.
The client terminal 10 can control the virtual object disposed in the virtual space according to the user operation. For example, the client terminal 10 controls the position, the gesture, the facial expression, and the like of the user avatar according to the user operation. In addition, the client terminal 10 transmits information (for example, the position, the gesture, the facial expression, and the like of the user avatar) regarding the virtual object changed according to the user operation to the virtual space management server 200 in real time.
Furthermore, the client terminal 10 may transmit various types of information to the virtual space management server 200, another client terminal 10, or the backend system 30 according to a user operation. For example, the client terminal 10 may transmit login information to the virtual space management server 200 or the backend system 30 to request user authentication. Furthermore, the client terminal 10 may transmit the audio of the user, the input character information, the selected image, and the like as the communication information to the virtual space management server 200, another client terminal 10, or the backend system 30.
(Content-specific virtual space)
As an example of the virtual space, in an embodiment of the present disclosure, there is a space for users to interact with each other centering on various kinds of contents (including IP content to be described later) such as a movie, music, sports, or animation. Fig. 2 is a diagram for explaining each space included in a virtual space S. As illustrated in Fig. 2, the virtual space S provided in the virtual space system 20 according to an embodiment of the present disclosure includes a user personal space S1 which is a virtual space for an individual user, a public space S2 which is a virtual space in which users interact with each other regardless of the content, and each virtual space in which users interact with each other regarding each content. Examples of each virtual space in which the users interact with each other regarding each content include a sport space S3 that is a virtual space in which users interact with each other regarding sports, a music space S4 that is a virtual space in which users interact with each other regarding music, a movie space S5 that is a virtual space in which users interact with each other regarding movies, and an animation space S6 that is a virtual space in which users interact with each other regarding animation. The content mentioned here is an example, and the content according to the present disclosure is not limited thereto.
Furthermore, each virtual space in which users interact with each other regarding each content may be a virtual space for each genre that is further hierarchized. Fig. 3 is a diagram for explaining a space by genre included in the sport space S3 according to an embodiment of the present disclosure. As illustrated in Fig. 3, examples thereof include a soccer space S3-1 in which users particularly interact with each other regarding soccer, a basketball space S3-2 in which users particularly interact with each other regarding basketball, and a tennis space S3-3 in which users particularly interact with each other regarding tennis. Furthermore, an example of a virtual space in a lower hierarchy of a space by genre of each sport illustrated in Fig. 3 may be a virtual space in which users (fans) of each sport team interact with each other. Note that the virtual space is not limited to being hierarchized by genre. The virtual space may be a space in which spaces corresponding to various IP content exist in parallel. For example, the virtual space according to an embodiment of the present disclosure may be a virtual space in which a space corresponding to each of a certain baseball team, a certain soccer team, and a certain animation exists in parallel.
In this way, as the virtual space by content, a virtual space for users who are interested in the content to interact with each other can be considered. The user can have a conversation about the content with another user having the same preference and the same interest and enjoy the content more.
The virtual space system 20 may provide all or some of the virtual spaces as illustrated in Figs. 2 and 3. For example, the virtual space system 20 may prepare the user personal space S1, the public space S2, and a virtual space for interaction between specific sport team fans. In an embodiment of the present disclosure, as an example, the virtual space system 20 provides a virtual space for interaction between fans of a specific soccer team.
<<2. Configuration example>>
<2-1. Configuration example of virtual space management server 200>
Fig. 4 is a block diagram illustrating an example of a configuration of a virtual space management server 200 according to an embodiment of the present disclosure. As illustrated in Fig. 4, the virtual space management server 200 includes a communication unit 210, a control unit 220, and a storage unit 230.
(Communication unit 210)
The communication unit 210 transmits and receives data to and from an external device in a wired or wireless manner. The communication unit 210 is communicably connected to each of the client terminal 10 and the backend system 30 by using, for example, a wired/wireless local area network (LAN), Wi-Fi (registered trademark), Bluetooth (registered trademark), a mobile communication network (long term evolution (LTE), fourth generation mobile communication system (4G), fifth generation mobile communication system (5G)), or the like.
(Control unit 220)
The control unit 220 functions as an arithmetic processing device and a control device, and controls the overall operation in the virtual space management server 200 according to various programs. The control unit 220 is realized by an electronic circuit such as a central processing unit (CPU) and a microprocessor, for example. Furthermore, the control unit 220 may include a read only memory (ROM) that stores programs, operation parameters, and the like to be used, and a random access memory (RAM) that temporarily stores parameters and the like that change appropriately.
The control unit 220 appropriately performs processing on the basis of data received from an external device, and controls storage in the storage unit 230 and transmission of data to the external device.
The control unit 220 also functions as a virtual space management unit 221, a content management unit 222, and a user management unit 223.
The virtual space management unit 221 manages the virtual space. Specifically, the virtual space management unit 221 performs generation, acquisition, addition, update, and the like of various types of information for constructing (generating) the virtual space. The various types of information are virtual space information such as, for example, information regarding a 3DCG model which is a virtual object disposed in the virtual space, color settings, an image, map information, virtual event information, sound effects, and background music (BGM).
The virtual object includes a user-operable object, a user-inoperable object, a background image, a staging effect, and the like. The virtual object also includes information regarding each user avatar disposed in the virtual space. Examples of the information regarding each user avatar include appearance information regarding the user avatar, real time user positional information, real time facial expression or gesture information regarding the user avatar, and a profile of the corresponding user. The information regarding each user avatar is managed by the user management unit 223 described later.
The virtual event information includes information regarding time, progress, performance, and the like of the virtual event. The virtual space management unit 221 may notify the client terminal 10 that a predetermined virtual event is performed at a preset time, or may notify the client terminal 10 that a virtual event is performed in a case where a predetermined trigger is detected in the virtual space. The virtual event information includes information necessary for executing the virtual event. In addition, the virtual event may be content. The notification of the virtual event to the client terminal 10 may include information instructing the client terminal 10 to acquire predetermined content from the backend system 30 and execute the content at a predetermined place in the virtual space.
The virtual space management unit 221 appropriately transmits the virtual space information to the client terminal 10. Furthermore, the virtual space management unit 221 can also realize communication between users sharing the virtual space by transmitting communication information such as a character, an audio, or an image received from the client terminal 10 to another client terminal 10.
The virtual space management unit 221 receives the information regarding the user operation from the client terminal 10 and transmits the information as the virtual space information to the other client terminals 10 in real time. In addition, the virtual space management unit 221 performs collision detection calculations according to the information regarding the user operation received from the client terminal 10 and transmits the calculation result to each client terminal 10 at appropriate times. As a result, the statuses of the virtual spaces reproduced by the respective client terminals 10 are synchronized. The information regarding the user operation may include operations related to movement of the user avatar, such as moving and jumping, and operations related to an emotional expression of the user avatar, such as emotes that cause the user avatar to perform a pose, a gesture, or the like, and a stamp displayed above the head of the user avatar.
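As an illustrative sketch, the relay of user operation information described above can be modeled as follows. The per-terminal outbox and the `SyncHub` name are assumptions made for illustration, not the actual implementation of the virtual space management unit 221; a real server would push updates over persistent connections rather than have terminals poll.

```python
# Illustrative sketch of synchronizing user operations across terminals:
# an operation received from one client terminal is relayed to every
# other connected terminal so that the virtual spaces they reproduce
# stay synchronized.

class SyncHub:
    def __init__(self):
        self.outboxes = {}  # terminal id -> list of pending updates

    def register(self, terminal_id):
        self.outboxes[terminal_id] = []

    def submit_operation(self, sender_id, operation):
        """Relay a user operation (e.g. avatar movement, emote, stamp)
        to every terminal except the sender."""
        for tid, outbox in self.outboxes.items():
            if tid != sender_id:
                outbox.append((sender_id, operation))

    def poll(self, terminal_id):
        """Return and clear the pending updates for one terminal."""
        updates, self.outboxes[terminal_id] = self.outboxes[terminal_id], []
        return updates
```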
The content management unit 222 manages information regarding the content shared in the virtual space. For example, the content management unit 222 manages information indicating the content currently shared in a content sharing area provided in the virtual space. The content sharing area may be, for example, a virtual object of one or more large displays disposed in a virtual space, or may be a stage, a forum, a field, or the like on which a 3D content or the like is reproduced. The content itself may be transmitted from the backend system 30 to the client terminal 10. Such content management may be managed by the virtual space management unit 221 as one of the virtual event information described above.
The user management unit 223 manages information regarding a user who uses the virtual space provided by the virtual space management server 200. More specifically, the user management unit 223 manages information regarding a user corresponding to the client terminal 10 to be connected for communication. The information regarding the user includes, for example, the profile of the user (user name, user ID, icon image, and the like), appearance information regarding the user avatar (image, character type, component, or the like), status information regarding the user avatar, information regarding a virtual item owned by the user, information regarding virtual currency or points available to the user in the virtual space, real time user positional information in the virtual space (that is, positional information regarding the user avatar), and motion information regarding a real time gesture, a facial expression, a posture, or a motion state of the user avatar. The information regarding the user is transmitted from the client terminal 10. Furthermore, part of the information regarding the user may be transmitted from the backend system 30.
(Storage unit 230)
The storage unit 230 is realized by a ROM that stores programs, operation parameters, and the like used for processing of the control unit 220, and a RAM that temporarily stores parameters and the like that change as appropriate. The storage unit 230 stores virtual space information.
<2-2. Configuration example of backend system 30>
The backend system 30 includes servers that provide various services. Each server may be provided by a different business operator. In addition, the service provided by each server can be appropriately customized and used by a business operator providing the virtual space.
Each server of the backend system 30 may be communicably connected to the virtual space management server 200 to transmit and receive information, or may communicate with the client terminal 10 to transmit and receive information. The virtual space management server 200 can provide various services to the user who uses the virtual space in cooperation with each server.
Fig. 5 is a diagram illustrating specific examples of the service servers included in the backend system 30 according to an embodiment of the present disclosure. The service servers illustrated in Fig. 5 are examples, and the present disclosure is not limited thereto. Each service server will be described below.
A user information management server 301 manages information regarding a user who uses the virtual space. For example, the user information management server 301 stores account information regarding the user, a profile of the user, user characteristics (age, gender, country, language used, history, liking/preference, viewing history of the content, participation history in virtual events, interaction history in the virtual space, and the like), item information possessed by the user, appearance information regarding the user avatar, status information regarding the user avatar, and the like in a data store (storage unit). Furthermore, the user information management server 301 can also perform authentication of the user using the account information regarding the user. Although details will be described later with reference to Fig. 7, the user information management server 301 performs user authentication in response to a request from the client terminal 10, and in a case where the authentication is successful, calls user information such as the profile of the user and appearance information regarding the user avatar from the data store to transmit the user information to the client terminal 10.
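The authentication flow described above (verify account information, and on success return the stored user information such as the profile and avatar appearance) can be sketched as follows. The `UserInfoStore` name and the plain-text credential comparison are simplifying assumptions for illustration; an actual service would use hashed credentials and session tokens.

```python
# Illustrative sketch of the user information management server's
# authentication flow: check the account information, and on success
# return the user information held in the data store.

class UserInfoStore:
    def __init__(self):
        self.accounts = {}  # user_id -> (secret, user_info)

    def register(self, user_id, secret, profile, avatar_appearance):
        """Store account information and user information in the data store."""
        self.accounts[user_id] = (
            secret,
            {"profile": profile, "avatar_appearance": avatar_appearance},
        )

    def authenticate(self, user_id, secret):
        """Return the stored user information on success, or None on failure."""
        entry = self.accounts.get(user_id)
        if entry is None or entry[0] != secret:
            return None
        return entry[1]
```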
A hosting server 302 performs server multiplexing according to the number of terminals connected to the virtual space system 20, that is, server increase/decrease management. For example, in a case where the number of terminals connected to one virtual space management server 200 exceeds a predetermined number (for example, 100), the hosting server 302 constructs a new virtual space management server 200 that provides the same virtual space as the one virtual space management server 200 (the same virtual object and the same map information).
In a case where there is a plurality of virtual space management servers 200 that provide the same virtual space, a matching server 303 determines to which virtual space management server 200 the client terminal 10 that has made a connection request after completion of user authentication is connected. As described above, for example, in a case where the number of connections is limited for each virtual space management server 200, the matching server 303 acquires information regarding the current number of connections of each virtual space management server 200 from the virtual space system 20, and instructs the client terminal 10 to communicably connect to a virtual space management server 200 having free capacity. Furthermore, the matching server 303 may determine the virtual space management server 200 to be connected according to the user information in addition to the number of connections. For example, the matching server 303 may instruct users having similar user characteristics to communicably connect to the same virtual space management server 200.
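The server selection performed by the matching server 303 can be sketched as follows. The tag-overlap similarity measure is an assumption introduced to illustrate matching by user characteristics; the actual criteria are not limited thereto.

```python
# Illustrative sketch of matching: among servers providing the same
# virtual space, choose one with free capacity, optionally preferring
# the server whose connected users share the most characteristics
# (tags) with the connecting user.

def match_server(servers, capacity, user_tags=None):
    """servers: list of dicts {"connections": int, "tags": set of user
    characteristics seen on that server}. Returns the index of the
    chosen virtual space management server, or None if all are full."""
    candidates = [
        i for i, s in enumerate(servers) if s["connections"] < capacity
    ]
    if not candidates:
        return None
    if user_tags:
        # Prefer the server whose users share the most characteristics.
        return max(candidates, key=lambda i: len(servers[i]["tags"] & user_tags))
    return candidates[0]
```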
A text chat server 304 is a server that provides a mechanism in which users using a virtual space have a conversation using text (character information) as a communication means. Specifically, the text chat server 304 creates a virtual chat room (text channel) and performs control so that character information can be exchanged between users who are in the chat room. The chat room is generated in response to a request from the virtual space management server 200 or the client terminal 10. Furthermore, the text chat server 304 returns session information (connection information) for connecting to the generated chat room to the virtual space management server 200 or the client terminal 10. The client terminal 10 is communicably connected to the text chat server 304 on the basis of the session information and executes text chat.
A voice chat server 305 is a server that provides a mechanism in which users using a virtual space have a conversation using a voice (audio information) as a communication means. Specifically, the voice chat server 305 creates a virtual chat room (voice channel) and performs control so that audio information can be exchanged between users who are in the chat room. The chat room is generated in response to a request from the virtual space management server 200 or the client terminal 10. In addition, the voice chat server 305 returns session information (connection information) for connecting to the generated chat room to the virtual space management server 200 or the client terminal 10. The client terminal 10 is communicably connected to the voice chat server 305 on the basis of the session information and executes voice chat.
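The room creation and session information flow common to the text chat server 304 and the voice chat server 305 can be sketched as follows. The session identifier format and in-memory room storage are illustrative assumptions; an actual voice channel would carry audio streams rather than discrete payloads.

```python
# Illustrative sketch of the chat room lifecycle: a room (channel) is
# created on request, session (connection) information is returned, and
# terminals connected with that session information exchange messages.

import itertools

class ChatServer:
    def __init__(self, kind):
        self.kind = kind  # "text" or "voice"
        self._ids = itertools.count(1)
        self.rooms = {}  # session id -> list of (sender, payload)

    def create_room(self):
        """Create a chat room and return its session (connection) information."""
        session = f"{self.kind}-room-{next(self._ids)}"
        self.rooms[session] = []
        return session

    def post(self, session, sender, payload):
        """Exchange character or audio information within the room."""
        self.rooms[session].append((sender, payload))

    def history(self, session):
        return list(self.rooms[session])
```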
The communication means between the users using the virtual space is not limited to the above-described character information and audio information, but for example, it is also assumed that a moving image (video/streaming video) is used. Although not illustrated in Fig. 5, for example, a server that provides a mechanism for performing conversation using a moving image (video/streaming video) may be included in the backend system 30.
A content distribution server 306 controls distribution of the content viewed in the virtual space. The content distribution server 306 can acquire and distribute the content from various content servers (servers that store content). In an embodiment of the present disclosure, a virtual space is reproduced (virtual objects are disposed on the basis of map information) by an application (also referred to as a client application) operating in each client terminal 10, and a virtual space video from a user viewpoint is generated. The content distribution server 306 is communicably connected to the client terminal 10 to transmit the content incorporated in the virtual space reproduced by the client terminal 10 to the client terminal 10. The content incorporated in the virtual space is, for example, a moving image displayed on a large display (virtual object) disposed in the virtual space. Furthermore, the content incorporated in the virtual space is, for example, a 3D video displayed in a predetermined field in the virtual space.
A talking table server 307 manages a talking room (communication room) associated with a talking table (virtual object) disposed at any place in the virtual space. In the talking room, each user can have a conversation using one or more different types of communication means (for example, text chat and voice chat). The generation of the talking room is performed in response to a request from the virtual space management server 200 or the client terminal 10. Furthermore, the talking table server 307 returns session information (connection information) for connecting to the generated talking room to the virtual space management server 200 or the client terminal 10. Details of the talking room will be described below with reference to Figs. 12 to 19. Note that the talking table is an example of a virtual object disposed at a communication point set at any place in the virtual space as a mark of a place where a plurality of users gathers for an interaction in the virtual space. Any virtual object serving as a mark may be disposed at the communication point, and may be, for example, a chair, a pole, a parasol, a stage, or the like in addition to the table.
An event coordination server 308 controls performance in the virtual space according to the development of an event taking place in the real space. The performance in the virtual space may include a virtual event. For example, the event coordination server 308 can coordinate the performance in the virtual space with the event in the real space by executing the corresponding performance in real time in the virtual space according to the development of a game performed in the real space. More specifically, the event coordination server 308 instructs the client terminal 10 to execute the performance.
As a result, the user can enjoy an experience of enjoying excitement together in the virtual space with other users viewing the same game while viewing the game (an example of an event in the real space) performed in the real space, for example. The user may view the game in the real space on, for example, a television, the Internet, a radio, or the like, or may view the game in the virtual space. Details of event coordination are described below with reference to Figs. 20 to 26.
A trading management server 309 realizes the trade of items, products, tickets, and the like in the virtual space, and stores data regarding the trade. For trade in the virtual space, for example, virtual currency can be used. The trading management server 309 may be a server that provides a charging system.
A data analysis server 310 analyzes various pieces of data such as the behavior of each user avatar in the virtual space, the user characteristics of the user avatar performing a specific action, and the like. The analysis result may be presented to a business operator or the like that provides the virtual space.
<2-3. Configuration example of client terminal 10>
Fig. 6 is a block diagram showing an example of a configuration of the client terminal 10 according to an embodiment of the present disclosure. As illustrated in Fig. 6, the client terminal 10 includes a communication unit 110, a control unit 120, an operation input unit 130, a sensor 140, a display unit 150, an audio output unit 160, and a storage unit 170.
(Communication unit 110)
The communication unit 110 is communicably connected to the virtual space management server 200 in a wired or wireless manner to transmit and receive data. The communication unit 110 can perform communication using, for example, a wired/wireless LAN, Wi-Fi (registered trademark), Bluetooth (registered trademark), infrared communication, a mobile communication network (fourth-generation mobile communication system (4G), fifth-generation mobile communication system (5G)), or the like.
(Control unit 120)
The control unit 120 functions as an arithmetic processing device and a control device, and controls an overall operation in the client terminal 10 in accordance with various programs. The control unit 120 is realized by, for example, an electronic circuit such as a CPU or a microprocessor. Furthermore, the control unit 120 may include a ROM that stores programs to be used, operation parameters, and the like, and a RAM that temporarily stores parameters and the like that change as appropriate.
The control unit 120 also functions as a virtual space processing unit 121 and a display control unit 122.
The virtual space processing unit 121 performs various processes for providing the user with the virtual space experience in appropriate cooperation with the virtual space management server 200 and the backend system 30. Such processing may be performed by a client application downloaded in advance by the client terminal 10. The virtual space processing unit 121 performs, for example, user registration processing and login processing for using the virtual space. In the user registration processing and the login processing, for example, data is appropriately transmitted and received to and from the user information management server 301 via the communication unit 110. The virtual space processing unit 121 generates the virtual space on the basis of the virtual space information received from the virtual space management server 200. Specifically, the virtual space processing unit 121 arranges the virtual objects on the basis of the map information received from the virtual space management server 200 and reproduces the virtual space.
Furthermore, the virtual space processing unit 121 may incorporate various pieces of information acquired from the backend system 30 into the virtual space. For example, the virtual space processing unit 121 may display the video distributed from the content distribution server 306 on a large display disposed in the virtual space.
Furthermore, the virtual space processing unit 121 controls a position, a facial expression, a gesture (emotional expression by the hand or the entire body), and a motion state (sitting, standing, jumping, running, etc.) of the user avatar disposed in the virtual space according to the user operation. The virtual space processing unit 121 continuously transmits the user operation information to the virtual space management server 200 in real time. The virtual space processing unit 121 reflects the information regarding another user avatar continuously transmitted from the virtual space management server 200 on that user avatar disposed in the virtual space in real time. The virtual space processing unit 121 may realize an audio conversation between the users by transmitting the audio of the user acquired by the sensor 140 to the virtual space management server 200 and outputting the audio of another user received from the virtual space management server 200 from the audio output unit 160.
Furthermore, the virtual space processing unit 121 generates a video (display screen) from the user viewpoint in the virtual space. Furthermore, the virtual space processing unit 121 may generate a display screen in which a user avatar operation button, a setting screen, a notification screen, a menu screen, and the like are superimposed on the video from the user viewpoint. Furthermore, the virtual space processing unit 121 may display various pieces of information acquired from the backend system 30 on the display screen. For example, the virtual space processing unit 121 may display a screen of a text chat between users performed via the text chat server 304 on the basis of the information received from the text chat server 304. The text chat screen may be updated in real time.
Furthermore, the virtual space processing unit 121 may realize an audio conversation between the users via the voice chat server 305 by transmitting the audio of the user acquired by the sensor 140 to the voice chat server 305 and outputting the audio of another user received from the voice chat server 305 from the audio output unit 160.
Note that, in an embodiment of the present disclosure, a case where the virtual space is reproduced in the virtual space processing unit 121 will be described, but the present disclosure is not limited thereto. For example, the video from the user viewpoint in the virtual space may be generated and transmitted (streamed) by the virtual space management server 200, and the virtual space processing unit 121 may display the video received from the virtual space management server 200 on the display unit 150, so that the user may be provided with the virtual space experience.
The display control unit 122 performs control to display an image on the display unit 150. For example, the display control unit 122 performs control to display the display screen generated by the virtual space processing unit 121 on the display unit 150.
(Operation input unit 130)
The operation input unit 130 receives an operation instruction by the user to output the content of the operation to the control unit 120. The operation input unit 130 may be, for example, a touch sensor, a pressure sensor, or a proximity sensor. In addition, the operation input unit 130 may have a physical configuration such as a button, a switch, and a lever. The user can operate the user avatar in the virtual space using the operation input unit 130.
(Sensor 140)
The sensor 140 has a function of detecting (acquiring) various types of information regarding the user or the surroundings of the user. The sensor 140 illustrated in Fig. 6 may include a plurality of sensors. For example, the sensor 140 may be a microphone that collects sound. Furthermore, the sensor 140 may be a camera that images the user or the periphery of the user. Furthermore, the sensor 140 may be a positional information measurement unit that measures the position (absolute position or relative position) of the user. Furthermore, the sensor 140 may be any of various sensors (a camera, an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an infrared sensor, a depth sensor, and a biometric sensor) that detect a facial expression, an emotion, a line-of-sight direction, a posture, a limb movement, a head direction, biometric information, and the like of the user. Furthermore, the sensor 140 may include a sensor capable of detecting a total of nine axes including a three-axis gyro sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor.
Note that the information detected by the sensor 140 may be used as information regarding a user operation in the virtual space processing unit 121. For example, the virtual space processing unit 121 may control the user avatar in the virtual space according to the information detected by the sensor 140.
(Display unit 150)
The display unit 150 has a function of displaying an image under the control of the display control unit 122. For example, the display unit 150 may be a display panel such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display.
Furthermore, the display unit 150 may be realized by an HMD that covers the entire field of view of the user. In this case, the display unit 150 displays a left-eye image and a right-eye image on left and right screens fixed in front of the left and right eyes of the user, respectively. The screen of the display unit 150 includes, for example, a display panel such as a liquid crystal display (LCD) or an organic EL display, or a laser scanning display such as a retina direct drawing display. Furthermore, the display unit 150 may include an imaging optical system that enlarges and projects the display screen to form an enlarged virtual image having a predetermined view angle on the user's pupil.
(Audio output unit 160)
The audio output unit 160 outputs audio information under the control of the control unit 120. For example, in a case where the client terminal 10 is an HMD, the audio output unit 160 may be configured as a headphone worn on the head of the user, or may be realized by an earphone or a bone conduction speaker.
(Storage unit 170)
The storage unit 170 is realized by a ROM that stores programs, operation parameters, and the like used for processing of the control unit 120, and a RAM that temporarily stores parameters and the like that change as appropriate. The storage unit 170 stores, for example, a client application that executes various processes related to the virtual space. Furthermore, the storage unit 170 may store user information such as a user profile and user avatar information, and virtual space information received from the virtual space management server 200.
Although the configuration of the client terminal 10 has been specifically described above, the configuration of the client terminal 10 according to the present disclosure is not limited to the example illustrated in Fig. 6. For example, the client terminal 10 may be realized by a plurality of devices. Each configuration included in the client terminal 10 is not limited to being integrally provided in one housing, and they may be communicably connected by wire or wirelessly.
Furthermore, the client terminal 10 may be realized by a system configuration including an output device (corresponding to at least the display unit 150 and the audio output unit 160) realized by an HMD or the like and a processing device (corresponding to at least the control unit 120) realized by a smartphone, a tablet terminal, a PC, or the like.
Furthermore, the client terminal 10 may be a non-wearable device such as a smartphone, a tablet terminal, or a PC.
Furthermore, the client terminal 10 may use, as the information regarding the user operation, information received from an external device such as a controller held by the user, a sensor worn by the user, or a sensor disposed around (on the environment of) the user.
Furthermore, the client terminal 10 may be communicably connected to an external display device such as a projector, a TV device, or a display, and display the video of the virtual space by the display control unit 122.
<<3. Connection to virtual space>>
Next, connection to the virtual space by the client terminal 10 will be described with reference to Fig. 7. Here, it is assumed that user registration is completed in advance.
Fig. 7 is a sequence diagram illustrating an example of a flow of connection processing to the virtual space according to an embodiment of the present disclosure. As illustrated in Fig. 7, first, the client terminal 10 activates a client application (step S103). Operation processing of the client terminal 10 described below is executed by a client application. Note that the function of the client application includes the function of the virtual space processing unit 121 described above.
Next, the client terminal 10 requests the user information management server 301 for user authentication (step S106). Specifically, the client terminal 10 transmits the user ID, the authentication information, and the like to the user information management server 301.
Next, the user information management server 301 performs user authentication in response to a request from the client terminal 10 (step S109). In a case where the user authentication is successful, the operation proceeds to the next process. In a case where the user authentication fails, error processing is performed.
Next, the user information management server 301 transmits the user information stored in the data store to the client terminal 10 (step S112). The user information includes a profile of the user, information regarding the user avatar (appearance information, status information), information regarding an item owned by the user, and the like.
Next, the client terminal 10 sets the received user information in the client application (step S115). As a result, the same user environment can be obtained regardless of which client terminal 10 the user uses. For example, the latest user environment is reflected in the terminal in use regardless of whether the user accesses from a smartphone or a home PC.
Next, the client terminal 10 inquires of the matching server 303 about the connection destination server (step S118).
Next, the matching server 303 performs a matching process between the client terminal 10 and the connection destination server (virtual space management server 200) (step S121). Specifically, the matching server 303 determines the virtual space management server 200 having a free space as the connection destination server according to the current number of terminals connected to each virtual space management server 200. In addition, the matching server 303 may determine the virtual space management server 200 to which the client terminal 10 corresponding to the user ID has been previously connected as the connection destination server according to the user ID.
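The server selection in step S121 can be sketched, for illustration only, as follows in Python. The `ServerInfo` structure, its capacity field, and the order of preference (the previously connected server first, then any server with free space) are assumptions of this sketch, not limitations of the disclosed system.

```python
from dataclasses import dataclass


@dataclass
class ServerInfo:
    address: str     # e.g. an IP address returned to the client terminal
    capacity: int    # maximum number of connectable terminals
    connected: int = 0  # current number of connected terminals


def match_connection_server(servers, user_id, last_server_by_user):
    """Pick a connection destination server for a client terminal.

    Prefer the server the user was previously connected to (if it still
    has free space); otherwise pick any server with free space.
    """
    last = last_server_by_user.get(user_id)
    if last is not None and last.connected < last.capacity:
        return last
    for server in servers:
        if server.connected < server.capacity:
            return server
    return None  # no server has free space


servers = [ServerInfo("10.0.0.1", capacity=2, connected=2),
           ServerInfo("10.0.0.2", capacity=2, connected=1)]
# A user with no previous connection is routed to a server with free space.
chosen = match_connection_server(servers, "user-1", {})
```

The address of the chosen server would then be returned to the client terminal in step S124.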
Next, the matching server 303 transmits data (for example, an internet protocol (IP) address) indicating the connection destination server to the client terminal 10 (step S124).
Subsequently, the client terminal 10 makes a connection request to the matched virtual space management server 200 (step S127), establishes communication with the virtual space management server 200, and starts bidirectional communication (step S130).
By establishing communication with the virtual space management server 200, the client terminal 10 can receive the virtual space information (map information, virtual objects, information regarding other user avatars, and the like) from the virtual space management server 200 and reproduce the virtual space locally. The reproduction of the virtual space can be performed by disposing the virtual objects on the basis of the map information. In addition, the client terminal 10 transmits user information, user operation information, and the like to the virtual space management server 200. The establishment of communication with the virtual space management server 200 may be regarded as a login to the virtual space. The client terminal 10 can store data of the connection destination server and log in to the same virtual space again, for example, even after temporarily logging off from the virtual space.
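The re-login behavior described above could be realized, as one non-limiting sketch, by caching the last connection destination on the client side; the class and method names here are hypothetical.

```python
class ClientConnectionCache:
    """Remember the last connection destination so that a user who
    temporarily logs off can log in to the same virtual space again."""

    def __init__(self):
        self._last_destination = None

    def on_connected(self, address):
        # Store the address of the virtual space management server
        # once bidirectional communication has been established.
        self._last_destination = address

    def destination_for_login(self, fresh_address=None):
        # Reuse the stored destination when available; otherwise fall
        # back to an address freshly obtained from the matching server.
        return self._last_destination or fresh_address
```

Under this assumption, a fresh inquiry to the matching server is only needed when no previous destination is cached.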
<<4. Display screen example>>
Next, a display screen displayed on the client terminal 10 that has logged in to the virtual space will be described. At the time of login to the virtual space, the user avatar is disposed at a predetermined position in the virtual space (for example, the start area 410 illustrated in Fig. 10). The virtual space processing unit 121 of the client terminal 10 generates a video including the user avatar in the view angle as the video from the user viewpoint, and displays the video on the display unit 150.
Furthermore, the virtual space processing unit 121 may include, in the display screen, a setting screen of the user avatar, various operation buttons for operating the user avatar, a display button for a text chat being performed in the vicinity, a follow/follower display button, a notification display button to the user, a map display button, an owned item display button, and the like.
Fig. 8 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure. As illustrated in Fig. 8, a video 511 of the virtual space and an operation screen 512 are displayed on the display screen 510. An image of the user avatar, buttons for editing the user avatar and the user profile, an area-specific menu display button 513, and an area common menu button 514 are displayed on the operation screen 512.
The area-specific menu display button 513 is a button for displaying a menu specific to the area (virtual space) where the user avatar currently exists. In an embodiment of the present disclosure, an area specialized for content is assumed, and an experience menu related to the corresponding content is assumed as the area-specific menu. Note that the content here may be intellectual property (IP) content. In an embodiment of the present disclosure, an area of a specific soccer team is assumed as an example of such an area. In this area, interaction between fans of the specific soccer team takes place. In various places of the area, a logo of the soccer team, a goods store, a virtual stadium in which a stadium existing in the real space is reproduced, and the like are disposed. The user can also move to an area specialized for other content. At this time, the client terminal 10 may newly acquire data of the connection destination server (which provides an area (virtual space) specialized for the other IP content) from the matching server 303. Further, one virtual space management server 200 may provide a plurality of different IP content areas. In this case, the client terminal 10 can move between areas without switching the connection destination server.
The area common menu button 514 is a menu button that can be used in any IP content area. For example, the avatar matching button is a button for matching with another user avatar in the area. Another user avatar with similar user characteristics may be presented as a recommended interaction partner. In addition, setting buttons are displayed for an emote, which is an action performed by the avatar to further enrich its emotional expression, and for a stamp, which is a small image temporarily displayed above the head of the avatar to express the avatar's feeling.
Fig. 9 is a diagram illustrating an example of a display screen according to an embodiment of the present disclosure. As illustrated in Fig. 9, a display screen 520 displays a video of the virtual space and various operation buttons. In the example illustrated in Fig. 9, a user avatar U1 is in front of the stage where a large display 529 is disposed. For example, a real space relay video is displayed on the large display 529. The content distribution server 306 acquires the relay video from an external server (relay video distribution server), and distributes the relay video to each client terminal 10 in real time.
As an example of the operation buttons, a controller 521 for controlling the movement of the user avatar in the virtual space (area) is displayed at the left end of the display screen 520, and a jump icon 522 is displayed at the right end of the display screen 520.
When the user slides the cursor displayed at the center of the controller 521 up, down, left, and right, the virtual space processing unit 121 causes the user avatar to walk forward, backward, left, and right in accordance with the sliding direction. Note that, in a case where the slide amount or the slide speed is equal to or more than the setting value, the virtual space processing unit 121 causes the user avatar to run forward, backward, left, and right. When the jump icon 522 is tapped, the user avatar jumps. Such user operation information regarding the motion such as the movement or the jump of the user avatar is transmitted to the client terminal 10 of each of the other users via the virtual space management server 200 in real time, and is reflected in the user avatar disposed in the virtual space reproduced in the client terminal 10 of each of the other users. As a result, the motion of the user avatar is shared with the other users.
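The walk/run switching described above can be sketched, for illustration, as follows; the threshold values and the normalized direction output are assumptions of this sketch rather than disclosed setting values.

```python
def avatar_motion(slide_dx, slide_dy, slide_speed,
                  run_amount_threshold=0.5, run_speed_threshold=1.0):
    """Map a slide on the on-screen controller to an avatar motion.

    The avatar walks in the slide direction; if the slide amount or the
    slide speed is equal to or more than a setting value, it runs instead.
    Returns a (gait, direction) pair, where direction is a unit vector.
    """
    amount = (slide_dx ** 2 + slide_dy ** 2) ** 0.5
    if amount == 0:
        return ("idle", (0.0, 0.0))
    direction = (slide_dx / amount, slide_dy / amount)
    gait = "run" if (amount >= run_amount_threshold
                     or slide_speed >= run_speed_threshold) else "walk"
    return (gait, direction)
```

The resulting gait and direction would then be applied to the user avatar and shared with other terminals as user operation information.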
Furthermore, the user can move the user's line-of-sight in the virtual space by dragging an arbitrary place (a place other than the operation buttons) on the display screen 520 up, down, left, and right with one finger or the like. Furthermore, the user can reduce or enlarge the video of the virtual space by pinching in or pinching out at an arbitrary place (a place other than the operation buttons) on the display screen 520 with two fingers.
A microphone icon 523 is a button for switching input ON and OFF of the audio of the user according to a tap operation. When the microphone icon 523 is in the input ON state, the audio of the user is transmitted to another user via the virtual space management server 200 or the voice chat server 305. The voice chat may be performed with another user avatar located around the user avatar, may be performed with a specific another user avatar permitted by the user, or may be performed between participating users of the voice chat group.
A text input icon 524 is a button for accepting input of a text chat. When the text input icon 524 is tapped, a text input UI (user interface) is displayed, and the user inputs the text into the text input UI. The input text is displayed as a balloon image T1 above the head of the user avatar U1 as illustrated in Fig. 9, for example. A balloon image T2 is also displayed above the head of another user avatar U2 near the user avatar U1. As illustrated in Fig. 9, a 3D representation may be used to display the balloon images of other user avatars U2 and U3 located at both sides of the user avatar U1. Furthermore, the virtual space processing unit 121 may blank the balloon images of other user avatars U4 and U5 located at positions distant from the user avatar U1 so as not to be visually recognized by the user. Note that the text chat is not limited to a method via a balloon image displayed above the head of the user avatar. The virtual space processing unit 121 may display the surrounding chat window in response to a tap operation of a surrounding chat window display switching button 527. The surrounding chat window is a screen that displays a text chat between user avatars around the user avatar (for example, in a section where the user avatar is located). As a result, the user can perform text chat with another user in the vicinity. Note that the text chat can be performed via the virtual space management server 200 or the text chat server 304.
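The distance-based blanking of balloon images described above can be sketched as follows; the threshold distance and the 2D position representation are illustrative assumptions.

```python
def balloon_style(viewer_pos, speaker_pos, blank_distance=10.0):
    """Decide how a speaker's balloon image is shown to a viewer.

    Balloons of nearby avatars are rendered with their text; balloons of
    avatars beyond a threshold distance are blanked so that the text is
    not visually recognized by the user.
    """
    dx = speaker_pos[0] - viewer_pos[0]
    dz = speaker_pos[1] - viewer_pos[1]
    distance = (dx * dx + dz * dz) ** 0.5
    return "visible" if distance <= blank_distance else "blank"
```

In this sketch the virtual space processing unit would evaluate this per speaking avatar each time balloon images are rendered.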
A stamp icon 525 is a button for displaying a stamp menu screen. The user can select an arbitrary stamp from the stamp menu screen displayed by tapping the stamp icon 525 and display the stamp above the head of the user avatar U1. An emote icon 526 is a button for displaying an emote menu screen. The user can select an arbitrary emote from the emote menu screen displayed by tapping the emote icon 526, and can operate a facial expression, a pose, a gesture, and the like of the user avatar U1.
Although the various operation buttons have been described above with reference to Fig. 9, the arrangement and functions of the various operation buttons are merely examples, and the present disclosure is not limited thereto.
<<5. Specific example of virtual space>>
<5-1. IP content area>
Examples of the virtual space according to an embodiment of the present disclosure include an area specialized for the IP content. For example, the area may be an area with an interaction between fans of a specific soccer team as a concept. In this case, for example, a logo of the soccer team, a goods store, a virtual stadium reproducing a stadium existing in the real space, and the like are disposed at various places in the area. Fig. 10 is a diagram illustrating an example of a simple map of an IP content specific area according to an embodiment of the present disclosure. As shown in Fig. 10, an area 400 includes a start area 410 in which a user avatar is first placed when entering the area, a goods store 420, a stadium 430, and a bar 440. The area 400 according to an embodiment of the present disclosure may be a mirror world in which respective virtual facilities (the goods store 420, the stadium 430, and the like) to be disposed are similar to those of the real space. The user can enjoy an atmosphere around the actual stadium by moving in the area 400 with the user avatar.
Furthermore, in the stadium 430, a game of a soccer team is played back as an example of the content distribution in the virtual space. The game video may be displayed on a large display disposed in the stadium 430 (streaming distribution of 2D moving images), or a soccer player and a ball may be reproduced by a 3DCG, and each soccer player and the ball may be moved on the basis of motion data in an actual game to reproduce the game.
Fig. 11 is a diagram for describing a mechanism of game reproduction using bone data of a player according to an embodiment of the present disclosure. As illustrated in Fig. 11, first, the specification information regarding the distribution content is transmitted from a virtual space management server 200a to each client terminal 10. The distribution content specification information is information indicating the game content reproduced in the stadium 430.
Each client terminal 10 requests the content distribution server 306 of the backend system 30 to distribute the game content on the basis of the specification information received from the virtual space management server 200a.
Next, the content distribution server 306 requests a game data server 50, which is an external device, to distribute the game content. The game data server 50 is a device that stores tracking data of each player and the ball collected by a tracking system 51 in a game performed in the real space. The tracking data of each player is, for example, bone data. The game data server 50 may store the 3DCG of each player. The game data server 50 generates game content data after adjusting the format of the tracking data acquired from the tracking system 51 and appropriately deleting unnecessary data. The game data server 50 transmits the game content data to the content distribution server 306 in response to a request from the content distribution server 306.
Next, the content distribution server 306 transmits the game content data to each client terminal 10. The game content data includes a 3DCG (also referred to as a player object) of each player, bone data of each player in the game, a 3DCG of the ball, tracking data of the ball in the game, and the like. The client terminal 10 moves each player object in the stadium 430 on the basis of the bone data included in the game content data, and can thus reproduce the game in 3D more realistically. The user can operate the user avatar to enter the stadium 430 and enjoy the game in the stadium 430. Furthermore, the user can move the user avatar into the field (here, the soccer pitch) and watch the movement of each player up close from various angles. The content distribution server 306 can also synchronize the timing of the game content data reproduced in each client terminal 10 (the playback time of the game reproduced in the stadium 430). A UI when reproducing a game in the virtual space using the tracking data of the game performed in the real space will be described later with reference to Figs. 27 to 32.
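Driving the player objects from timestamped tracking data can be sketched as follows. The linear interpolation between captured frames is an assumption of this sketch (the disclosure only states that objects are moved on the basis of the bone/tracking data and that playback time is synchronized); the data layout is likewise hypothetical.

```python
def sample_bone_frame(frames, playback_time):
    """Sample joint positions at a given synchronized playback time.

    `frames` is a list of (timestamp, joint_positions) pairs sorted by
    timestamp, where joint_positions is a list of (x, y, z) tuples.
    Positions between two captured frames are linearly interpolated so
    that every client terminal can render the same game moment.
    """
    if playback_time <= frames[0][0]:
        return frames[0][1]
    for (t0, p0), (t1, p1) in zip(frames, frames[1:]):
        if t0 <= playback_time <= t1:
            a = (playback_time - t0) / (t1 - t0)
            return [(x0 + a * (x1 - x0),
                     y0 + a * (y1 - y0),
                     z0 + a * (z1 - z0))
                    for (x0, y0, z0), (x1, y1, z1) in zip(p0, p1)]
    return frames[-1][1]  # past the last frame: hold the final pose
```

Each client would call this with the shared playback time distributed by the content distribution server, keeping all terminals in step.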
<5-2. Talking table>
At an arbitrary place in the virtual space, a communication point associated with a communication room in which communication between users is performed is set. The communication point may be set by the virtual space management server 200 or by the user.
In the communication room, communication by one or more different communication means can be performed. The one or more different communication means are, for example, communication means using information such as an audio, a character, or an image. The communication room may be provided by the backend system 30 (for example, the talking table server 307). The user can enter the communication room (participate in communication) by moving the user avatar to the communication point where the talking table as the virtual object is displayed. When entering the communication room, the display screen transitions to a screen of the communication room. Furthermore, communication performed in the communication room can be performed by corresponding communication servers (for example, the text chat server 304 and the voice chat server 305).
A virtual object serving as a mark may be disposed at the communication point. In an embodiment of the present disclosure, a table is used as an example of a virtual object serving as a mark. Such a table is also referred to as a talking table in the present specification.
The talking table is disposed at an arbitrary place (communication point setting location) in the virtual space by an input operation to the client terminal 10 operated by the user or by the virtual space management server 200. As an example, it may be disposed in the bar 440 described above with reference to Fig. 10. Fig. 12 is a diagram illustrating a display screen inside a bar at which a talking table 600 according to an embodiment of the present disclosure is disposed. A display screen 530 illustrated in Fig. 12 is an example of a video from the user's line-of-sight when the user avatar moves to the inside of the bar 440. As described above with reference to Fig. 10, the bar 440 is provided as a place where users (here, fans of a soccer team) interact with each other. As illustrated in the display screen 530 of Fig. 12, a plurality of talking tables 600 is provided inside the bar 440. A talking room is associated with each of the talking tables 600. The talking room is provided by the talking table server 307.
The shape of the talking table 600 is an example. When the user brings the user avatar close to the talking table 600, information regarding selection of entry to the talking room associated with the talking table 600 is transmitted from the virtual space management server 200 to the client terminal 10. The user can cause the user avatar to enter the talking room (communication room) (in other words, participate in the talking table) by performing the operation of the room entry request on the client terminal 10.
The talking table 600 is displayed at a communication point associated with a communication room in which communication between users is performed. The communication point is set in a specific region on the virtual space, and in a case where the avatar is included in the communication point, information regarding the room entry selection is transmitted.
Fig. 13 is a diagram for explaining an example of the shape of the talking table 600. As illustrated in Fig. 13, for example, the talking table 600 includes a table 601 and a display 602. The display 602 is an example of a display object that displays “information regarding an event taking place in a real space or a virtual space” that is a subject of conversation in a talking room (in a communication room) associated with the talking table 600. The subject of conversation in the talking room may be preset in the virtual space management server 200, or may be a current topic in the talking room or a video currently viewed in the talking room (for example, a topic window 542 illustrated in Fig. 20). The display content of the display 602 is information regarding the subject of conversation in the talking room, and may appropriately change according to a change in the subject of conversation in the talking room. The display 602 and the table 601 are disposed in the virtual space as virtual objects.
For example, in a case where “a game held most recently in the real space” is set as a subject of conversation in the talking room associated with the talking table 600, information regarding the game such as a video of the game, an interview video of a player who participated in the game, or a game result of the game is displayed on the display 602. The 2D moving image displayed on the display 602 may be a moving image streamed from the content distribution server 306. The specification of the 2D moving image displayed on the display 602 can be performed by the virtual space management server 200. The virtual space management server 200 may specify the 2D moving image to be displayed on the display 602 according to the IP content associated with the area. The same 2D moving image may be played back at the plurality of talking tables 600 disposed inside the bar 440.
The talking table 600 may further include a display 603 that displays information regarding the talking table 600. For example, the display 603 displays the identification number of the talking table 600 and the number of people currently entering the talking room. By visually recognizing the display 603, the user can check how many people can enter the talking room and how many people are currently in the talking room.
(Method of entry)
By approaching the talking table 600, the user can select whether to participate (enter) as a speaker or to participate (enter) as an audience. The “participation as a speaker” is participation in a state in which speech (for example, voice chat or text chat) can be made in the talking room. The “participation as an audience” is participation in a state in which speech in the talking room is not allowed but viewing in the talking room is allowed.
When the user avatar approaches the talking table 600, the virtual space management server 200 transmits, to the client terminal 10, information regarding room entry selection for the user to select whether or not to enter the talking room associated with the talking table 600. The information regarding the room entry selection is information indicating that the user can enter (participate in) the room in either form of a speaker or an audience. When the user operates the room entry (participation) request in the client terminal 10, the user can select whether to participate in either form of a speaker or an audience.
On the basis of the information regarding room entry selection received from the virtual space management server 200, for example, in a case where the user avatar moves within a range of a certain distance from the position of the talking table 600, the virtual space processing unit 121 displays a selection screen for allowing the user to select whether to enter the talking room as a speaker or as an audience. Note that the participation/non-participation as the audience may be set at the time of creating the talking room. When entry as a speaker or an audience is selected, an entry request is transmitted from the client terminal 10 to the virtual space management server 200, and each piece of session information (connection information) corresponding to the talking table 600 is returned from the virtual space management server 200. The client terminal 10 can establish communication connection with the talking table server 307, the text chat server 304, the voice chat server 305, and the like on the basis of the received session information.
Note that each piece of session information (connection information) corresponding to the talking table 600 may be included in advance in the information regarding the room entry selection. As a result, the client terminal 10 issues a room entry request to the virtual space management server 200 in response to an operation of room entry selection by the user, and can establish communication connection with the talking table server 307, the text chat server 304, the voice chat server 305, and the like on the basis of each piece of session information.
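Selecting which sessions a client connects to could be sketched as follows. The dictionary layout of the session information and the restriction that an audience member receives no speaking (text/voice chat) sessions are assumptions of this sketch, introduced only to make the speaker/audience distinction concrete.

```python
def enter_talking_room(role, room_entry_info):
    """Build the connection targets for entering a talking room.

    `room_entry_info` is assumed to already carry session (connection)
    information for each communication server, so that the client can
    connect after the entry request is accepted.
    """
    if role not in ("speaker", "audience"):
        raise ValueError("role must be 'speaker' or 'audience'")
    sessions = room_entry_info["sessions"]
    # Everyone connects to the talking table server to view the room.
    connections = {"talking_table": sessions["talking_table"]}
    if role == "speaker":
        # Only speakers obtain sessions that allow making speech.
        connections["text_chat"] = sessions["text_chat"]
        connections["voice_chat"] = sessions["voice_chat"]
    return connections
```

An audience member could later be promoted to speaker by requesting the additional sessions, though such promotion is not described above.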
Furthermore, in a case where there is a plurality of talking tables 600, the user may not know which of the talking tables 600 the user can participate in to perform desirable communication. Therefore, for example, the following method can be considered.
As an example, there is a method of varying the level of disclosure of the state in the talking room according to the distance between the talking table 600 and the user avatar. Fig. 14 is a diagram for describing a case where the state in the talking room is disclosed according to the distance to the talking table 600 according to an embodiment of the present disclosure.
As illustrated in Fig. 14, in a case where the virtual space management server 200 determines that the user avatar is located within an area E1 (second area) within a first distance (within the first distance centered on the communication point) from the position of the talking table 600, the virtual space management server 200 transmits information regarding communication of the talking table 600 to the client terminal 10. As an example of the information regarding communication of the talking table 600, there is audio information in the communication room associated with the talking table 600. As a result, the user can grasp the amount of conversation, atmosphere, and the like in the talking room without entering the talking room. In the example illustrated in Fig. 14, the users of the user avatars U4 and U5 are in a state of hearing a conversation in the talking room. Furthermore, the virtual space management server 200 may instruct the client terminal 10 to increase the volume so that the conversation in the talking room can be heard as the user avatar approaches the center (the talking table 600).
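The two-zone disclosure and the distance-dependent volume described above can be sketched as follows; the concrete distances and the linear volume ramp are illustrative assumptions.

```python
def room_disclosure(avatar_pos, table_pos,
                    first_distance=10.0, second_distance=3.0):
    """Decide what the server sends according to the avatar's distance.

    Within the second (inner) distance, information regarding room entry
    selection is sent; within the first (outer) distance, audio from the
    room is sent at a volume that rises as the avatar approaches the
    talking table. Returns (kind, volume).
    """
    dx = avatar_pos[0] - table_pos[0]
    dz = avatar_pos[1] - table_pos[1]
    distance = (dx * dx + dz * dz) ** 0.5
    if distance <= second_distance:          # area E2
        return ("entry_selection", 1.0)
    if distance <= first_distance:           # area E1
        # Volume rises linearly from 0 at the outer edge of area E1
        # to 1 at the boundary with area E2.
        volume = (first_distance - distance) / (first_distance - second_distance)
        return ("room_audio", volume)
    return ("none", 0.0)
```

Under these assumptions, the users of the user avatars U4 and U5 in Fig. 14 would receive "room_audio" at a volume reflecting their respective distances from the talking table 600.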
On the other hand, as illustrated in Fig. 14, in a case where the virtual space management server 200 determines that the user avatar is located in an area E2 (first area) within a second distance shorter than the first distance from the position of the talking table 600, the virtual space management server 200 transmits, to the client terminal, information regarding selection of entry to the talking room associated with the talking table 600. The user can perform an operation of the room entry request on the client terminal 10. At this time, the user can select whether to participate in (enter) the talking room in either form of a speaker or an audience (this can also be said to be a user input for information regarding room entry selection).
When the user enters the talking room, the virtual space management server 200 may transmit, to the client terminal 10, a conversation (for example, voice chat information and text chat information) in the talking room, a video (for example, a state of the user avatar illustrated in Fig. 20) in the talking room, a video shared (viewed) in the talking room, or an agenda in the talking room. The agenda may be extracted from an analysis of a voice chat or a text chat in a talking room. Such extraction of the agenda may be performed by the backend system 30. Furthermore, a category of the topic window 542 (see Fig. 20) picked up in a talking room to be described later may be extracted as the agenda. Furthermore, the conversation in the talking room, the video in the talking room (for example, the state of the user avatar illustrated in Fig. 20), the video shared (viewed) in the talking room, or the agenda in the talking room may be transmitted to the client terminal 10 corresponding to the user avatar located in the area E1.
In the example illustrated in Fig. 14, it is assumed that, among the avatars located in the area E2, the user avatar U1 and the user avatar U2 enter the talking room as speakers, and the user avatar U3 enters the talking room as an audience. At this time, the user avatar U4 and the user avatar U5 located in the area E1 can recognize the topic of the conversation being held in the talking room by visually recognizing the display content of the display 602. Furthermore, information regarding communication in the talking room (for example, audio information in the talking room) may be transmitted from the virtual space management server 200 to the client terminal 10 of each of users who operates the user avatar U4 and the user avatar U5. Then, in the client terminal 10 of the user who operates the user avatar that has newly entered the area E2 or the user avatar closer to the talking table 600, a selection screen as to whether or not to enter the talking room is presented.
As described above, in the present system, the virtual space management server 200 arranges, in the virtual space, the virtual object (the talking table 600) including the display object (the display 602) that displays “information regarding the event taking place in the real space or the virtual space” that is the subject of conversation in the talking room (in the communication room). As a result, the user who operates the user avatar existing around the virtual object can recognize the topic of the conversation being held in the talking room by visually recognizing the display content of the display object.
Furthermore, Fig. 15 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar. In a case where the position in the virtual space of the user avatar operated by the user is included in the area E2 (first area), which is a communication point associated with a communication room in which communication between the users is performed, the virtual space management server 200 transmits information regarding selection of entry to the communication room to the terminal (client terminal 10) used by the user, and a selection screen SC is displayed on the display screen.
Furthermore, Fig. 16 is a view illustrating a display screen of the terminal (client terminal 10) used by the user who operates the user avatar. In a case where the position of the user avatar is included in the area E1 (second area) located outside the area E2 (first area), the virtual space management server 200 transmits information regarding communication to the terminal (client terminal 10) used by the user who operates the user avatar. For example, audio information AI in the communication room associated with the talking table 600 may be output by the client terminal 10.
As a result, while operating the avatar in the virtual space, the user can determine the relationship with respect to the talking room by the avatar operation, such as grasping the topic in the talking room by visually recognizing the display object (display 602), checking the audio in the talking room according to the distance between the user avatar operated by the user and the virtual object (the talking table 600), or participating in the talking room by bringing the user avatar closer to the virtual object by a certain amount or more.
As another example, when the user avatar approaches the talking table 600 within a certain distance, the virtual space processing unit 121 may change the disclosure level of the state in the talking room according to the liking/preference information regarding the user. For example, in a case where the conversation (agenda) in the talking room matches the user's liking/preference, the virtual space processing unit 121 may play back the conversation in the talking room at a higher volume or display a more detailed agenda on a tab display 605. On the other hand, in a case where the conversation (agenda) in the talking room does not match the user's liking/preference, the virtual space processing unit 121 may play back the conversation in the talking room at a lower volume or may not play back the conversation, or may not display the agenda on the tab display 605.
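The disclosure-level rule described above can be illustrated with a minimal sketch; all function names, parameter names, and concrete values below are hypothetical and do not appear in the disclosed embodiment:

```python
def disclosure_level(distance, threshold, preference_match):
    """Decide how much of the talking room state to disclose to a user,
    based on the distance between the user avatar and the talking table
    and whether the room's agenda matches the user's liking/preference."""
    if distance > threshold:
        # Avatar is too far from the talking table: disclose nothing.
        return {"audio": "off", "agenda": "hidden"}
    if preference_match:
        # Agenda matches the user's liking/preference: full disclosure.
        return {"audio": "loud", "agenda": "detailed"}
    # Near the table but no preference match: muted preview only.
    return {"audio": "quiet", "agenda": "hidden"}
```

For example, an avatar within range of a table whose agenda matches the user's preference would receive the conversation at a higher volume and the detailed agenda on the tab display.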
Furthermore, as another example, the virtual space processing unit 121 may highlight the talking table 600 of the agenda matching the user's liking/preference. Furthermore, the virtual space processing unit 121 may highlight the talking table 600 of the agenda that the user has mentioned in the conversation so far.
Furthermore, as another example, the virtual space processing unit 121 may highlight the talking table 600 in which another user estimated to have liking/preference matching with the user participates.
Furthermore, as another example, the virtual space processing unit 121 may present an effect below the table according to the excitement in the talking room. Fig. 17 is a diagram illustrating an example of effect display of the talking table 600 according to the excitement in the talking room according to an embodiment of the present disclosure. As illustrated in Fig. 17, the virtual space processing unit 121 may present a light effect below the table 601 included in the talking table 600. The virtual space processing unit 121 may control the color, blinking, density, and the like of the effect according to the excitement in the talking room. As a result, the user can grasp the excitement in the talking room.
Furthermore, the virtual space processing unit 121 may change the shape of the talking table 600 according to the number of participants in the talking table 600 (the number of people entering the talking room). For example, the table may be changed to a larger table as the number of participants increases.
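The two appearance rules above (the effect under the table scaling with excitement, and the table growing with the number of participants) can be sketched as follows; the thresholds and parameter names are illustrative assumptions only:

```python
def table_appearance(excitement, participants):
    """Map room excitement (0.0-1.0) and the number of people in the
    talking room to illustrative display parameters for the talking table."""
    excitement = max(0.0, min(1.0, excitement))  # clamp to the valid range
    return {
        "effect_density": round(excitement, 2),        # denser effect when excited
        "blink_hz": round(1.0 + 4.0 * excitement, 1),  # faster blinking when excited
        "table_size": "small" if participants <= 4     # larger table as more
                      else "medium" if participants <= 8
                      else "large",                    # participants enter
    }
```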
In addition, in the present system, there is no particular restriction on entering the talking room, and anyone can freely enter the talking room. However, a function that allows a person who has entered the talking room to remove another participant from the room at their discretion may be provided.
Furthermore, in the present system, the host of the talking room may not be particularly set, and the user who enters the room first may be treated as the host. In a case where the talking room is created by a request from the user, the creator may be the host. The host may restrict entry to the talking room, or the host may permit entry to the talking room. The setting of the host can be performed by the talking table server 307 that manages the talking room. In addition, only the host may be given the authority to remove another participant. Furthermore, the authority of the host may be arbitrarily transferable to another participant.
Furthermore, the virtual space management server 200 may display a specific scene of a game taking place in the real space or the virtual space on the display 602 as information regarding the event in the real space or the virtual space. The information regarding the event taking place in the real space is not limited to the specific scene of the game, and may be, for example, a specific scene of another event, such as a specific scene of a live music show performed in the real space. In the case of an event taking place in the virtual space, it may be a specific scene of an event such as a live music show or an e-sports tournament performed in the virtual space.
The specific scene of the event may be, for example, a highlighted video of a game in real space. The virtual space management server 200 may perform control to display the highlighted video of the game taking place in the real space on the display 602 (display object) as the information regarding the event in the real space. As the highlighted video of the game, for example, a goal scene, a foul scene, or the like is assumed. Furthermore, the specific scene of the event may be a highlighted video of the event taking place in the virtual space.
Furthermore, the virtual space management server 200 may use only the information regarding the team that the specific user group supports in the event as the “information regarding the event in the real space” to be displayed on the display 602 (display object). For example, it may be a highlighted video of a game by a specific team. As a result, the user can grasp which team is the subject of communication in the talking room associated with the talking table 600 including the display 602 before entering the room. In the present system, for example, only information regarding a specific team is displayed on the display 602, so that fans of the specific team can gather in the talking room.
Furthermore, the virtual space management server 200 may transmit the information regarding room entry selection for the talking room only to users belonging to a specific user group in the event among the users who operate the user avatars approaching the talking table 600. As a result, in the present system, it is possible to limit the user group that can enter the talking room.
(Operation processing)
Fig. 18 is a sequence diagram illustrating an example of operation processing from generation of the talking table 600 to entry according to an embodiment of the present disclosure.
As illustrated in Fig. 18, first, the talking table server 307 generates a talking room in response to a talking room generation request (step S203) from the virtual space management server 200 (step S206). Then, the talking table server 307 transmits the session information for the generated talking room to the virtual space management server 200 (step S209). The virtual space management server 200 associates the session information with the communication point set in the virtual space. Note that, here, the generation request from the virtual space management server 200 is described as an example, but the present disclosure is not limited thereto, and the client terminal 10 may make the generation request. In this case, the talking table server 307 transmits the session information for the generated talking room to the client terminal 10. The client terminal 10 associates the session information with the communication point set in the virtual space. Furthermore, the client terminal 10 transmits the communication point and the session information set in the virtual space to the virtual space management server 200 and shares the communication point and the session information with other users.
Next, the virtual space management server 200 transmits a text chat room generation request to the text chat server 304 (step S212). The text chat server 304 generates a text chat room (step S215), and transmits session information for the text chat room to the virtual space management server 200 (step S218).
Next, the virtual space management server 200 transmits a voice chat room generation request to the voice chat server 305 (step S221). The voice chat server 305 generates a voice chat room (step S224), and transmits session information for the voice chat room to the virtual space management server 200 (step S227).
Next, the virtual space management server 200 stores each session information (step S230).
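Steps S203 to S230 above can be summarized in a short sketch of the server-side flow; the class and method names are hypothetical, and a real implementation would exchange these requests over the network rather than through direct calls:

```python
class VirtualSpaceManagementServerSketch:
    """Requests a talking room, a text chat room, and a voice chat room,
    then stores the returned session information per communication point."""

    def __init__(self, talking_table_server, text_chat_server, voice_chat_server):
        self.talking_table_server = talking_table_server
        self.text_chat_server = text_chat_server
        self.voice_chat_server = voice_chat_server
        self.sessions = {}  # communication point -> session information (S230)

    def create_talking_room(self, point_id):
        talking = self.talking_table_server.generate_room()  # S203/S206/S209
        text = self.text_chat_server.generate_room()         # S212/S215/S218
        voice = self.voice_chat_server.generate_room()       # S221/S224/S227
        self.sessions[point_id] = {
            "talking_table": talking,
            "text_chat": text,
            "voice_chat": voice,
        }
        return self.sessions[point_id]
```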
Next, when the user avatar moves close to the talking table 600 (within a certain distance from the talking table 600) and entry to the talking room (and a form of entry) is selected by a user operation (when a user input for information regarding entry selection is performed), the client terminal 10 transmits an entry request to the virtual space management server 200 (step S233). Note that, when the user avatar moves close to the talking table 600, information regarding selection of entry to the communication room may be transmitted from the virtual space management server 200 to the client terminal 10. The client terminal 10 may display the room entry selection screen on the basis of the information regarding the room entry selection.
Next, the virtual space management server 200 recognizes the intention of the user to enter the room by receiving the room entry request from the client terminal 10 (step S236), and transmits each session information (connection information) corresponding to the talking table 600 to the client terminal 10 (step S239). Furthermore, the virtual space management server 200 may cause the client terminal 10 to transition to a talking room screen (an example of a communication room screen) when the user enters the talking room.
Note that, here, as an example, it is described that an entry request is transmitted to the virtual space management server 200 in a case where the user selects entry, but the present disclosure is not limited thereto. As described above, in a case where the client terminal 10 performs control to disclose the state in the talking room according to the distance to the talking table 600 of the user avatar or the like, the client terminal may request the virtual space management server 200 to transmit each session information before obtaining the intention of the user to enter the room. Furthermore, since the client terminal 10 continuously transmits the position of the user (the position of the user avatar) to the virtual space management server 200, in a case where the user avatar is positioned within a certain distance from the talking table 600 (communication point), the virtual space management server 200 may recognize that there is an intention to enter the room to transmit each session information to the client terminal 10. Alternatively, the virtual space management server 200 may include each session information corresponding to the talking table 600 in the information regarding the selection of entry to be transmitted to the client terminal 10. The client terminal 10 may cause the user to select whether or not to enter the room after receiving each session information from the virtual space management server 200. As a result, the client terminal 10 can perform control to disclose the state in the talking room according to the distance between the talking table 600 and the user avatar before obtaining an intention to enter the talking room from the user.
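The proximity-based variant described above, in which the server treats the avatar coming within a certain distance of the communication point as an intention to enter, might look like the following sketch (the function name, parameters, and distance computation are assumptions for illustration):

```python
import math

def on_position_update(avatar_pos, table_pos, radius, session_info, send):
    """When the continuously reported avatar position comes within `radius`
    of the talking table (communication point), push the session
    information to the client so it can disclose the room state."""
    if math.dist(avatar_pos, table_pos) <= radius:
        send(session_info)  # client may then show the entry selection screen
        return True
    return False
```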
Then, the client terminal 10 displays the talking room screen, and starts communication connection with each server on the basis of each session information (step S242). Details of the communication connection with each server will be described next with reference to Fig. 19.
Fig. 19 is a diagram illustrating details of communication connection between the client terminal 10 and each server at the time of entering the talking room according to an embodiment of the present disclosure. As illustrated in Fig. 19, each client terminal 10 that has entered the talking room performs bidirectional communication with the talking table server 307 to acquire room entry user information. The room entry user information includes the user IDs of the users entering the room, the user names, information regarding the user avatars, and the like.
Each client terminal 10 performs bidirectional communication with the text chat server 304 to transmit and receive text chat. This allows a text chat to be performed in the talking room.
Each client terminal 10 performs bidirectional communication with the voice chat server 305 to transmit and receive a voice chat. As a result, a voice chat can be performed in the talking room.
Each client terminal 10 performs bidirectional communication with the virtual space management server 200a to transmit and receive emotes, stamps, and the like in real time. As a result, the emote and the stamp of each user avatar displayed in the talking room are controlled in real time.
Each client terminal 10 is communicably connected to the content distribution server 306, and receives streaming distribution of the content (moving image or the like). As a result, the content (moving image or the like) can be viewed in the talking room. Note that the content specification information can be received from the talking table server 307 or the virtual space management server 200a. The content may be set by the talking table server 307 or the virtual space management server 200a, or may be set by the users entering the room.
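The per-server connections of Fig. 19 described above can be sketched as one channel per role named in the received session information; the role names and the `connect` callback below are illustrative assumptions, not part of the disclosed protocol:

```python
def connect_on_room_entry(session_info, connect):
    """On entering the talking room, open one channel per server:
    talking table (room entry user info), text chat, voice chat,
    virtual space management (emotes/stamps), and content distribution."""
    channels = {}
    for role in ("talking_table", "text_chat", "voice_chat",
                 "virtual_space_management", "content_distribution"):
        channels[role] = connect(role, session_info[role])
    return channels
```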
(Example of talking room screen)
Fig. 20 is a diagram illustrating an example of a talking room screen according to an embodiment of the present disclosure. When the user participates in the talking table (that is, enters the talking room), the virtual space processing unit 121 of the client terminal 10 causes the screen to transition from the display screen as illustrated in Fig. 14 to a talking room screen 540 as illustrated in Fig. 20.
On the talking room screen 540, the user avatars entering the room are displayed side by side. In addition, various operation buttons (a stamp icon 546, an emote icon 547, a stamp icon 548, a text input button 549, a microphone ON-OFF button 550, a menu button 541) and the topic window 542 are displayed.
Above the head of each user avatar, a balloon image 544 displaying input text and a stamp image 545 selected by the user may be displayed. The virtual space processing unit 121 may display the balloon image 544 and the stamp image 545 for only several seconds so that no log remains visible. Furthermore, in each user avatar, a facial expression, a pose, a gesture, or the like (applause, cheers, greetings, a gaze pose, a gesture of depression, and the like) is controlled according to the emote selected by the user.
The topic window 542 is a virtual object that displays the content viewed in the talking room. There may be a plurality of topic windows 542. Each user may slide a plurality of topic windows 542a to 542c to select the content to pick up. Within the talking room, playback of the picked-up topic window 542 (for example, the centrally located window) begins. Note that the right to pick up (channel right) can be given to all users entering the room.
The category of the content displayed in the topic window 542 is assumed to be, for example, news related to IP content (here, a specific soccer team) (top 5 page views (PV) in the last 3 days, etc.), a video with a strong response among moving images related to the IP content (a video with a large number of views, a video that excited users, etc.), a highlighted video of the latest game, a battle history, and the like. Furthermore, in the topic window 542, it is also possible for a user entering the room to display private photographs and videos or share the user's local screen. Furthermore, the category of the content displayed in the topic window 542 may be determined by the virtual space management server 200 or the talking table server 307 according to the place where the talking table 600 is installed or the current time.
Fig. 21 is a diagram illustrating another example of the talking room screen according to an embodiment of the present disclosure. As in a talking room screen 560 of Fig. 21, only a picked-up topic window 561 may be displayed in a large size, and a text chat window 562 may be displayed. The display ON-OFF switching of the text chat window 562 can be performed by a tap operation on a display switching button 564. In addition, ON-OFF switching of the microphone in the talking room can be performed by a tap operation on a voice switching button 565.
Although the example of the talking room screen is described above, the talking room screen according to the present disclosure is not limited to the example illustrated in the drawings.
<5-3. Event coordination>
Next, the control of performance (including a virtual event) in the virtual space according to the development of an event taking place in the real space, where the performance is carried out in cooperation with the event coordination server 308, will be described. The distribution video of the event taking place in the real space may be disposed in the virtual space. Furthermore, the user may view the distribution video of the event taking place in the real space outside the virtual space, such as on a television or in a window different from the window in which the virtual space is displayed. Furthermore, the performance in the virtual space is determined according to the development of the event taking place in the real space and the reactions of the plurality of users.
Fig. 22 is a diagram illustrating a mechanism of event coordination according to an embodiment of the present disclosure. In the example illustrated in Fig. 22, a soccer game performed in real time in real space is assumed as the predetermined event.
First, the event coordination server 308 of the backend system 30 acquires real-time game data at appropriate times and analyzes the game development. In addition, the event coordination server 308 identifies, as the game development, whether the game is in a pre-game, in-game, or post-game state. For example, the event coordination server 308 may acquire the text of the game development as game data from a service that livestreams the game development as text, and automatically extract a specific word. Examples of services that livestream the game development as text include a service that distributes the commentator's comments (audio) on the game relay as text. Examples of the specific word to be automatically extracted include “free kick”, “foul”, “missed shot”, “shot stopped”, “corner”, “free kick by the enemy”, “goal by *** (enemy team)”, “start of half time”, “player change”, “start of second half”, “yellow card”, “offside”, “goal by *** (ally team)”, and “end of game”. The text of the game development may be distributed independently on the official site of each team.
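The automatic extraction of specific words from the live text feed can be illustrated with a small keyword table; the labels and the matching rule below are illustrative assumptions, not the disclosed algorithm:

```python
# Hypothetical keyword table: specific word -> game-development label.
GAME_EVENTS = {
    "free kick": "free_kick",
    "yellow card": "yellow_card",
    "goal": "goal",
    "end of game": "game_end",
}

def extract_game_development(text_feed):
    """Return the labels of all specific words found in one line of the
    live text feed (checked in the order of the keyword table)."""
    lowered = text_feed.lower()
    return [label for word, label in GAME_EVENTS.items() if word in lowered]
```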
The analysis of the game development is not limited to the above-described example, and for example, the event coordination server 308 may perform image recognition on the relay video of the game in real time and determine the play.
Next, the virtual space management server 200a specifies corresponding predetermined performance according to the analysis result of the game development acquired from the event coordination server 308 at appropriate times and the reactions of the plurality of users, and instructs each client terminal 10 to immediately carry out the performance in the virtual space. In an embodiment of the present disclosure, as an example, performance according to the development of a game performed in the real space can be carried out in real time in the stadium 430. As a result, in the present system, the users who gather in the stadium 430 existing in the virtual space can enjoy watching the game in the virtual space by experiencing performance in the stadium 430 with other users according to the game development and the reactions of the plurality of users. Note that the virtual space management server 200a may acquire the distribution video of the game taking place in the real space from the backend system 30 (for example, the content distribution server 306), transmit the distribution video to each client terminal 10, and instruct each terminal to arrange the distribution video in the virtual space (specifically, in the stadium 430). In addition, each client terminal 10 may acquire the distribution video of the game specified by the virtual space management server 200a from the content distribution server 306. The user may view the performance in the stadium 430 from the user viewpoint in the virtual space by the client application activated on the client terminal 10 such as the smartphone or the tablet terminal while viewing the live broadcast of the game on an external device such as a television device or a PC. The client terminal 10 may display the live broadcast of the game in one window displayed on the display unit 150, and may display the video of the virtual space (the state of performance in the stadium 430) in another window displayed on the display unit 150.
In addition, the virtual space management server 200a acquires the reaction of the user from each client terminal 10, estimates the game development on the basis of the content, and determines the corresponding performance. Examples of the user's reaction include user operation information (stamp, the number of emotes performed, type, and the like). In addition, the user's reaction may be information such as the number of voice chat transmissions, a content, a frequency, and an audio volume. In addition, the user's reaction may be information such as the number of text chat transmissions, a content, and a frequency. The virtual space management server 200a can also estimate the game development from the number of transmissions of voice chat or text chat, or the like. Note that the estimation of the game development based on the reaction of the user may be performed by the event coordination server 308.
Examples of the performance corresponding to the game development include a performance in which a DJ appears on a stage disposed in a field to excite the venue with music before the game, display of a video of the team practicing or traveling before the game on a large display disposed in the field, and display of a relay video of players appearing at the game venue. In addition, a starting member announcement performance may be carried out using the large display disposed in the field.
In addition, examples thereof can include a kickoff performance at the start of the game and, during the game, a foul performance for an own team player or a counterpart team player, a lost score performance (a performance in which the avatar of the team character appears to encourage the players), a score performance (a performance of joy by the avatar of the team character while music plays, fireworks are set off, and confetti and balloons appear), a lottery event performance at halftime (a present is dropped), a profile display performance for a star player (displayed on the large display disposed in the field), a yellow card performance for an own team player or a counterpart team player, and an offside performance for an own team player or a counterpart team player. In addition, examples thereof can include a game end performance at the end of the game, and a victory performance (a performance in which a DJ appears to excite the venue with music) or a defeat performance after the game.
In addition, when the reaction of the user exceeds a certain number, the virtual space management server 200a may execute the corresponding performance according to the development of the game and the reactions of the plurality of users. For example, when many users (more than a certain number or more than a certain percentage of users) press “GO” stamps, the virtual space management server 200a executes, as a support performance, a performance for causing a non-player character (NPC) disposed in the field to perform a cheering emote or a performance for playing back music for support. Furthermore, the virtual space management server 200a may treat the number of reactions of the users as excitement, and increase the size and the number of balloons released into the field according to the magnitude of the excitement. Furthermore, the virtual space management server 200a may increase the volume of the music for support played back in the field according to the excitement of the users. Furthermore, in a case where the excitement of the users continues, the virtual space management server 200a may extend the length of the support performance beyond the predetermined time. Furthermore, the virtual space management server 200a may determine the performance on the basis of the reaction having the largest amount among the reactions of the users.
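The threshold rule above, where a performance fires once enough users send the same support stamp and the largest reaction wins, can be sketched as follows; the stamp-to-performance mapping and the 50% threshold are assumptions for illustration:

```python
def select_support_performance(stamp_counts, total_users, threshold_ratio=0.5):
    """Pick the performance for the support stamp with the largest count,
    but only when its share of users reaches the threshold ratio."""
    if not stamp_counts or total_users == 0:
        return None
    # Reaction having the largest amount wins.
    stamp, count = max(stamp_counts.items(), key=lambda kv: kv[1])
    if count / total_users >= threshold_ratio:
        # Hypothetical mapping from stamps to performances.
        return {"GO": "cheering_emote", "GOAL": "score_performance"}.get(
            stamp, "generic_support")
    return None
```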
The virtual space management server 200a may acquire the reaction of the user who operates the user avatar existing in the virtual space managed by the virtual space management server 200a as the reaction of the user. The user's reaction is, for example, user operation information (stamp, the number of emotes performed, type, and the like). In addition, the user's reaction may be information such as the number of voice chat transmissions, a content, a frequency, and an audio volume. In addition, the user's reaction may be information such as the number of text chat transmissions, a content, and a frequency.
Furthermore, the virtual space management server 200a may further acquire, as the user's reaction, a reaction of the user who operates the user avatar existing in another virtual space managed by another virtual space management server that carries out performance according to the game development targeting a game same as a game targeted by the virtual space management server 200a. Such a user's reaction may also be user operation information, voice chat information, or text chat information. Note that, in a case where the virtual space management server 200a transmits the distribution video of the game to the client terminal 10a, the virtual space management server 200a may further acquire a reaction of the user who operates the user avatar existing in another virtual space managed by another virtual space management server transmitting the same distribution video.
Furthermore, the reaction of the user is not limited to the reaction of the user operating the user avatar, and may be a reaction of an audience watching a game on site in real space or a reaction of a viewer watching a game on a television or the like. Such a reaction may be, for example, a facial expression, a motion, a cheer, or the like. Furthermore, such a reaction is sensed by various sensors such as a camera, an acceleration sensor, and a microphone. The virtual space management server 200a can appropriately acquire information regarding a reaction of the audience or a reaction of the viewer in real time.
Furthermore, the virtual space management server 200a may determine the emote of the avatar existing in the virtual space as the performance to be determined according to the development of the game and the reactions of the plurality of users. For example, in a case where the virtual space management server 200a identifies that the game is in progress as the game development, and further identifies that many users are performing support to encourage the team, such as selecting the “GO” stamp, an emote in which the plurality of avatars existing in the virtual space wave a towel to support the team, or the like, may be reflected. At this time, the virtual space management server 200a may reflect the emote on the avatar in the non-operation state in the virtual space, may reflect the emote on the NPC, or may reflect the emote on all the avatars. Furthermore, the virtual space management server 200a may reflect the emote on all the avatars in the non-operation state and all the NPCs, or may reflect the emote on some of the avatars in the non-operation state and some of the NPCs.
(Specific example of event coordination)
The game development largely includes before the game, during the game (first half), half time, during the game (second half), and after the game. Hereinafter, specific examples of each performance in a large flow of game development will be sequentially described.
Before game
45 minutes before the game, the start of a live stadium (a performance event carried out in the stadium 430 in accordance with the real-time development of a game in real space; in an embodiment of the present disclosure, this is also referred to as a “live stadium”) is notified to each user avatar. The user avatar in the area moves to the stadium 430.
The user enters the stadium 430, joins another user avatar (another user registered as a friend), and waits for the start of the game while talking with a friend in a group chat, for example.
A DJ appears on a stage disposed in a field of the stadium 430, and a performance that excites the venue with music is carried out. In addition, a large display may be disposed above the stage. On the large display, a practice video of a team before a game or a video of an arrival state of a team bus is displayed. Fig. 23 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in Fig. 23, on a display screen 710, the starting member information, the video of the state of the team before the game, and the like are displayed on a large display 711 disposed in the field (pitch) in the stadium in the virtual space.
Furthermore, in the field, a starting member announcement performance and a performance of making avatars of the starting member players appear on the stage may be carried out. The users gathering in the field are further excited. Next, when players enter the game venue in real space, a relay of the players entering the game venue is displayed on the large display 711, and the excitement of the users gathering in the field reaches its peak.
During game
During the game, the large display 711 displays a performance corresponding to a game situation, or a reaction of the users gathered in the field, or a performance prompting the users to make a reaction. Fig. 24 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. On a display screen 720 illustrated in Fig. 24, display windows 721 and 722 of the large display 711 and a text chat window 723 are displayed on the front. Such display switching can be appropriately performed by a user operation. The user can use the text chat window 723 to text chat with other users in the area. Furthermore, the user can also perform voice chat with a member in a group generated with other users registered as friends. These text chats and voice chats may be done using the backend system 30.
In the display window 721, an avatar video of a team character that appears on a stage disposed in the field of the virtual space and excites the venue is displayed. In addition, in the display window 722, the same support stamp as the one selected by the majority of users in the field is displayed, or a support stamp prompting the user to make a selection is displayed. With such a performance, the users can have a sense of unity, and the venue can be excited. The virtual space processing unit 121 may play back music or audio as appropriate in accordance with a performance instruction from the virtual space management server 200. The support stamps include, for example, NICE, BOO, GOAL, and the like.
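The majority-based stamp display described above can be sketched as a simple tally over the stamps selected by users in the field. This is a minimal, hypothetical illustration, assuming stamp selections are collected as a list of stamp names; the function name and data shapes are assumptions for illustration, not part of the disclosed system.

```python
from collections import Counter

def majority_stamp(selections):
    """Return the support stamp selected by the most users, or None
    when no selections have been made yet (hypothetical sketch)."""
    if not selections:
        return None
    counts = Counter(selections)
    # most_common(1) yields [(stamp, count)] for the top entry
    return counts.most_common(1)[0][0]

# Example: three users chose NICE, one BOO, one GOAL
print(majority_stamp(["NICE", "GOAL", "NICE", "BOO", "NICE"]))  # NICE
```

In such a sketch, the selected stamp would then be shown in the display window 722 for all users in the field.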
Fig. 25 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. A display screen 730 illustrated in Fig. 25 illustrates an example of the performance for a lost score. In a case where the enemy team has scored a goal, for example, a performance in which the team character takes a dejected pose is carried out. Furthermore, when the user taps a support stamp button 450 of “OMG”, the “OMG” support stamp is displayed above the head of the user avatar U. In a case where many users have selected the “OMG” support stamp, a performance in which the large display 731 displays the “OMG!” support stamp is carried out.
Furthermore, the user can cause the user avatar U to perform the same cheering emote as the team character. Fig. 26 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in Fig. 26, when a support button 742 of synchronization (SYNC) illustrated on a display screen 740 is selected, the virtual space processing unit 121 causes the user avatar U to perform the same cheering emote as a team character 741. As a result, the user in the venue can perform the cheering emote together with the team character and the other users.
During half time, for example, a lottery event performance is carried out. Fig. 27 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in a display screen 750 of Fig. 27, for example, a large number of boxes fall from the sky, and when the user operates the user avatar U and picks up a box, a prize such as a signed ball or a uniform may be won by lottery.
When the ally team scores a goal during the game, a score performance is carried out. In the score performance, for example, music is played, fireworks are set off, confetti and balloons fall, and the team character dances. Fig. 28 is a diagram illustrating an example of a display screen in the live stadium according to an embodiment of the present disclosure. As illustrated in a display screen 760 of Fig. 28, on the field, a performance in which fireworks are set off and confetti and balloons fall is carried out. Furthermore, on a large display 761, information regarding the player who has scored the goal, information regarding a player of the opposing team, scores, a support stamp, a video of the team character, and the like are displayed. Furthermore, a performance in which a huge avatar 762 of the player who has scored the goal appears on the stage can also be carried out. In this way, a performance that excites the venue is carried out when the team scores a goal.
Note that NPCs may be disposed in the spectator seats around the field and on the field, and performance emotes may be performed by the virtual space processing unit 121 as appropriate. For example, the virtual space processing unit 121 causes the NPCs to perform a cheering emote when the ally team is on the offensive, shooting, or taking a free kick. In addition, when an ally player scores a goal, the virtual space processing unit 121 causes the NPCs to perform a joy emote. In addition, when an enemy player commits a foul, the virtual space processing unit 121 causes the NPCs to perform a booing emote. In addition, when the enemy team scores a goal, the virtual space processing unit 121 causes the NPCs to perform a despair emote.
Furthermore, in order to produce the excitement of the virtual space, the virtual space management server 200 may instruct each client terminal 10 to cause the user avatar in the non-operation state to automatically carry out a performance such as an emote or a stamp in accordance with the overall performance.
After game
When the game ends, an end-time performance is carried out. In a case where the ally team wins the game, the venue may be further energized by a winning performance (such as the reappearance of the DJ), or in a case where the team loses the game, a performance honoring the players may be carried out. In addition, an interview video of the manager and the players after the game may be displayed on the large display in the field. In addition, a team character may appear on the stage to perform a winning emote or to play the team's cheer song.
Thereafter, users may move to a place where fans can interact with each other, such as the bar 440 described with reference to Figs. 10 and 12, and talk about that day's game using the talking table 600. At the talking table 600, a video of the day's game may be shared.
Specific performances according to the game development are described above. Note that the performances according to the present disclosure are not limited to the above-described specific examples, and various performances can be considered. Furthermore, here, the performance is carried out according to the game development as an example, but the present disclosure is not limited thereto; for example, the performance may be carried out according to the development of a live music concert, a lecture, a recital, or the like performed in the real space.
<5-4 Game reproduction UI>
Next, a UI for reproducing a game in the virtual space using the tracking data of the game performed in the real space will be described. As described above with reference to Fig. 11, the client terminal 10 reproduces the game by disposing each player object and the ball in the stadium 430 (see Fig. 10) in the virtual space on the basis of the game content data received from the content distribution server 306, moving each player object according to the bone data of each player tracked during the game, and moving the virtual ball on the basis of the tracking data of the ball.
(Regarding camera position)
When displaying the reproduced game on the display unit 150, the client terminal 10 can display a video from any of a large number of camera positions. The camera position may be selected by the user or may be automatically selected by the client terminal 10.
Fig. 29 is a diagram illustrating an example of a camera position according to an embodiment of the present disclosure. Fig. 29 illustrates, for example, a camera position 810, a camera position 811, a camera position 812, and a camera position 813.
At the camera position 810: Basic, a virtual camera C is disposed on an extension line connecting the goal and the ball, and the camera follows the ball. At the camera position 811: Bird's eye, the virtual camera C is disposed at a position behind the ball in the sky where a wide view angle can be obtained. These camera positions and camera orientations may be operable by the user.
Each camera position described above is a position at which the player is viewed from above, but the present disclosure is not limited thereto, and the client terminal 10 may set the position (viewpoint) of a specific player as the camera position and provide the subjective video of the player. For example, as illustrated in the lower part of Fig. 29, at the camera position 812: GK, the viewpoint of the goalkeeper is set as the camera position, and a scene seen by the goalkeeper is provided. Furthermore, at the camera position 813: Shooter, the viewpoint of a player who shoots at the end of each scene (also referred to as a chapter) is set as the camera position, and a scene seen by the player is provided.
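The Basic camera placement described above, in which the virtual camera C lies on the extension line connecting the goal and the ball, can be sketched as a small geometric computation. This is an illustrative sketch assuming 2D field coordinates; the function name and the distance and height parameters are assumptions for illustration, not values from the disclosure.

```python
def basic_camera_position(goal, ball, distance=10.0, height=3.0):
    """Place the camera on the extension of the goal->ball line,
    a fixed distance past the ball, at a fixed height (hypothetical sketch).

    goal, ball: (x, y) field coordinates. Returns (x, y, z)."""
    gx, gy = goal
    bx, by = ball
    dx, dy = bx - gx, by - gy            # direction from goal toward ball
    norm = (dx * dx + dy * dy) ** 0.5
    if norm == 0:
        return (bx, by, height)          # degenerate case: ball on the goal line
    ux, uy = dx / norm, dy / norm        # unit vector along the line
    # Step past the ball along the extension line and raise the camera,
    # so that looking back toward the ball also frames the goal.
    return (bx + ux * distance, by + uy * distance, height)

# Goal at the origin, ball 20 units down-field: camera 10 units beyond the ball
print(basic_camera_position((0, 0), (0, 20)))  # (0.0, 30.0, 3.0)
```

The camera orientation would then be set to look from this position back toward the ball, keeping both the ball and the goal in the view angle.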
Fig. 30 is a diagram illustrating a game reproduction UI in the case of the camera position 810: Basic illustrated in Fig. 29. As illustrated in Fig. 30, a game reproduction UI 820 displays the video acquired by the virtual camera C disposed on the extension line connecting the goal and the ball. Note that a screen outline of the game reproduction UI 820 illustrated in Fig. 30 will be described later.
Fig. 31 is a diagram illustrating the game reproduction UI in the case of the camera position 811: Bird's eye illustrated in Fig. 29. As illustrated in Fig. 31, in the game reproduction UI 830, the video acquired by the virtual camera C disposed at a position where the view angle is wide behind the ball in the sky is displayed. Note that the screen outline of the game reproduction UI 830 illustrated in Fig. 31 is common to the game reproduction UI 820 illustrated in Fig. 30.
(Screen outline of game reproduction UI) A screen outline of the game reproduction UI 820 will be described with reference to Fig. 30. As illustrated in Fig. 30, the game reproduction UI 820 displays, for example, an exit icon 821, a notification icon 822, an angle changeable label 823, a camera position switching icon 824, an avatar mode switching icon 825, a highlight label 826, a game progress information display 827, and a display switching icon 828. Note that the arrangement and shape of each display are not limited to the example illustrated in Fig. 30.
The exit icon 821 is an operation button for ending the game reproduction mode by the display control of the player object based on the tracking data (specifically, bone data) or the like of each player according to an embodiment of the present disclosure. The game reproduction mode according to an embodiment of the present disclosure may be highlight reproduction in which reproduction is performed using tracking data of one or more specific scenes in tracking data of one game. Such a game reproduction mode by highlight reproduction performed in the stadium 430 in the virtual space is also referred to as a “highlight stadium” in the present specification.
The notification icon 822 indicates the presence or absence of a notification to the user. For example, a display indicating the number of notifications is displayed on the notification icon 822, and when the notification icon 822 is tapped, a notification list pops up.
The angle changeable label 823 is a display indicating whether or not the user can arbitrarily operate the camera angle. For example, in the case of the camera position 810: Basic and the camera position 811: Bird's eye illustrated in Fig. 29, the camera angle (that is, the line-of-sight direction) is set to be operable by the user. On the other hand, in the case of the camera position 812: GK and the camera position 813: Shooter illustrated in Fig. 29, the camera angle is set to be inoperable by the user. In the case of being operable, the user can change the camera angle in the up, down, left, and right directions by dragging or flicking any place on the screen up, down, left, and right with one finger or the like.
The camera position switching icon 824 is an operation button for switching among the large number of camera positions described with reference to Fig. 29. For example, the user can switch the video of the game reproduction UI 820 to the video of the camera position corresponding to the tapped icon by tapping any of the Basic, Bird's eye, GK, and Shooter icons included in the camera position switching icon 824.
The avatar mode switching icon 825 is an operation button for changing the viewing mode of the game reproduction to the avatar mode. The viewing mode according to an embodiment of the present disclosure includes, for example, a View mode and an avatar mode. In a case where the avatar mode switching icon 825 is not selected, the View mode is used as a default. In the View mode, the user avatar is not displayed, as illustrated in Figs. 30 and 31. In the avatar mode, a user avatar is displayed within the field, allowing the user to operate the user avatar and view the game from a free viewpoint. A display example of the avatar mode will be described later with reference to Fig. 34.
The highlight label 826 is a display indicating that the function currently used by the user in the stadium 430 (see Fig. 10) in the virtual space is the highlight stadium (game reproduction mode of highlight reproduction). Note that, in the stadium 430 in the virtual space according to an embodiment of the present disclosure, the function of the staging event “live stadium” performed in accordance with the development of the real time game in the real space described with reference to Figs. 20 to 26 can also be used. In this case, for example, a label indicating “LIVE VIEW” is displayed on the screen.
The game progress information display 827 is a display indicating the score of the game being reproduced, the opponent, the elapsed time, and the like.
The display switching icon 828 is an operation button for switching the playback indicator display. Each time the display switching icon 828 is tapped, the display of the playback indicator is switched between ON and OFF. A display example of the playback indicator will be described with reference to Fig. 32.
Fig. 32 is a diagram illustrating a display example of the playback indicator according to an embodiment of the present disclosure. In a game reproduction UI 830 illustrated in Fig. 32, a playback indicator 840 is displayed at the lower portion of the screen in response to tapping of the display switching icon 828. Note that the display position of the playback indicator 840 is not particularly limited. The playback indicator 840 includes a seek bar 841, a five-second rewind button 842, a previous chapter jump button 843, a playback and stop button 844, a slow playback button 845, a next chapter jump button 846, a five-second forward button 847, and a chapter list display button 848.
The seek bar 841 indicates the playback position of the reproduced game. In addition, the game reproduced in an embodiment of the present disclosure includes an aggregate of one or more specific scenes (hereinafter, each also referred to as a chapter). Therefore, the seek bar 841 may have a break for each chapter as illustrated in Fig. 32.
When the chapter list display button 848 is tapped, the chapter list is displayed. Fig. 33 is a diagram illustrating a display example of a chapter list according to an embodiment of the present disclosure. In a game reproduction UI 850 illustrated in Fig. 33, a chapter list 852 is displayed. The chapter list 852 includes a list of the one or more specific scenes (chapters) in the game reproduced in an embodiment of the present disclosure. By selecting one of the chapters, the user can jump the playback position of the game video displayed on the game reproduction UI 850 to the playback start time of the selected chapter.
(Avatar mode)
Next, the avatar mode performed when the avatar mode switching icon 825 is tapped will be described with reference to Fig. 34. Fig. 34 is a diagram illustrating a display example of the game reproduction UI in the case of the avatar mode according to an embodiment of the present disclosure.
As illustrated in Fig. 34, a user avatar U is displayed on the game reproduction UI 860 in the avatar mode. The user may operate the controller 521 or the jump icon 522 to move the user avatar U within the field. At this time, the camera position (that is, the user viewpoint) is a position that includes the user avatar U in the view angle and follows the movement of the user avatar U. Furthermore, in the avatar mode, since the user can arbitrarily operate the camera angle, the display mode of the angle changeable label 823 changes as illustrated in Fig. 34.
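The follow behavior described above can be sketched as a camera kept at a fixed offset from the user avatar so that the avatar always remains in the view angle. The offset values and names below are illustrative assumptions, not parameters from the disclosure.

```python
def follow_camera(avatar_pos, offset=(0.0, -6.0, 2.5)):
    """Return a camera position a fixed offset behind and above the
    user avatar, so the avatar stays framed as it moves (hypothetical sketch).

    avatar_pos: (x, y, z) position of the user avatar in the field."""
    ax, ay, az = avatar_pos
    ox, oy, oz = offset
    # The camera simply tracks the avatar; recomputing this each frame
    # makes the viewpoint follow the avatar's movement.
    return (ax + ox, ay + oy, az + oz)

print(follow_camera((1, 2, 0)))  # (1.0, -4.0, 2.5)
```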
Note that collision detection between the user avatar U and virtual objects in the field, such as the player objects and the ball, is not performed. In the case of the avatar mode, the client terminal 10 may not be able to perform scene control such as pause or fast forward.
(Highlight playback)
Subsequently, the highlight playback of playing back one or more specific scenes (chapter) of a game by the client terminal 10 will be described with reference to Fig. 35. Fig. 35 is a diagram for explaining a highlight playback of a game according to an embodiment of the present disclosure.
The client terminal 10 first acquires In and Out point information regarding each chapter in one game. For example, the In and Out point information is included in the game content data distributed from the content distribution server 306. The In and Out point information regarding each scene may be automatically set in advance from the game situation in the game data server 50.
Next, the client terminal 10 searches for the In point of the scene 1 among the tracking data of the players and the ball included in the game content data and starts playback (game reproduction by moving the player objects), and after performing playback up to the Out point of the scene 1, searches for the In point of the next scene (scene 2) and starts playback of the next scene. Then, the client terminal 10 repeats the above steps until all the scenes (scenes 1 to 4 in the example illustrated in Fig. 35) set in one game are played back.
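The highlight playback steps above can be sketched as a loop over In and Out points: for each chapter, seek to the In point in the tracking data, play frames up to the Out point, then move on to the next chapter. This is a hypothetical illustration assuming the tracking data is a timestamp-sorted list of frames; all names and interfaces are placeholders, not the actual client API.

```python
def play_highlights(tracking_frames, chapters, render):
    """Play back only the chapters of a game (hypothetical sketch).

    tracking_frames: list of (timestamp, frame_data), sorted by timestamp.
    chapters: list of (in_point, out_point) pairs, one per scene, in order.
    render: callback that moves the player objects for one frame."""
    played = []
    for in_point, out_point in chapters:
        # Seek to the In point and play frames until the Out point,
        # then continue with the next chapter.
        for timestamp, frame in tracking_frames:
            if in_point <= timestamp <= out_point:
                render(frame)
                played.append(timestamp)
    return played

# Example: ten frames, two chapters covering timestamps 1-2 and 5-6
frames = [(t, t) for t in range(10)]
shown = []
play_highlights(frames, [(1, 2), (5, 6)], shown.append)
print(shown)  # [1, 2, 5, 6]
```

A real implementation would seek by index rather than scanning all frames per chapter, but the control flow (In point, play, Out point, next scene) is the same as described above.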
<<6. Supplement>>
The embodiment of the present disclosure is described above in detail with reference to the accompanying drawings, but the present technology is not limited to such examples. It is apparent that a person having ordinary knowledge in the technical field of the present disclosure can devise various change examples or modification examples within the scope of the technical idea described in the claims, and it will be naturally understood that they also belong to the technical scope of the present disclosure.
It is also possible to create a computer program for causing hardware such as the CPU, the ROM, and the RAM built in the client terminal 10 or the virtual space management server 200 described above to exhibit the function of the client terminal 10 or the virtual space management server 200. Furthermore, a computer-readable storage medium that stores the computer program is also provided.
Furthermore, the effects described in the present specification are not restrictive. That is, the technology according to an aspect of the present disclosure can exhibit other effects apparent to those skilled in the art from the description of the present specification, in addition to the effect above or instead of the effect above.
Note that the present technology can be configured as follows.
(1) An information processing apparatus including:
circuitry configured to:
provide information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and
provide information regarding the communication based on the avatar entering a second area located outside the first area,
wherein the event is a subject of conversation in the communication room.
(2) The information processing apparatus according to (1),
wherein the display object includes information regarding the event taking place in the real space or the virtual space.
(3) The information processing apparatus according to (1) or (2), wherein the display object includes information regarding a team that a specific user group supports in the event.
(4) The information processing apparatus according to any one of (1) to (3),
wherein the display object includes a specific scene of the event.
(5) The information processing apparatus according to any one of (1) to (4),
wherein the specific scene of the event includes a highlighted video of the event.
(6) The information processing apparatus according to any one of (1) to (5),
wherein the circuitry is further configured to:
receive a selection of the information regarding selection of entry to the communication room from the user to enter the communication room; and
provide, based on the selection from the user to enter the communication room, connection information corresponding to the communication room to the user.
(7) The information processing apparatus according to any one of (1) to (6),
wherein the selection from the user regarding the selection of entry to the communication room includes to enter the communication room as a speaker and to enter the communication room as an audience.
(8) The information processing apparatus according to any one of (1) to (7),
wherein the information regarding the communication includes audio information in the communication room, and
wherein the circuitry is further configured to increase a volume of the audio information based on a decrease in distance between the avatar and a center of the virtual object.
(9) The information processing apparatus according to any one of (1) to (8),
wherein the circuitry is further configured to generate the communication room including content viewed in the communication room by the user.
(10) The information processing apparatus according to any one of (1) to (9),
wherein the content includes news related to a team participating in the event.
(11) The information processing apparatus according to any one of (1) to (10),
wherein the content includes a video having a number of views being greater than a predetermined number.
(12) The information processing apparatus according to any one of (1) to (11),
wherein the content includes a video including highlights of the event.
(13) The information processing apparatus according to any one of (1) to (12),
wherein the content includes a chat text window.
(14) The information processing apparatus according to any one of (1) to (13),
wherein the content includes a picked-up topic window.
(15) The information processing apparatus according to any one of (1) to (14),
wherein the circuitry is further configured to perform bidirectional communication with the user based on the user entering the communication room.
(16) The information processing apparatus according to any one of (1) to (15),
wherein the circuitry is further configured to
generate the communication room including an emote icon for displaying an emote menu screen; and
initiate display of an emote in the communication room based on a selection from the emote menu screen by the user.
(17) The information processing apparatus according to any one of (1) to (16),
wherein the circuitry is further configured to
generate the communication room including a stamp icon for displaying a stamp menu screen; and
initiate display of a stamp in the communication room based on a selection from the stamp menu screen by the user.
(18) An information processing method including:
providing information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and
providing information regarding the communication based on the avatar entering a second area located outside the first area,
wherein the event is a subject of conversation in the communication room.
(19) An information processing apparatus including:
circuitry configured to:
provide, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space,
wherein information regarding selection of entry to a communication room is provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed,
wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and
wherein the event is a subject of conversation in the communication room.
(20) An information processing method including:
providing, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space,
wherein information regarding selection of entry to a communication room is provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed,
wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and
wherein the event is a subject of conversation in the communication room.
(B-1)
An information processing system including a control unit that performs
control to dispose, in a virtual space, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, the event being a subject of conversation in a communication room,
in a case where a position of an avatar operated by a user in the virtual space is included in a first area that is a communication point associated with the communication room in which communication between users is performed, control to transmit information regarding selection of entry to the communication room to a terminal used by the user, and
in a case where a position of the avatar in the virtual space is included in a second area located outside the first area, control to transmit information regarding the communication to the terminal.
(B-2)
The information processing system according to Item (B-1), in which the display object includes a display object that displays a specific scene of an event taking place in a real space.
(B-3)
The information processing system according to Item (B-2), in which the display object includes a display object that displays a highlighted video of a game played in a real space.
(B-4)
The information processing system according to Item (B-1), in which the display object includes a display object that displays information regarding a team that a specific user group supports in the event.
(B-5)
The information processing system according to Item (B-4), in which
the display object includes a display object related to a team that a specific user group supports in the event, and
the information regarding selection of entry is transmitted to a terminal used by a user belonging to the specific user group.
(B-6)
The information processing system according to any one of Items (B-1) to (B-5), in which the control unit transmits connection information for acquiring information by one or more different communication means used in the communication room to a terminal used by the user in a case where a request for entry of a user into the communication room based on a user input to the information regarding selection of entry is received from the terminal.
(B-7)
The information processing system according to Item (B-6), in which the information by the communication means includes information regarding an audio, a character, or an image transmitted from a terminal of each of users who are in the communication room.
(B-8)
The information processing system according to any one of Items (B-1) to (B-7), in which the information regarding selection of entry to the communication room includes information for connecting to a server that generates the communication room.
(B-9)
The information processing system according to Item (B-8), in which the server performs control to transmit information regarding users who are in the communication room to a terminal of each user.
(B-10)
The information processing system according to any one of Items (B-1) to (B-9), in which the control unit performs control to transmit information regarding a distribution content viewed in the communication room to the terminal of the user.
(B-11)
The information processing system according to Item (B-10), in which the information regarding the distribution content includes information specifying the distribution content and information for connecting to a content distribution server to which the distribution content is distributed.
(B-12)
The information processing system according to any one of Items (B-1) to (B-11), in which the control unit receives information regarding display of a user avatar displayed on a communication room screen displayed on the terminal of the user from the terminal of the user to transmit the information in real time to a terminal of another user who is in the communication room.
(B-13)
The information processing system according to Item (B-12), in which the information regarding the display of the user avatar includes emote information for moving the user avatar or stamp information that is an image displayed near the user avatar and indicating emotional expression.
(B-14)
The information processing system according to any one of Items (B-1) to (B-13), in which, in a case where a user avatar corresponding to the user moves within a predetermined distance from the communication point, the control unit recognizes an intention to enter the communication room and performs control to transmit connection information for entering the communication room to the terminal of the user.
(B-15)
The information processing system according to any one of Items (B-1) to (B-14), in which the control unit performs control to instruct the terminal of the user to transition to a communication room screen when the user enters the communication room.
(B-16)
The information processing system according to Item (B-15), in which the communication room screen displays a user avatar of a user who is in the communication room.
(B-17)
The information processing system according to Item (B-16), in which a character or a stamp image input by a user corresponding to the user avatar is displayed near the user avatar.
(B-18)
The information processing system according to any one of Items (B-15) to (B-17), in which the communication room screen displays a content.
(B-19)
An information processing method including
by a processor,
disposing, in a virtual space, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, the event being a subject of conversation in a communication room,
in a case where a position of an avatar operated by a user in the virtual space is included in a first area that is a communication point associated with the communication room in which communication between users is performed, transmitting information regarding selection of entry to the communication room to a terminal used by the user, and
in a case where a position of the avatar in the virtual space is included in a second area located outside the first area, transmitting information regarding the communication to the terminal.
(B-20)
A program causing a computer to function as a control unit that performs
control to dispose, in a virtual space, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space, the event being a subject of conversation in a communication room,
in a case where a position of an avatar operated by a user in the virtual space is included in a first area that is a communication point associated with the communication room in which communication between users is performed, control to transmit information regarding selection of entry to the communication room to a terminal used by the user, and
in a case where a position of the avatar in the virtual space is included in a second area located outside the first area, control to transmit information regarding the communication to the terminal.
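The two-area behavior described in Items (B-19) and (B-20) can be sketched as a simple distance check. This is a minimal illustration, not the claimed implementation; the class and function names, the circular area shapes, and the returned string labels are all assumptions introduced for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class CommunicationPoint:
    center: tuple          # (x, y) position of the communication point in the virtual space
    entry_radius: float    # first area: entry to the communication room is offered here
    ambient_radius: float  # second area: communication information is provided here

def info_for_avatar(point: CommunicationPoint, avatar_pos: tuple) -> str:
    """Decide what to transmit to the user's terminal from the avatar position.

    Inside the first area the terminal receives the entry-selection prompt;
    in the second area (outside the first) it receives communication info.
    """
    d = math.dist(point.center, avatar_pos)
    if d <= point.entry_radius:
        return "entry_selection"
    elif d <= point.ambient_radius:
        return "communication_info"
    return "none"
```

Modeling both areas as concentric circles is only one possible geometry; the items themselves require only that the second area lie outside the first.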
(B-21)
An information processing system including a control unit that
disposes a distribution video of an event taking place in a real space in a virtual space in which a user can view the distribution video of the event, and
determines an event-coordinated performance experienced by a plurality of users in the virtual space according to development of the event taking place in the real space and reactions of the plurality of users.
(B-22)
The information processing system according to Item (B-21), in which the performance is determined according to an amount of reactions of the plurality of users.
(B-23)
The information processing system according to any one of Items (B-21) to (B-22), in which the performance is determined on the basis of a reaction with a largest reaction amount among reactions of the plurality of users.
(B-24)
The information processing system according to Item (B-21), in which the performance is an emote of an avatar, and the emote is reflected in one or more avatars existing in the virtual space.
(B-25)
The information processing system according to Item (B-24), in which the emote of the avatar is reflected in an avatar in a non-operated state existing in the virtual space.
(B-26)
The information processing system according to Item (B-24), in which the emote of the avatar is reflected in an NPC existing in the virtual space.
(B-27)
The information processing system according to any one of Items (B-21) to (B-26), in which the reactions of the plurality of users include at least one of reaction information regarding a user who operates an avatar existing in one virtual space, reaction information regarding a user who operates an avatar existing in a plurality of virtual spaces, or reaction information regarding a user who is experiencing an event in a real space.
20 Virtual space system
200 (200a, 200b, 200c, ...) Virtual space management server
210 Communication unit
220 Control unit
221 Virtual space management unit
222 Content management unit
223 User management unit
230 Storage unit
10 Client terminal
110 Communication unit
120 Control unit
121 Virtual space processing unit
122 Display control unit
130 Operation input unit
140 Sensor
150 Display unit
160 Audio output unit
170 Storage unit
30 Backend system
301 User information management server
302 Hosting server
303 Matching server
304 Text chat server
305 Voice chat server
306 Content distribution server
307 Talking table server
308 Event coordination server
309 Trading management server
310 Data analysis server

Claims (20)

  1. An information processing apparatus comprising:
    circuitry configured to:
    provide information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed, wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and
    provide information regarding the communication based on the avatar entering a second area located outside the first area,
    wherein the event is a subject of conversation in the communication room.
  2. The information processing apparatus according to claim 1,
    wherein the display object includes information regarding the event taking place in the real space or the virtual space.
  3. The information processing apparatus according to claim 2, wherein the display object includes information regarding a team that a specific user group supports in the event.
  4. The information processing apparatus according to claim 2,
    wherein the display object includes a specific scene of the event.
  5. The information processing apparatus according to claim 4,
    wherein the specific scene of the event includes a highlighted video of the event.
  6. The information processing apparatus according to claim 1,
    wherein the circuitry is further configured to:
    receive a selection of the information regarding selection of entry to the communication room from the user to enter the communication room; and
    provide, based on the selection from the user to enter the communication room, connection information corresponding to the communication room to the user.
  7. The information processing apparatus according to claim 1,
    wherein the selection from the user regarding the selection of entry to the communication room includes to enter the communication room as a speaker and to enter the communication room as an audience.
  8. The information processing apparatus according to claim 1,
    wherein the information regarding the communication includes audio information in the communication room, and
    wherein the circuitry is further configured to increase a volume of the audio information based on a decrease in distance between the avatar and a center of the virtual object.
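Claim 8's proximity behavior, where the audio volume rises as the distance between the avatar and the center of the virtual object decreases, can be sketched as a simple attenuation function. This is an illustrative linear falloff under assumed parameters, not the claimed method; the function name and the choice of linear scaling are introduced for the example.

```python
def ambient_volume(distance: float, max_distance: float, max_volume: float = 1.0) -> float:
    """Volume of room audio heard outside the room: increases as the avatar
    approaches the center of the virtual object, fading to zero at max_distance."""
    if max_distance <= 0 or distance >= max_distance:
        return 0.0
    return max_volume * (1.0 - distance / max_distance)
```

Any monotonically decreasing falloff (e.g. inverse-square) would satisfy the same claim language; linear is used here only for clarity.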
  9. The information processing apparatus according to claim 1,
    wherein the circuitry is further configured to generate the communication room including content viewed in the communication room by the user.
  10. The information processing apparatus according to claim 9,
    wherein the content includes news related to a team participating in the event.
  11. The information processing apparatus according to claim 9,
    wherein the content includes a video having a number of views being greater than a predetermined number.
  12. The information processing apparatus according to claim 9,
    wherein the content includes a video including highlights of the event.
  13. The information processing apparatus according to claim 9,
    wherein the content includes a chat text window.
  14. The information processing apparatus according to claim 9,
    wherein the content includes a picked-up topic window.
  15. The information processing apparatus according to claim 1,
    wherein the circuitry is further configured to perform bidirectional communication with the user based on the user entering the communication room.
  16. The information processing apparatus according to claim 1,
    wherein the circuitry is further configured to
    generate the communication room including an emote icon for displaying an emote menu screen; and
    initiate display of an emote in the communication room based on a selection from the emote menu screen by the user.
  17. The information processing apparatus according to claim 1,
    wherein the circuitry is further configured to
    generate the communication room including a stamp icon for displaying a stamp menu screen; and
    initiate display of a stamp in the communication room based on a selection from the stamp menu screen by the user.
  18. An information processing method comprising:
    providing information regarding selection of entry to a communication room to a user based on an avatar operated by the user entering a first area that is a communication point where communication between users is performed,
    wherein the communication room is related to the communication point and includes a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space; and
    providing information regarding the communication based on the avatar entering a second area located outside the first area,
    wherein the event is a subject of conversation in the communication room.
  19. An information processing apparatus comprising:
    circuitry configured to:
    provide, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space,
    wherein information regarding selection of entry to a communication room is provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed,
    wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and
    wherein the event is a subject of conversation in the communication room.
  20. An information processing method comprising:
    providing, in a communication room related to a communication point where communication between users is performed, a virtual object including a display object that displays information regarding an event taking place in a real space or a virtual space,
    wherein information regarding selection of entry to a communication room is provided to a user based on an avatar operated by the user entering a first area that is the communication point where communication between users is performed,
    wherein information regarding the communication is provided based on the avatar entering a second area located outside the first area, and
    wherein the event is a subject of conversation in the communication room.

PCT/JP2023/034046 2022-11-11 2023-09-20 Information processing system, information processing method, and program for communication points regarding events WO2024101001A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022181346A JP2024070696A (en) 2022-11-11 2022-11-11 Information processing system, information processing method, and program
JP2022-181346 2022-11-11

Publications (1)

Publication Number Publication Date
WO2024101001A1 true WO2024101001A1 (en) 2024-05-16

Family

ID=88297167

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/034046 WO2024101001A1 (en) 2022-11-11 2023-09-20 Information processing system, information processing method, and program for communication points regarding events

Country Status (2)

Country Link
JP (1) JP2024070696A (en)
WO (1) WO2024101001A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008107895A (en) * 2006-10-23 2008-05-08 Nomura Research Institute Ltd Virtual space providing server, virtual space providing system, and computer program
WO2009146130A2 (en) * 2008-04-05 2009-12-03 Social Communications Company Shared virtual area communication environment based apparatus and methods
WO2018067508A1 (en) * 2016-10-04 2018-04-12 Facebook, Inc. Controls and interfaces for user interactions in virtual spaces
WO2021207156A1 (en) * 2020-04-06 2021-10-14 Eingot Llc Integration of remote audio into a performance venue
US11159766B2 (en) * 2019-09-16 2021-10-26 Qualcomm Incorporated Placement of virtual content in environments with a plurality of physical participants
WO2022086954A1 (en) * 2020-10-19 2022-04-28 Sophya Inc. Methods and systems for simulating in-person interactions in virtual environments
WO2022170222A1 (en) * 2021-02-08 2022-08-11 Multinarity Ltd Content sharing in extended reality


Also Published As

Publication number Publication date
JP2024070696A (en) 2024-05-23

Similar Documents

Publication Publication Date Title
JP6700463B2 (en) Filtering and parental control methods for limiting visual effects on head mounted displays
US11436803B2 (en) Insertion of VR spectator in live video of a live event
CN111201069B (en) Spectator view of interactive game world presented in live event held in real world site
US11571620B2 (en) Using HMD camera touch button to render images of a user captured during game play
CN109069934B (en) Audience view tracking of virtual reality environment (VR) users in a VR
US10380798B2 (en) Projectile object rendering for a virtual reality spectator
US20190073830A1 (en) Program for providing virtual space by head mount display, method and information processing apparatus for executing the program
US7647560B2 (en) User interface for multi-sensory emoticons in a communication system
US20190105568A1 (en) Sound localization in an augmented reality view of a live event held in a real-world venue
KR20040104753A (en) On-line gaming spectator
CN112717423B (en) Live broadcast method, device, equipment and storage medium for game match
JP2019141162A (en) Computer system
JP2020044139A (en) Game program, game method, and information processor
JP6688378B1 (en) Content distribution system, distribution device, reception device, and program
WO2024101001A1 (en) Information processing system, information processing method, and program for communication points regarding events
JP6776425B1 (en) Programs, methods, and delivery terminals
WO2022137343A1 (en) Information processing method, computer-readable medium, and information processing device
WO2024114518A1 (en) Display control method, display control apparatus, and electronic device
WO2022113335A1 (en) Method, computer-readable medium, and information processing device
WO2022113330A1 (en) Method, computer-readable medium, and information processing device
WO2022137523A1 (en) Game method, computer-readable medium, and information processing device
JP7495558B1 (en) VIRTUAL SPACE CONTENT DELIVERY SYSTEM, VIRTUAL SPACE CONTENT DELIVERY PROGRAM, AND VIRTUAL SPACE CONTENT DELIVERY METHOD
US20220395755A1 (en) Method for broadcasting gameplay and method for joining game
WO2022137377A1 (en) Information processing method, computer-readable medium, computer system, and information processing device
EP4306192A1 (en) Information processing device, information processing terminal, information processing method, and program