WO2022113335A1 - Method, computer-readable medium, and information processing device - Google Patents


Info

Publication number
WO2022113335A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
game
information
virtual space
virtual
Prior art date
Application number
PCT/JP2020/044471
Other languages
English (en)
Japanese (ja)
Inventor
功淳 馬場
潤哉 福重
俊 菊地
Original Assignee
株式会社コロプラ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コロプラ filed Critical 株式会社コロプラ
Priority to JP2022564992A priority Critical patent/JPWO2022113335A1/ja
Priority to PCT/JP2020/044471 priority patent/WO2022113335A1/fr
Publication of WO2022113335A1 publication Critical patent/WO2022113335A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • A63F13/49Saving the game status; Pausing or ending the game
    • A63F13/497Partially or entirely replaying previous game actions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress

Definitions

  • This disclosure relates to methods, computer-readable media, and information processing devices.
  • Non-Patent Document 1 discloses a romance simulation game whose main purpose is to virtually deepen friendship with a girl character. The user selects the action most suitable for the character from the presented options, and the story progresses through the character's reactions to the repeated actions.
  • In the game of Non-Patent Document 1, character response patterns are prepared in advance. The character's response to the user's input operation is then selected from those patterns and output, and the game progresses. The variation in the character's movement therefore never extends beyond the content of the data prepared in advance. As a result, there is a problem that the user cannot feel, in the relationship with the character, the reality of the character existing in the real world, and eventually gets tired of the game. Generally, in a game developed with the intention of having the user play for a long time, how to keep the user from tiring of the game is important. Games are always required to provide compelling content that motivates users to keep playing. For example, if a character appearing in a game has a high sense of reality, the user can easily immerse himself or herself in the world of the game and find interest in the relationship with the character.
  • One aspect of the present disclosure is intended to enhance the sense of immersion in the game world and to make the game more engaging.
  • One aspect of the present invention is a method comprising a first step group and a second step group. The first step group is executed by a viewing terminal for a viewer to view a real-time distribution performed by a distributor, and includes: a step of arranging an avatar object corresponding to the distributor and a virtual camera in a virtual space shared between a distribution terminal for the distributor to perform the real-time distribution and a plurality of viewing terminals including the viewing terminal; a step of displaying a field-of-view image showing the viewing area of the virtual camera during the real-time distribution; a step of performing game processing in which a game part in one state is advanced based on the movement of the distributor's body and the operation of the user; a step of receiving motion information indicating the motion of the avatar object, generated by the distribution terminal by detecting the movement of the distributor's body; a step of updating the field-of-view image, in response to receipt of the motion information, so that the avatar object performs the motion indicated by the motion information; a step of determining, in response to receipt of the motion information, whether or not the motion indicated by the motion information is a motion for transitioning the state; and a step of transitioning the progress of the game part from the current state to another state when the motion indicated by the motion information is a motion for transitioning. The second step group is executed after the first step group is executed, and includes: a step of receiving recorded motion information from an external device, the motion information including motion data and voice data input by the distributor; and a step of performing playback of the distributed content realized by the first step group by operating the avatar object based on the motion information.
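  • Read as pseudocode, the first step group is an event loop: receive motion information, have the avatar object perform the motion, and transition the game part when the motion is a transition motion. The following Python sketch illustrates that loop; MotionInfo, TRANSITION_MOTIONS, and the state names are assumed for illustration and do not come from the disclosure, and a real viewing terminal would update the field-of-view image rather than a log.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MotionInfo:
    motion_data: str          # a named gesture decoded from the distributor's body
    voice_data: bytes = b""   # voice data recorded together with the motion

# Hypothetical gestures treated as "a movement for transitioning the state".
TRANSITION_MOTIONS = {"raise_both_hands"}
STATES = ["first_mini_game", "second_mini_game", "live_distribution"]

@dataclass
class GamePart:
    state_index: int = 0
    log: List[str] = field(default_factory=list)

    def handle(self, info: MotionInfo) -> None:
        # Update the field-of-view image so the avatar performs the motion.
        self.log.append(f"avatar performs {info.motion_data}")
        # If the motion is a transition motion, advance to another state.
        if info.motion_data in TRANSITION_MOTIONS and self.state_index + 1 < len(STATES):
            self.state_index += 1
            self.log.append(f"transitioned to {STATES[self.state_index]}")

game = GamePart()
game.handle(MotionInfo("wave"))
game.handle(MotionInfo("raise_both_hands"))
print(game.log)  # [..., 'transitioned to second_mini_game']
```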
  • Another aspect of the present invention is a method executed by a user terminal for the user to view real-time distribution performed by the distributor. The method includes: a step of arranging an avatar object corresponding to the distributor and a virtual camera in a virtual space shared between a distribution terminal for the distributor to perform the real-time distribution and a plurality of user terminals including the user terminal; a step of displaying a field-of-view image showing the viewing area of the virtual camera during the real-time distribution; a step of performing game processing in which a game part in one state is advanced based on the movement of the distributor's body and the operation of the user; a step of receiving motion information indicating the motion of the avatar object, generated by the distribution terminal by detecting the movement of the distributor's body; a step of updating the field-of-view image so that the avatar object performs the motion indicated by the motion information; and a step of performing playback of the distributed content by operating the avatar object.
  • Another aspect of the present invention is a computer-readable medium containing computer-executable instructions that, when executed, cause the user terminal to execute the second step group described above.
  • Another aspect of the present invention is a computer-readable medium containing computer-executable instructions that, when executed, cause the user terminal to perform the steps included in the above method.
  • Another aspect of the present invention is a computer system including a first information processing apparatus and a second information processing apparatus. The first information processing apparatus includes a memory for storing a first program including the above-mentioned first step group, and a processor for controlling the operation of the first information processing apparatus by executing the first program. The second information processing apparatus includes a memory for storing a second program including the above-mentioned second step group, and a processor for controlling the operation of the second information processing apparatus by executing the second program.
  • Another aspect of the present invention is an information processing device including a memory for storing a program including the above steps, and a processor for controlling the operation of the information processing device by executing the program.
  • This has the effect of enhancing the sense of immersion in the game world and making the game more engaging.
  • A diagram showing an outline of the configuration of the HMD system according to an embodiment.
  • A block diagram showing an example of the hardware configuration of the computer according to an embodiment.
  • A diagram conceptually representing the uvw field-of-view coordinate system set in the HMD according to an embodiment.
  • A diagram conceptually representing one aspect of expressing a virtual space according to an embodiment.
  • A top view of the head of a user wearing the HMD according to an embodiment.
  • A diagram showing a YZ cross section of the field-of-view region viewed from the X direction in the virtual space.
  • A diagram showing an XZ cross section of the field-of-view region viewed from the Y direction in the virtual space.
  • A block diagram showing a detailed module configuration of the computer according to an embodiment.
  • A diagram showing an outline of the configuration of the game system according to an embodiment.
  • A diagram showing the hardware configuration of the user terminal according to an embodiment.
  • A block diagram showing the functional configuration of the user terminal, the server, and the character control device included in the game system according to an embodiment.
  • A diagram showing the detailed configuration of the virtual space control unit included in the user terminal according to an embodiment.
  • A diagram showing the detailed configuration of the virtual space control unit included in the character control device according to an embodiment.
  • (A) is a diagram showing an example of the first virtual space realized in the character control device according to an embodiment. (B) is a diagram showing an example of the field-of-view image displayed on the character control device.
  • (A) is a diagram showing an example of the first virtual space realized in the user terminal according to an embodiment. (B) is a diagram showing an example of the field-of-view image displayed on the user terminal.
  • A flowchart showing the flow of processing in the first mini-game executed by the game system according to an embodiment.
  • (A) is a perspective view showing an example of the first virtual space realized in the character control device according to an embodiment while the first mini-game is in progress. (B) is a top view showing an example of the first virtual space realized in the user terminal in synchronization with that first virtual space.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment while the first mini-game is in progress. (B) is a diagram showing an example of the game screen displayed on the user terminal while the first mini-game is in progress.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment while the first mini-game is in progress. (B) is a diagram showing an example of the game screen displayed on the user terminal while the first mini-game is in progress.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment while the first mini-game is in progress. (B) is a diagram showing an example of the game screen displayed on the user terminal while the first mini-game is in progress.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment while the first mini-game is in progress. (B) is a diagram showing an example of the game screen displayed on the user terminal while the first mini-game is in progress.
  • A perspective view showing an example of the first virtual space realized in the character control device according to an embodiment while the second mini-game is in progress.
  • A perspective view showing an example of the first virtual space realized in the character control device according to an embodiment while the second mini-game is in progress.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment while the second mini-game is in progress. (B) is a diagram showing an example of the game screen displayed on the user terminal while the second mini-game is in progress.
  • A diagram showing an example of the game screen displayed on the user terminal while the second mini-game is in progress.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment while the second mini-game is in progress. (B) is a diagram showing an example of the game screen displayed on the user terminal while the second mini-game is in progress.
  • A diagram showing an example of the game screen displayed on the user terminal while the second mini-game is in progress.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment while the second mini-game is in progress. (B) is a diagram showing an example of the game screen displayed on the user terminal while the second mini-game is in progress.
  • A flowchart showing the flow of processing in the live distribution part executed by the game system according to an embodiment.
  • (A) is a diagram showing an example of the field-of-view image displayed on the character control device according to an embodiment when moving to the second virtual space. (B) is a diagram showing an example of the game screen displayed on the user terminal when moving to the second virtual space.
  • (A) is a diagram showing an example of the first virtual space after the avatar object has left, according to an embodiment. (B) is a diagram showing an example of the second virtual space after the avatar object has appeared.
  • FIG. 1 is a diagram showing an outline of the configuration of the HMD system 100 according to the present embodiment.
  • the HMD system 100 is provided as a home system or a business system.
  • the HMD system 100 includes a server 600, HMD sets 110A, 110B, 110C, 110D, an external device 700, and a network 2.
  • Each of the HMD sets 110A, 110B, 110C, and 110D is configured to be able to communicate with the server 600 and the external device 700 via the network 2.
  • the HMD sets 110A, 110B, 110C, and 110D are collectively referred to as the HMD set 110.
  • the number of HMD sets 110 constituting the HMD system 100 is not limited to four, and may be three or less or five or more.
  • the HMD set 110 includes an HMD 120, a computer 200, an HMD sensor 410, a display 430, and a controller 300.
  • the HMD 120 includes a monitor 130, a gaze sensor 140, a first camera 150, a second camera 160, a microphone 170, and a speaker 180.
  • the controller 300 may include a motion sensor 420.
  • the computer 200 can be connected to the Internet or other network 2, and can communicate with the server 600 or other computer connected to the network 2. Examples of other computers include computers of other HMD sets 110 and external devices 700.
  • the HMD 120 may include a sensor 190 instead of the HMD sensor 410.
  • the HMD 120 is attached to the head of the user 5 and can provide the virtual space to the user 5 during operation. More specifically, the HMD 120 displays an image for the right eye and an image for the left eye on the monitor 130, respectively. When each eye of the user 5 visually recognizes the respective image, the user 5 can recognize the image as a three-dimensional image based on the parallax of both eyes.
  • the HMD 120 may be either a so-called head-mounted display provided with a monitor, or a head-mounted device to which a terminal having a monitor, such as a smartphone, can be attached.
  • the monitor 130 is realized as, for example, a non-transparent display device.
  • the monitor 130 is arranged in the body of the HMD 120 so as to be located in front of both eyes of the user 5. Therefore, the user 5 can immerse himself or herself in the virtual space when viewing the three-dimensional image displayed on the monitor 130.
  • the virtual space includes, for example, a background, a user-operable object, and a user-selectable menu image.
  • the monitor 130 can be realized as a liquid crystal monitor or an organic EL (Electro Luminescence) monitor included in a so-called smartphone or other information display terminal.
  • the monitor 130 can be realized as a transmissive display device.
  • the HMD 120 may be an open type, such as a glasses type, instead of a closed type that covers the eyes of the user 5 as illustrated.
  • the transmissive monitor 130 may be temporarily configured as a non-transparent display device by adjusting its transmittance.
  • the monitor 130 may include a configuration that simultaneously displays a part of the image constituting the virtual space and the real space.
  • the monitor 130 may display an image of the real space taken by the camera mounted on the HMD 120, or may make the real space visible by setting a part of the transmittance to be high.
  • the monitor 130 may include a sub-monitor for displaying an image for the right eye and a sub-monitor for displaying an image for the left eye.
  • the monitor 130 may be configured to display the image for the right eye and the image for the left eye as a unit.
  • the monitor 130 includes a high-speed shutter. The high-speed shutter operates so that the image for the right eye and the image for the left eye are displayed alternately, with each image recognized by only the corresponding eye.
  • the HMD 120 includes a plurality of light sources (not shown). Each light source is realized by, for example, an LED (Light Emitting Diode) that emits infrared rays.
  • the HMD sensor 410 has a position tracking function for detecting the movement of the HMD 120. More specifically, the HMD sensor 410 reads a plurality of infrared rays emitted by the HMD 120 and detects the position and inclination of the HMD 120 in the real space.
  • the HMD sensor 410 may be realized by a camera.
  • the HMD sensor 410 can detect the position and tilt of the HMD 120 by executing the image analysis process using the image information of the HMD 120 output from the camera.
  • the HMD 120 may include a sensor 190 as a position detector in place of the HMD sensor 410 or in addition to the HMD sensor 410.
  • the HMD 120 can detect the position and tilt of the HMD 120 itself using the sensor 190.
  • the sensor 190 is an angular velocity sensor, a geomagnetic sensor, or an acceleration sensor.
  • the HMD 120 may use any of these sensors instead of the HMD sensor 410 to detect its position and tilt.
  • the angular velocity sensor detects the angular velocity around the three axes of the HMD 120 in real space over time.
  • the HMD 120 calculates the temporal change of the angle around the three axes of the HMD 120 based on each angular velocity, and further calculates the inclination of the HMD 120 based on the temporal change of the angle.
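  • As a concrete illustration of this calculation, the angle around each axis can be accumulated by integrating the sampled angular velocities over time. The Python sketch below uses plain Euler integration; the names are illustrative, and a real HMD would track orientation with quaternions and drift correction rather than independent per-axis angles.

```python
import numpy as np

def integrate_angular_velocity(omegas, dt):
    """Accumulate the angle (rad) around each of the three axes from
    angular velocity samples (rad/s) taken every dt seconds."""
    angle = np.zeros(3)
    for w in omegas:
        angle += np.asarray(w, dtype=float) * dt  # temporal change of each angle
    return angle

# 100 samples at 1 kHz of a pure yaw rotation at 0.5 rad/s:
samples = [(0.0, 0.5, 0.0)] * 100
print(integrate_angular_velocity(samples, dt=0.001))  # -> [0.   0.05 0.  ]
```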
  • the gaze sensor 140 detects the direction in which the line of sight of the user 5's right eye and left eye is directed. That is, the gaze sensor 140 detects the line of sight of the user 5. Detection of the direction of the line of sight is realized, for example, by a known eye tracking function.
  • the gaze sensor 140 is realized by a sensor having the eye tracking function.
  • the gaze sensor 140 preferably includes a sensor for the right eye and a sensor for the left eye.
  • the gaze sensor 140 may be, for example, a sensor that irradiates the right eye and the left eye of the user 5 with infrared light and detects the rotation angle of each eyeball by receiving the light reflected from the cornea and the iris.
  • the gaze sensor 140 can detect the line of sight of the user 5 based on each detected rotation angle.
  • the first camera 150 captures the lower part of the face of the user 5. More specifically, the first camera 150 captures the nose, mouth, and the like of the user 5.
  • the second camera 160 captures the eyes, eyebrows, and the like of the user 5.
  • the side of the housing of the HMD 120 facing the user 5 is defined as the inside of the HMD 120, and the side of the housing facing away from the user 5 is defined as the outside of the HMD 120.
  • the first camera 150 may be located outside the HMD 120 and the second camera 160 may be located inside the HMD 120.
  • the images generated by the first camera 150 and the second camera 160 are input to the computer 200.
  • the first camera 150 and the second camera 160 may be realized as one camera, and the face of the user 5 may be photographed by this one camera.
  • the microphone 170 converts the utterance of the user 5 into an audio signal (electrical signal) and outputs it to the computer 200.
  • the speaker 180 converts the voice signal into voice and outputs it to the user 5.
  • the HMD 120 may include earphones instead of the speaker 180.
  • the controller 300 is connected to the computer 200 by wire or wirelessly.
  • the controller 300 receives an instruction input from the user 5 to the computer 200.
  • the controller 300 is configured to be grippable by the user 5.
  • the controller 300 is configured to be wearable on a part of the user 5's body or clothing.
  • the controller 300 may be configured to output at least one of vibration, sound, and light based on the signal transmitted from the computer 200.
  • the controller 300 receives from the user 5 an operation for controlling the position and movement of the object arranged in the virtual space.
  • the controller 300 includes a plurality of light sources. Each light source is realized, for example, by an LED that emits infrared rays.
  • the HMD sensor 410 has a position tracking function. In this case, the HMD sensor 410 reads a plurality of infrared rays emitted by the controller 300 and detects the position and inclination of the controller 300 in the real space.
  • the HMD sensor 410 may be implemented by a camera. In this case, the HMD sensor 410 can detect the position and tilt of the controller 300 by executing the image analysis process using the image information of the controller 300 output from the camera.
  • in a certain aspect, the motion sensor 420 is attached to the hand of the user 5 and detects the movement of the user 5's hand. For example, the motion sensor 420 detects the rotation speed, the number of rotations, and the like of the hand. The detected signal is sent to the computer 200.
  • the motion sensor 420 is provided in the controller 300, for example.
  • the motion sensor 420 is provided in, for example, a controller 300 configured to be grippable by the user 5.
  • the controller 300 is preferably of a type that is worn on the hand of the user 5 and does not easily fly away, such as a glove type.
  • a sensor not attached to the user 5 may detect the movement of the user 5's hand.
  • the signal of the camera that captures the user 5 may be input to the computer 200 as a signal indicating the operation of the user 5.
  • the motion sensor 420 and the computer 200 are wirelessly connected to each other.
  • the communication mode is not particularly limited, and for example, Bluetooth (registered trademark) or other known communication method is used.
  • the display 430 displays an image similar to the image displayed on the monitor 130. As a result, users other than the user 5 wearing the HMD 120 can view the same image as the user 5.
  • the image displayed on the display 430 does not have to be a three-dimensional image, and may be an image for the right eye or an image for the left eye. Examples of the display 430 include a liquid crystal display and an organic EL monitor.
  • the server 600 may send a program to the computer 200.
  • the server 600 may communicate with another computer 200 to provide virtual reality to the HMD 120 used by another user.
  • each computer 200 communicates signals based on the operation of each user with other computers 200 via the server 600, allowing a plurality of users to enjoy a common game in the same virtual space.
  • Each computer 200 may communicate a signal based on the operation of each user with another computer 200 without going through the server 600.
  • the external device 700 may be any device as long as it can communicate with the computer 200.
  • the external device 700 may be, for example, a device capable of communicating with the computer 200 via the network 2, or a device capable of directly communicating with the computer 200 by short-range wireless communication or a wired connection.
  • Examples of the external device 700 include, but are not limited to, smart devices, PCs (Personal Computers), and peripheral devices of the computer 200.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the computer 200 according to the present embodiment.
  • the computer 200 includes a processor 210, a memory 220, a storage 230, an input / output interface 240, and a communication interface 250 as main components. Each component is connected to a bus 260.
  • the processor 210 executes a series of instructions included in the program stored in the memory 220 or the storage 230 based on the signal given to the computer 200 or when a predetermined condition is satisfied.
  • the processor 210 is realized as a CPU (Central Processing Unit), GPU (Graphics Processing Unit), MPU (Micro Processor Unit), FPGA (Field-Programmable Gate Array) or other device.
  • Memory 220 temporarily stores programs and data.
  • the program is loaded from storage 230, for example.
  • the data includes data input to the computer 200 and data generated by the processor 210.
  • the memory 220 is realized as a RAM (Random Access Memory) or other volatile memory.
  • Storage 230 permanently retains programs and data.
  • the storage 230 is realized as, for example, a ROM (Read-Only Memory), a hard disk device, a flash memory, or other non-volatile storage device.
  • the program stored in the storage 230 includes a program for providing a virtual space in the HMD system 100, a simulation program, a game program, a user authentication program, and a program for realizing communication with another computer 200.
  • the data stored in the storage 230 includes data, objects, and the like for defining the virtual space.
  • the storage 230 may be realized as a detachable storage device such as a memory card.
  • a configuration using programs and data stored in an external storage device may be used instead of the storage 230 built into the computer 200. With such a configuration, it becomes possible to collectively update programs and data in a situation where a plurality of HMD systems 100 are used, for example, in an amusement facility.
  • the input / output interface 240 communicates signals with the HMD 120, the HMD sensor 410, the motion sensor 420, and the display 430.
  • the monitor 130, the gaze sensor 140, the first camera 150, the second camera 160, the microphone 170, and the speaker 180 included in the HMD 120 may communicate with the computer 200 via the input / output interface 240 of the HMD 120.
  • the input / output interface 240 is realized by using USB (Universal Serial Bus), DVI (Digital Visual Interface), HDMI (registered trademark) (High-Definition Multimedia Interface) and other terminals.
  • the input / output interface 240 is not limited to the above.
  • the input / output interface 240 may further communicate with the controller 300.
  • the input / output interface 240 receives inputs of signals output from the controller 300 and the motion sensor 420.
  • the input / output interface 240 sends an instruction output from the processor 210 to the controller 300.
  • the command instructs the controller 300 to vibrate, output voice, emit light, and the like.
  • the controller 300 executes either vibration, voice output, or light emission in response to the command.
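  • A minimal sketch of dispatching such commands on the controller side; the Command enumeration and action names are assumptions, since the disclosure does not specify a command format.

```python
from enum import Enum, auto

class Command(Enum):
    VIBRATE = auto()
    OUTPUT_VOICE = auto()
    EMIT_LIGHT = auto()

def execute_command(cmd: Command) -> str:
    # The controller performs vibration, voice output, or light emission
    # according to the command received from the processor 210.
    actions = {
        Command.VIBRATE: "vibration",
        Command.OUTPUT_VOICE: "voice output",
        Command.EMIT_LIGHT: "light emission",
    }
    return actions[cmd]

print(execute_command(Command.EMIT_LIGHT))  # light emission
```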
  • the communication interface 250 is connected to the network 2 and communicates with another computer (for example, the server 600) connected to the network 2.
  • the communication interface 250 is realized as, for example, a LAN (Local Area Network) or other wired communication interface, or a WiFi (Wireless Fidelity), Bluetooth (registered trademark), NFC (Near Field Communication), or other wireless communication interface.
  • the communication interface 250 is not limited to the above.
  • the processor 210 accesses the storage 230, loads one or more programs stored in the storage 230 into the memory 220, and executes a series of instructions included in the program.
  • the one or more programs may include an operating system of a computer 200, an application program for providing a virtual space, game software that can be executed in the virtual space, and the like.
  • the processor 210 sends a signal to the HMD 120 to provide virtual space via the input / output interface 240.
  • the HMD 120 displays an image on the monitor 130 based on the signal.
  • the computer 200 is configured to be provided outside the HMD 120, but in another aspect, the computer 200 may be built in the HMD 120.
  • as an example, a portable information communication terminal (for example, a smartphone) that includes a monitor 130 may function as the computer 200.
  • the computer 200 may have a configuration commonly used for a plurality of HMD 120s. According to such a configuration, for example, the same virtual space can be provided to a plurality of users, so that each user can enjoy the same application as other users in the same virtual space.
  • a real coordinate system which is a coordinate system in the real space
  • the real coordinate system has three reference directions (axes): one parallel to the vertical direction in the real space, one parallel to the horizontal direction orthogonal to the vertical direction, and one parallel to the front-back direction orthogonal to both the vertical direction and the horizontal direction.
  • the horizontal direction, the vertical direction (vertical direction), and the front-back direction in the real coordinate system are defined as the x-axis, the y-axis, and the z-axis, respectively. More specifically, in the real coordinate system, the x-axis is parallel to the horizontal direction of the real space.
  • the y-axis is parallel to the vertical direction in real space.
  • the z-axis is parallel to the front-back direction of the real space.
  • the HMD sensor 410 includes an infrared sensor.
  • when the infrared sensor detects the infrared rays emitted from each light source of the HMD 120, it detects the presence of the HMD 120.
  • the HMD sensor 410 further detects the position and inclination (orientation) of the HMD 120 in the real space according to the movement of the user 5 wearing the HMD 120, based on the value of each point (each coordinate value in the real coordinate system). More specifically, the HMD sensor 410 can detect a temporal change in the position and inclination of the HMD 120 by using the values detected over time.
  • Each inclination of the HMD 120 detected by the HMD sensor 410 corresponds to each inclination of the HMD 120 around three axes in the real coordinate system.
  • the HMD sensor 410 sets the uvw field coordinate system to the HMD 120 based on the tilt of the HMD 120 in the real coordinate system.
  • the uvw field-of-view coordinate system set in the HMD 120 corresponds to the viewpoint coordinate system when the user 5 wearing the HMD 120 sees an object in the virtual space.
  • FIG. 3 is a diagram conceptually representing the uvw field coordinate system set in the HMD 120 according to an embodiment.
  • the HMD sensor 410 detects the position and tilt of the HMD 120 in the real coordinate system when the HMD 120 is activated.
  • Processor 210 sets the uvw field coordinate system to HMD 120 based on the detected values.
  • the HMD 120 sets a three-dimensional uvw field-of-view coordinate system with its origin at the head of the user 5 wearing the HMD 120. More specifically, the HMD 120 sets, as the pitch axis (u axis), the yaw axis (v axis), and the roll axis (w axis) of the uvw field-of-view coordinate system in the HMD 120, the three directions newly obtained by tilting the horizontal, vertical, and front-back directions (x axis, y axis, z axis) defining the real coordinate system around the respective axes by the inclination of the HMD 120 around each axis in the real coordinate system.
  • the processor 210 sets, in the HMD 120, a uvw field-of-view coordinate system parallel to the real coordinate system. In this case, the horizontal direction (x axis), the vertical direction (y axis), and the front-back direction (z axis) in the real coordinate system coincide with the pitch axis (u axis), the yaw axis (v axis), and the roll axis (w axis) of the uvw field-of-view coordinate system in the HMD 120, respectively.
  • the HMD sensor 410 can detect the inclination of the HMD 120 in the set uvw field-of-view coordinate system based on the movement of the HMD 120. In this case, the HMD sensor 410 detects the pitch angle (θu), yaw angle (θv), and roll angle (θw) of the HMD 120 in the uvw field-of-view coordinate system as the inclination of the HMD 120.
  • the pitch angle (θu) represents the tilt angle of the HMD 120 around the pitch axis in the uvw field-of-view coordinate system.
  • the yaw angle (θv) represents the tilt angle of the HMD 120 around the yaw axis in the uvw field-of-view coordinate system.
  • the roll angle (θw) represents the tilt angle of the HMD 120 around the roll axis in the uvw field-of-view coordinate system.
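  • To make the relationship concrete: the u, v, and w axes are the real-coordinate x, y, and z axes rotated by the detected pitch (θu), yaw (θv), and roll (θw). A Python sketch using Rodrigues' rotation formula; the composition order is one plausible choice, not fixed by the disclosure.

```python
import numpy as np

def rotation(axis, theta):
    """Rotation matrix for angle theta (rad) about a unit axis."""
    a = np.asarray(axis, dtype=float)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def uvw_axes(theta_u, theta_v, theta_w):
    """Rotate the x, y, z axes by pitch, yaw, and roll to get u, v, w."""
    R = rotation([0, 0, 1], theta_w) @ rotation([0, 1, 0], theta_v) @ rotation([1, 0, 0], theta_u)
    return R[:, 0], R[:, 1], R[:, 2]  # columns are the rotated basis vectors

u, v, w = uvw_axes(0.0, np.pi / 2, 0.0)  # a 90-degree yaw
print(np.round(u, 3), np.round(w, 3))    # u -> [0, 0, -1], w -> [1, 0, 0]
```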
  • the HMD sensor 410 sets, in the HMD 120, the uvw field-of-view coordinate system of the HMD 120 after the HMD 120 has moved, based on the detected inclination of the HMD 120.
  • the relationship between the HMD 120 and the uvw field coordinate system of the HMD 120 is always constant regardless of the position and tilt of the HMD 120.
  • when the position and inclination of the HMD 120 change, the position and inclination of the uvw field-of-view coordinate system of the HMD 120 in the real coordinate system change in conjunction with them.
  • the HMD sensor 410 may specify the position of the HMD 120 in the real space as a relative position with respect to the HMD sensor 410, based on the infrared light intensity obtained from the output of the infrared sensor and the relative positional relationship between a plurality of points (e.g., the distance between the points).
  • the processor 210 may determine the origin of the uvw visual field coordinate system of the HMD 120 in real space (real coordinate system) based on the identified relative position.
  • FIG. 4 is a diagram conceptually representing one aspect of expressing the virtual space 11 according to a certain embodiment.
  • the virtual space 11 has an all-celestial-sphere structure covering the entire 360-degree direction around the center 12.
  • in FIG. 4, only the celestial sphere in the upper half of the virtual space 11 is illustrated so as not to complicate the explanation.
  • Each mesh is defined in the virtual space 11.
  • the position of each mesh is predetermined as a coordinate value in the XYZ coordinate system, which is a global coordinate system defined in the virtual space 11.
  • the computer 200 associates each partial image constituting the panoramic image 13 (still image, moving image, etc.) expandable in the virtual space 11 with each corresponding mesh in the virtual space 11.
  • the virtual space 11 defines an XYZ coordinate system with the center 12 as the origin.
  • the XYZ coordinate system is, for example, parallel to the real coordinate system.
  • the horizontal direction, the vertical direction (vertical direction), and the front-back direction in the XYZ coordinate system are defined as the X-axis, the Y-axis, and the Z-axis, respectively. Therefore, the X-axis (horizontal direction) of the XYZ coordinate system is parallel to the x-axis of the real coordinate system, and the Y-axis (vertical direction) of the XYZ coordinate system is parallel to the y-axis of the real coordinate system.
  • the Z-axis (front-back direction) is parallel to the z-axis of the real coordinate system.
  • the virtual camera 14 is arranged at the center 12 of the virtual space 11 at the time of starting the HMD 120, that is, in the initial state of the HMD 120.
  • the processor 210 displays an image captured by the virtual camera 14 on the monitor 130 of the HMD 120.
  • the virtual camera 14 moves in the virtual space 11 in the same manner as the movement of the HMD 120 in the real space. Thereby, changes in the position and inclination of the HMD 120 in the real space can be similarly reproduced in the virtual space 11.
  • the virtual camera 14 is defined with the uvw field-of-view coordinate system as in the case of the HMD 120.
  • the uvw field-of-view coordinate system of the virtual camera 14 in the virtual space 11 is defined to be linked to the uvw field-of-view coordinate system of the HMD 120 in the real space (real coordinate system). Therefore, when the tilt of the HMD 120 changes, the tilt of the virtual camera 14 also changes accordingly.
  • the virtual camera 14 can also move in the virtual space 11 in conjunction with the movement of the user 5 wearing the HMD 120 in the real space.
  • the processor 210 of the computer 200 defines the field of view region 15 in the virtual space 11 based on the position and inclination (reference line of sight 16) of the virtual camera 14.
  • the visual field area 15 corresponds to an area of the virtual space 11 that is visually recognized by the user 5 wearing the HMD 120. That is, the position of the virtual camera 14 can be said to be the viewpoint of the user 5 in the virtual space 11.
  • the line of sight of the user 5 detected by the gaze sensor 140 is the direction in the viewpoint coordinate system when the user 5 visually recognizes the object.
  • the uvw field-of-view coordinate system of the HMD 120 is equal to the viewpoint coordinate system when the user 5 visually recognizes the monitor 130.
  • the uvw field-of-view coordinate system of the virtual camera 14 is linked to the uvw field-of-view coordinate system of the HMD 120. Therefore, the HMD system 100 according to a certain aspect can consider the line of sight of the user 5 detected by the gaze sensor 140 as the line of sight of the user 5 in the uvw field-of-view coordinate system of the virtual camera 14.
  • FIG. 5 is a top view of the head of the user 5 who wears the HMD 120 according to a certain embodiment.
  • the gaze sensor 140 detects each line of sight of the user 5's right eye and left eye. In a certain aspect, when the user 5 is looking near, the gaze sensor 140 detects the lines of sight R1 and L1. In another aspect, when the user 5 is looking far away, the gaze sensor 140 detects the lines of sight R2 and L2. In this case, the angle formed by the lines of sight R2 and L2 with respect to the roll axis w is smaller than the angle formed by the lines of sight R1 and L1 with respect to the roll axis w. The gaze sensor 140 transmits the detection result to the computer 200.
  • when the computer 200 receives the detection values of the lines of sight R1 and L1 from the gaze sensor 140 as the detection result, it identifies the gazing point N1, which is the intersection of the lines of sight R1 and L1, based on those values. When the computer 200 receives the detected values of the lines of sight R2 and L2 from the gaze sensor 140, it likewise identifies the intersection of the lines of sight R2 and L2 as the gazing point. The computer 200 then identifies the line of sight N0 of the user 5 based on the position of the identified gazing point N1.
  • the computer 200 detects, as the line of sight N0, for example, the direction in which the straight line passing through the gazing point N1 and the midpoint of the line segment connecting the right eye R and the left eye L of the user 5 extends.
  • the line of sight N0 is the direction in which the user 5 actually directs the line of sight with both eyes.
  • the line of sight N0 corresponds to the direction in which the user 5 actually directs the line of sight with respect to the field of view area 15.
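  • In practice the two detected gaze rays rarely intersect exactly, so the gazing point N1 can be taken as the midpoint of the shortest segment between them, and N0 as the direction from the midpoint between the eyes toward N1. A Python sketch under those assumptions (positions in meters; all names illustrative):

```python
import numpy as np

def gazing_point(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two rays p + t*d,
    standing in for the gazing point N1."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    denom = np.dot(n, n)
    if denom < 1e-12:            # near-parallel rays: looking far away
        return None
    t1 = np.dot(np.cross(p2 - p1, d2), n) / denom
    t2 = np.dot(np.cross(p2 - p1, d1), n) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2

right_eye = np.array([0.03, 0.0, 0.0])
left_eye = np.array([-0.03, 0.0, 0.0])
n1 = gazing_point(right_eye, np.array([-0.03, 0.0, 1.0]),
                  left_eye, np.array([0.03, 0.0, 1.0]))
mid = (right_eye + left_eye) / 2
n0 = (n1 - mid) / np.linalg.norm(n1 - mid)  # line of sight N0 through the midpoint
print(np.round(n1, 3), np.round(n0, 3))     # N1 ~ [0, 0, 1]; N0 ~ [0, 0, 1]
```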
  • the HMD system 100 may include a television broadcast receiving tuner. According to such a configuration, the HMD system 100 can display a television program in the virtual space 11.
  • the HMD system 100 may be provided with a communication circuit for connecting to the Internet or a call function for connecting to a telephone line.
  • FIG. 6 is a diagram showing a YZ cross section of the field of view region 15 viewed from the X direction in the virtual space 11.
  • FIG. 7 is a diagram showing an XZ cross section of the field of view region 15 viewed from the Y direction in the virtual space 11.
  • the field of view region 15 in the YZ cross section includes the region 18.
  • the region 18 is defined by the position of the virtual camera 14, the reference line of sight 16, and the YZ cross section of the virtual space 11.
  • the processor 210 defines, as the region 18, a range including the polar angle α centered on the reference line of sight 16 in the virtual space.
  • the field of view region 15 in the XZ cross section includes the region 19.
  • the region 19 is defined by the position of the virtual camera 14, the reference line of sight 16, and the XZ cross section of the virtual space 11.
  • the processor 210 defines, as the region 19, a range including the azimuth angle β centered on the reference line of sight 16 in the virtual space 11.
  • the polar angle α and the azimuth angle β are determined according to the position of the virtual camera 14 and the inclination (orientation) of the virtual camera 14.
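  • One way to read this definition: a direction lies in the field-of-view region 15 when its vertical deviation from the reference line of sight 16 is within the polar angle α and its horizontal deviation is within the azimuth angle β. The Python sketch below treats α and β as full angular extents and y as the vertical axis; both are assumptions, not details fixed by the disclosure.

```python
import numpy as np

def in_view_region(direction, reference, alpha, beta):
    """Angular-box test against the reference line of sight (angles in rad).
    Ignores wrap-around for directions far behind the reference."""
    d = direction / np.linalg.norm(direction)
    r = reference / np.linalg.norm(reference)
    vertical = np.arcsin(np.clip(d[1], -1, 1)) - np.arcsin(np.clip(r[1], -1, 1))
    horizontal = np.arctan2(d[0], d[2]) - np.arctan2(r[0], r[2])
    return abs(vertical) <= alpha / 2 and abs(horizontal) <= beta / 2

ref = np.array([0.0, 0.0, 1.0])
print(in_view_region(np.array([0.1, 0.0, 1.0]), ref, np.radians(60), np.radians(90)))  # True
print(in_view_region(np.array([2.0, 0.0, 1.0]), ref, np.radians(60), np.radians(90)))  # False
```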
  • the HMD system 100 provides the user 5 with a field of view in the virtual space 11 by displaying the field of view image 17 on the monitor 130 based on the signal from the computer 200.
  • the visual field image 17 is an image corresponding to a portion of the panoramic image 13 corresponding to the visual field region 15.
  • when the user 5 wearing the HMD 120 moves, the virtual camera 14 also moves in conjunction with that movement. As a result, the position of the visual field region 15 in the virtual space 11 changes.
  • the field-of-view image 17 displayed on the monitor 130 is thereby updated to the portion of the panoramic image 13 superimposed on the field-of-view region 15 in the direction the user 5 faces in the virtual space 11.
  • the user 5 can visually recognize the desired direction in the virtual space 11.
  • the inclination of the virtual camera 14 corresponds to the line of sight of the user 5 (reference line of sight 16) in the virtual space 11, and the position where the virtual camera 14 is arranged corresponds to the viewpoint of the user 5 in the virtual space 11. Therefore, by changing the position or tilt of the virtual camera 14, the image displayed on the monitor 130 is updated and the field of view of the user 5 is moved.
  • the HMD system 100 can give the user 5 a high sense of immersion in the virtual space 11.
  • the processor 210 may move the virtual camera 14 in the virtual space 11 in conjunction with the movement of the user 5 wearing the HMD 120 in the real space. In this case, the processor 210 identifies an image region (field of view region 15) projected onto the monitor 130 of the HMD 120 based on the position and tilt of the virtual camera 14 in the virtual space 11.
  • the virtual camera 14 may include two virtual cameras, a virtual camera for providing an image for the right eye and a virtual camera for providing an image for the left eye. Appropriate parallax is set in the two virtual cameras so that the user 5 can recognize the three-dimensional virtual space 11.
  • the virtual camera 14 may be realized by one virtual camera. In this case, an image for the right eye and an image for the left eye may be generated from the image obtained by one virtual camera.
  • assuming that the virtual camera 14 includes two virtual cameras and that the roll axis (w) generated by combining the roll axes of the two virtual cameras is adapted to the roll axis (w) of the HMD 120, the technical idea according to the present disclosure will be exemplified.
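  • A sketch of the parallax setup: offset the two virtual cameras along the pitch (u) axis by half the interpupillary distance each. The 0.064 m default is a typical adult IPD and an assumption; the disclosure only says that appropriate parallax is set.

```python
import numpy as np

def stereo_eye_positions(camera_pos, u_axis, ipd=0.064):
    """Positions of the right-eye and left-eye virtual cameras, offset
    along the pitch (u) axis by half the interpupillary distance each."""
    half = (ipd / 2) * (u_axis / np.linalg.norm(u_axis))
    return camera_pos + half, camera_pos - half

right, left = stereo_eye_positions(np.array([0.0, 1.6, 0.0]), np.array([1.0, 0.0, 0.0]))
print(right, left)  # [0.032 1.6 0.], [-0.032 1.6 0.]
```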
  • FIG. 8A is a diagram showing a schematic configuration of a controller 300 according to an embodiment.
  • the controller 300 may include a right controller 300R and a left controller (not shown).
  • the right controller 300R is operated by the right hand of the user 5.
  • the left controller is operated by the left hand of the user 5.
  • the right controller 300R and the left controller are symmetrically configured as separate devices. Therefore, the user 5 can freely move the right hand holding the right controller 300R and the left hand holding the left controller.
  • the controller 300 may be an integrated controller that accepts operations of both hands.
  • the right controller 300R will be described.
  • the right controller 300R includes a grip 310, a frame 320, and a top surface 330.
  • the grip 310 is configured to be gripped by the right hand of the user 5.
  • the grip 310 may be held by the palm of the user 5's right hand and three fingers (middle finger, ring finger, little finger).
  • the grip 310 includes buttons 340 and 350 and a motion sensor 420.
  • the button 340 is arranged on the side surface of the grip 310 and accepts an operation by the middle finger of the right hand.
  • the button 350 is arranged on the front surface of the grip 310 and accepts an operation by the index finger of the right hand.
  • the buttons 340 and 350 are configured as trigger type buttons.
  • the motion sensor 420 is built in the housing of the grip 310. If the movement of the user 5 can be detected from around the user 5 by a camera or other device, the grip 310 may not include the motion sensor 420.
  • the frame 320 includes a plurality of infrared LEDs 360 arranged along its circumferential direction.
  • the infrared LED 360 emits infrared rays as the program progresses while the program using the controller 300 is being executed.
  • the infrared rays emitted from the infrared LED 360 can be used to detect each position and posture (tilt, orientation) of the right controller 300R and the left controller.
  • infrared LEDs 360 arranged in two rows are shown in FIG. 8A, but the number of rows is not limited to the one shown; an arrangement with one row, or with three or more rows, may also be used.
  • the top surface 330 includes buttons 370 and 380 and an analog stick 390.
  • the buttons 370 and 380 are configured as push buttons. Buttons 370 and 380 accept operations by the thumb of the user 5's right hand.
  • in a certain aspect, the analog stick 390 accepts an operation in any direction over 360 degrees from the initial position (neutral position). The operation includes, for example, an operation for moving an object arranged in the virtual space 11.
  • the right controller 300R and the left controller include a battery for driving the infrared LED 360 and other components. The battery may be, but is not limited to, a rechargeable, button, or dry-cell type.
  • the right controller 300R and the left controller may be connected, for example, to the USB interface of the computer 200. In this case, the right controller 300R and the left controller do not require batteries.
  • the configuration of the controller 300 described above is an example, and the present invention is not limited to this.
  • FIG. 9 is a block diagram showing an example of the hardware configuration of the server 600 according to a certain embodiment.
  • the server 600 includes a processor 610, a memory 620, a storage 630, an input / output interface 640, and a communication interface 650 as main components. Each component is connected to a bus 660.
  • the processor 610 executes a series of instructions included in the program stored in the memory 620 or the storage 630 based on the signal given to the server 600 or when a predetermined condition is satisfied.
  • the processor 610 is implemented as a CPU, GPU, MPU, FPGA, or other device.
  • Memory 620 temporarily stores programs and data.
  • the program is loaded from storage 630, for example.
  • the data includes data input to the server 600 and data generated by the processor 610.
  • the memory 620 is realized as a RAM or other volatile memory.
  • Storage 630 permanently retains programs and data.
  • the storage 630 is realized as, for example, a ROM, a hard disk device, a flash memory, or other non-volatile storage device.
  • the program stored in the storage 630 may include a program for providing a virtual space in the HMD system 100, a simulation program, a game program, a user authentication program, and a program for realizing communication with the computer 200.
  • the data stored in the storage 630 may include data and objects for defining the virtual space.
  • the storage 630 may be realized as a removable storage device such as a memory card.
  • a configuration using programs and data stored in an external storage device may be used instead of the storage 630 built into the server 600. With such a configuration, it becomes possible to collectively update programs and data in a situation where a plurality of HMD systems 100 are used, for example, in an amusement facility.
  • the input / output interface 640 communicates a signal with the input / output device.
  • the input / output interface 640 is implemented using USB, DVI, HDMI® and other terminals.
  • the input / output interface 640 is not limited to the above.
  • the communication interface 650 is connected to the network 2 and communicates with the computer 200 connected to the network 2.
  • the communication interface 650 is realized, for example, as a LAN or other wired communication interface, or a WiFi, Bluetooth, NFC or other wireless communication interface.
  • the communication interface 650 is not limited to the above.
  • the processor 610 accesses the storage 630, loads one or more programs stored in the storage 630 into the memory 620, and executes a series of instructions included in the program.
  • the one or more programs may include an operating system of the server 600, an application program for providing the virtual space, game software that can be executed in the virtual space, and the like.
  • the processor 610 may send a signal to the computer 200 to provide virtual space via the input / output interface 640.
  • FIG. 10 is a block diagram showing a computer 200 according to an embodiment as a module configuration.
  • the computer 200 includes a control module 510, a rendering module 520, a memory module 530, and a communication control module 540.
  • the control module 510 and the rendering module 520 are implemented by the processor 210.
  • a plurality of processors 210 may operate as the control module 510 and the rendering module 520.
  • the memory module 530 is realized by the memory 220 or the storage 230.
  • the communication control module 540 is realized by the communication interface 250.
  • the control module 510 controls the virtual space 11 provided to the user 5.
  • the control module 510 defines the virtual space 11 in the HMD system 100 by using the virtual space data representing the virtual space 11.
  • the virtual space data is stored in, for example, the memory module 530.
  • the control module 510 may generate virtual space data or acquire virtual space data from a server 600 or the like.
  • the control module 510 arranges an object in the virtual space 11 by using the object data representing the object.
  • the object data is stored in, for example, the memory module 530.
  • the control module 510 may generate object data or acquire object data from a server 600 or the like.
  • the objects may include, for example, an avatar object that is the alter ego of the user 5, character objects, operation objects such as a virtual hand operated by the controller 300, landscapes including forests and mountains arranged as the story of the game progresses, cityscapes, animals, and the like.
  • the control module 510 arranges, in the virtual space 11, the avatar object of the user 5 of another computer 200 connected via the network 2. In a certain aspect, the control module 510 arranges the avatar object of the user 5 in the virtual space 11. In a certain aspect, the control module 510 arranges, in the virtual space 11, an avatar object imitating the user 5 based on an image including the user 5. In another aspect, the control module 510 arranges, in the virtual space 11, an avatar object selected by the user 5 from among a plurality of types of avatar objects (for example, an object imitating an animal or a deformed human object).
  • the control module 510 specifies the inclination of the HMD 120 based on the output of the HMD sensor 410. In another aspect, the control module 510 identifies the tilt of the HMD 120 based on the output of the sensor 190, which acts as a motion sensor.
  • the control module 510 detects organs (for example, mouth, eyes, eyebrows) constituting the face of the user 5 from the images of the face of the user 5 generated by the first camera 150 and the second camera 160.
  • the control module 510 detects the movement (shape) of each detected organ.
  • the control module 510 detects the line of sight of the user 5 in the virtual space 11 based on the signal from the gaze sensor 140.
  • the control module 510 detects the viewpoint position (coordinate value in the XYZ coordinate system) where the detected line of sight of the user 5 and the celestial sphere of the virtual space 11 intersect. More specifically, the control module 510 detects the viewpoint position based on the line of sight of the user 5 defined by the uvw coordinate system and the position and inclination of the virtual camera 14.
  • the control module 510 transmits the detected viewpoint position to the server 600.
  • the control module 510 may be configured to transmit line-of-sight information representing the line of sight of the user 5 to the server 600. In such a case, the viewpoint position can be calculated based on the line-of-sight information received by the server 600.
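A minimal sketch of the viewpoint-position calculation described above: the gaze ray of the user 5 is intersected with the celestial sphere of the virtual space 11. All names (function, parameters) are hypothetical illustrations, not identifiers from this disclosure, and the geometry assumes the virtual camera 14 sits inside the sphere.

```python
import numpy as np

def viewpoint_on_sphere(camera_pos, gaze_dir, sphere_center, sphere_radius):
    """Hypothetical helper: point where the gaze ray hits the celestial sphere."""
    d = gaze_dir / np.linalg.norm(gaze_dir)        # normalized gaze direction
    oc = camera_pos - sphere_center
    # Solve |oc + t*d|^2 = r^2, i.e. t^2 + 2(oc.d)t + (|oc|^2 - r^2) = 0.
    b = np.dot(oc, d)
    c = np.dot(oc, oc) - sphere_radius ** 2
    disc = b * b - c
    if disc < 0:
        return None                                # gaze ray misses the sphere
    t = -b + np.sqrt(disc)                         # forward root: camera is inside
    return camera_pos + t * d                      # XYZ viewpoint position
```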
  • the control module 510 reflects the movement of the HMD 120 detected by the HMD sensor 410 in the avatar object. For example, the control module 510 detects that the HMD 120 is tilted and tilts and arranges the avatar object. The control module 510 reflects the detected movement of the facial organ on the face of the avatar object arranged in the virtual space 11. The control module 510 receives the line-of-sight information of another user 5 from the server 600 and reflects it in the line-of-sight of the avatar object of the other user 5. In a certain aspect, the control module 510 reflects the movement of the controller 300 on the avatar object and the operation object. In this case, the controller 300 includes a motion sensor for detecting the movement of the controller 300, an acceleration sensor, a plurality of light emitting elements (for example, infrared LEDs), and the like.
  • the control module 510 arranges, in the virtual space 11, an operation object for receiving the operation of the user 5.
  • the user 5 operates, for example, an object arranged in the virtual space 11.
  • the operation object may include, for example, a hand object which is a virtual hand corresponding to the hand of the user 5.
  • the control module 510 moves the hand object in the virtual space 11 so as to be interlocked with the movement of the user 5's hand in the real space based on the output of the motion sensor 420.
  • the operation object can correspond to the hand portion of the avatar object.
  • the control module 510 detects a collision when each of the objects arranged in the virtual space 11 collides with another object.
  • the control module 510 can detect, for example, the timing at which the collision area of one object touches the collision area of another object, and performs a predetermined process when that timing is detected.
  • the control module 510 can detect the timing at which objects that have been touching separate from each other, and performs a predetermined process when that timing is detected.
  • the control module 510 can detect that an object is in contact with another object. For example, when the operation object touches another object, the control module 510 detects the contact and performs a predetermined process.
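By way of illustration only, the touch/separation timing detection above could be organized as in the sketch below; the object attributes, the spherical collision areas, and the callback names are assumptions, not part of this disclosure.

```python
def spheres_touch(a, b):
    """Assume each object's collision area is a sphere; touching = overlap."""
    dist2 = sum((p - q) ** 2 for p, q in zip(a.position, b.position))
    r = a.collision_radius + b.collision_radius
    return dist2 <= r * r

def update_collisions(objects, touching, on_touch, on_separate):
    """Fire the predetermined processing when collision areas start or stop touching."""
    for i, a in enumerate(objects):
        for b in objects[i + 1:]:
            key = (id(a), id(b))
            now, before = spheres_touch(a, b), key in touching
            if now and not before:
                touching.add(key)
                on_touch(a, b)        # timing at which the areas touch
            elif before and not now:
                touching.discard(key)
                on_separate(a, b)     # timing at which they separate
```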
  • the control module 510 controls the image display on the monitor 130 of the HMD 120.
  • the control module 510 arranges the virtual camera 14 in the virtual space 11.
  • the control module 510 controls the position of the virtual camera 14 in the virtual space 11 and the inclination (orientation) of the virtual camera 14.
  • the control module 510 defines the view area 15 according to the inclination of the head of the user 5 wearing the HMD 120 and the position of the virtual camera 14.
  • the rendering module 520 generates a view image 17 to be displayed on the monitor 130 based on the determined view area 15.
  • the view image 17 generated by the rendering module 520 is output to the HMD 120 by the communication control module 540.
  • when the control module 510 detects an utterance of the user 5 using the microphone 170 of the HMD 120, the control module 510 identifies the computer 200 to which the voice data corresponding to the utterance is to be transmitted, and the voice data is transmitted to that computer 200. When the control module 510 receives voice data from the computer 200 of another user via the network 2, the control module 510 outputs the voice (utterance) corresponding to the voice data from the speaker 180.
  • the memory module 530 holds data used by the computer 200 to provide the virtual space 11 to the user 5.
  • the memory module 530 holds spatial information, object information, and user information.
  • the spatial information holds one or more templates defined to provide the virtual space 11.
  • the object information includes a plurality of panoramic images 13 constituting the virtual space 11 and object data for arranging the objects in the virtual space 11.
  • the panoramic image 13 may include a still image and a moving image.
  • the panoramic image 13 may include an image of unreal space and an image of real space. Examples of images in unreal space include images generated by computer graphics.
  • the user information holds a user ID that identifies the user 5.
  • the user ID may be, for example, an IP (Internet Protocol) address or a MAC (Media Access Control) address set in the computer 200 used by the user. In another aspect, the user ID may be set by the user.
  • the user information includes a program for operating the computer 200 as a control device of the HMD system 100.
  • the data and programs stored in the memory module 530 are input by the user 5 of the HMD 120.
  • the processor 210 downloads a program or data from a computer (for example, a server 600) operated by a business operator that provides the content, and stores the downloaded program or data in the memory module 530.
  • the communication control module 540 can communicate with the server 600 and other information communication devices via the network 2.
  • the control module 510 and the rendering module 520 may be implemented using, for example, Unity® provided by Unity Technologies.
  • the control module 510 and the rendering module 520 can also be realized as a combination of circuit elements that realize each process.
  • the processing in the computer 200 is realized by the hardware and the software executed by the processor 210.
  • Such software may be pre-stored in a hard disk or other memory module 530.
  • the software may be stored on a CD-ROM or other computer-readable non-volatile data recording medium and distributed as a program product. Alternatively, the software may be provided as a downloadable program product by an information provider connected to the Internet or other networks.
  • Such software is read from the data recording medium by an optical disk drive or other data reader, or downloaded from the server 600 or another computer via the communication control module 540, and then temporarily stored in the storage module.
  • the software is read from the storage module by the processor 210 and stored in RAM in the form of an executable program.
  • the processor 210 executes the program.
  • FIG. 11 is a sequence chart showing a part of the processing performed in the HMD set 110 according to an embodiment.
  • step S1110 the processor 210 of the computer 200 specifies the virtual space data as the control module 510 and defines the virtual space 11.
  • step S1120 the processor 210 initializes the virtual camera 14. For example, the processor 210 arranges the virtual camera 14 at a predetermined center point 12 in the virtual space 11 in the work area of the memory, and directs the line of sight of the virtual camera 14 in the direction in which the user 5 is facing.
  • step S1130 the processor 210 generates the view image data for displaying the initial view image as the rendering module 520.
  • the generated view image data is output to the HMD 120 by the communication control module 540.
  • step S1132 the monitor 130 of the HMD 120 displays the view image based on the view image data received from the computer 200.
  • the user 5 wearing the HMD 120 can recognize the virtual space 11 when he / she visually recognizes the visual field image.
  • step S1134 the HMD sensor 410 detects the position and tilt of the HMD 120 based on a plurality of infrared rays transmitted from the HMD 120.
  • the detection result is output to the computer 200 as motion detection data.
  • step S1140 the processor 210 specifies the visual field direction of the user 5 wearing the HMD 120 based on the position and the inclination included in the motion detection data of the HMD 120.
  • step S1150 the processor 210 executes the application program and arranges the object in the virtual space 11 based on the instruction included in the application program.
  • step S1160 the controller 300 detects the operation of the user 5 based on the signal output from the motion sensor 420, and outputs the detection data representing the detected operation to the computer 200.
  • the operation of the controller 300 by the user 5 may be detected based on an image from a camera arranged around the user 5.
  • step S1170 the processor 210 detects the operation of the controller 300 by the user 5 based on the detection data acquired from the controller 300.
  • step S1180 the processor 210 generates view image data based on the operation of the controller 300 by the user 5.
  • the generated view image data is output to the HMD 120 by the communication control module 540.
  • step S1190 the HMD 120 updates the view image based on the received view image data, and displays the updated view image on the monitor 130.
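The sequence of FIG. 11 can be summarized, purely as an illustrative sketch, as the loop below; every method name is hypothetical and stands in for the processing of the corresponding step.

```python
def hmd_main_loop(computer, hmd, controller):
    computer.define_virtual_space()                     # S1110
    computer.initialize_virtual_camera()                # S1120
    hmd.display(computer.render_initial_view())         # S1130 / S1132
    while True:
        pose = hmd.detect_position_and_tilt()           # S1134
        computer.update_view_direction(pose)            # S1140
        computer.place_objects()                        # S1150
        detection = controller.detect_user_operation()  # S1160
        computer.apply_controller_input(detection)      # S1170
        hmd.display(computer.render_view())             # S1180 / S1190
```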
  • an avatar object according to the present embodiment will be described with reference to FIGS. 12A and 12B.
  • the user of the HMD set 110A is referred to as a user 5A
  • the user of the HMD set 110B is referred to as a user 5B
  • the user of the HMD set 110C is referred to as a user 5C
  • the user of the HMD set 110D is referred to as a user 5D.
  • A is added to the reference code of each component related to the HMD set 110A
  • B is added to the reference code of each component related to the HMD set 110B
  • C is added to the reference code of each component related to the HMD set 110C
  • D is added to the reference code of each component related to the HMD set 110D.
  • the HMD 120A is included in the HMD set 110A.
  • FIG. 12A is a schematic diagram showing a situation in which each HMD 120 provides the virtual space 11 to the user 5 in the network 2.
  • the computers 200A to 200D provide the virtual spaces 11A to 11D to the users 5A to 5D via the HMDs 120A to 120D, respectively.
  • the virtual space 11A and the virtual space 11B are composed of the same data.
  • the computer 200A and the computer 200B share the same virtual space.
  • in the virtual space 11A and the virtual space 11B, the avatar object 6A of the user 5A and the avatar object 6B of the user 5B exist.
  • the avatar object 6A in the virtual space 11A and the avatar object 6B in the virtual space 11B are each depicted wearing the HMD 120, but this is only for the sake of clarity; in reality, these objects do not wear the HMD 120.
  • the processor 210A may arrange the virtual camera 14A for capturing the field of view image 17A of the user 5A at the eye position of the avatar object 6A.
  • FIG. 12B is a diagram showing a field image 17A of the user 5A in FIG. 12A.
  • the view image 17A is an image displayed on the monitor 130A of the HMD 120A.
  • the field of view image 17A is an image generated by the virtual camera 14A.
  • the avatar object 6B of the user 5B is displayed on the view image 17A.
  • the avatar object 6A of the user 5A is also displayed in the field of view image of the user 5B.
  • the user 5A can communicate with the user 5B via the virtual space 11A by dialogue. More specifically, the voice of the user 5A acquired by the microphone 170A is transmitted to the HMD 120B of the user 5B via the server 600 and output from the speaker 180B provided in the HMD 120B. The voice of the user 5B is transmitted to the HMD 120A of the user 5A via the server 600 and output from the speaker 180A provided in the HMD 120A.
  • the operation of the user 5B (the operation of the HMD 120B and the operation of the controller 300B) is reflected in the avatar object 6B arranged in the virtual space 11A by the processor 210A.
  • the user 5A can recognize the operation of the user 5B through the avatar object 6B.
  • FIG. 13 is a sequence chart showing a part of the processing executed in the HMD system 100 according to the present embodiment.
  • although the HMD set 110D is not shown in FIG. 13, the HMD set 110D operates in the same manner as the HMD sets 110A, 110B, and 110C.
  • A is added to the reference code of each component related to the HMD set 110A
  • B is added to the reference code of each component related to the HMD set 110B
  • C is added to the reference code of each component related to the HMD set 110C. It shall be attached and D shall be attached to the reference code of each component with respect to the HMD set 110D.
  • step S1310A the processor 210A in the HMD set 110A acquires the avatar information for determining the operation of the avatar object 6A in the virtual space 11A.
  • This avatar information includes information about the avatar such as motion information, face tracking data, and voice data.
  • the motion information includes information indicating a temporal change in the position and inclination of the HMD 120A, information indicating the hand motion of the user 5A detected by the motion sensor 420A or the like, and the like.
  • the face tracking data includes data for specifying the position and size of each part of the face of the user 5A. Examples of the face tracking data include data showing the movement of each organ constituting the face of the user 5A and line-of-sight data.
  • Examples of the voice data include data indicating the voice of the user 5A acquired by the microphone 170A of the HMD 120A.
  • the avatar information may include information for specifying the avatar object 6A or the user 5A associated with the avatar object 6A, information for specifying the virtual space 11A in which the avatar object 6A exists, and the like.
  • Examples of the information that identifies the avatar object 6A and the user 5A include a user ID.
  • Information that identifies the virtual space 11A in which the avatar object 6A exists includes a room ID.
  • the processor 210A transmits the avatar information acquired as described above to the server 600 via the network 2.
  • step S1310B the processor 210B in the HMD set 110B acquires the avatar information for determining the operation of the avatar object 6B in the virtual space 11B and transmits it to the server 600, as in the process in step S1310A.
  • step S1310C the processor 210C in the HMD set 110C acquires the avatar information for determining the operation of the avatar object 6C in the virtual space 11C and transmits it to the server 600.
  • step S1320 the server 600 temporarily stores the avatar information received from each of the HMD set 110A, the HMD set 110B, and the HMD set 110C.
  • the server 600 integrates the avatar information of all the users (users 5A to 5C in this example) associated with the common virtual space 11 based on the user ID, the room ID, and the like included in each avatar information. Then, the server 600 transmits the integrated avatar information to all the users associated with the virtual space 11 at a predetermined timing. As a result, the synchronization process is executed.
  • the HMD set 110A, the HMD set 110B, and the HMD set 110C can share each other's avatar information at substantially the same timing.
  • each HMD set 110A to 110C executes the process of steps S1330A to S1330C based on the avatar information transmitted from the server 600 to each HMD set 110A to 110C.
  • the process of step S1330A corresponds to the process of step S1180 in FIG. 11.
  • step S1330A the processor 210A in the HMD set 110A updates the information of the avatar object 6B and the avatar object 6C of the other users 5B and 5C in the virtual space 11A. Specifically, the processor 210A updates the position and orientation of the avatar object 6B in the virtual space 11 based on the motion information included in the avatar information transmitted from the HMD set 110B. For example, the processor 210A updates the information (position, orientation, etc.) of the avatar object 6B included in the object information stored in the memory module 530. Similarly, the processor 210A updates the information (position, orientation, etc.) of the avatar object 6C in the virtual space 11 based on the motion information included in the avatar information transmitted from the HMD set 110C.
  • step S1330B the processor 210B in the HMD set 110B updates the information of the avatar objects 6A and 6C of the users 5A and 5C in the virtual space 11B in the same manner as the processing in step S1330A.
  • step S1330C the processor 210C in the HMD set 110C updates the information of the avatar objects 6A, 6B of the users 5A, 5B in the virtual space 11C.
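A rough sketch of the synchronization of steps S1310 to S1330, assuming a hypothetical server-side API: avatar information tagged with a user ID and room ID is grouped per shared virtual space and broadcast back to every associated user.

```python
def synchronize_avatar_info(server, received_infos):
    """Integrate avatar information per room and send it to all associated users."""
    rooms = {}
    for info in received_infos:                    # each info carries user/room IDs
        rooms.setdefault(info["room_id"], []).append(info)
    for room_id, infos in rooms.items():
        for user_id in server.users_in_room(room_id):   # hypothetical lookup
            server.send(user_id, infos)            # every HMD set gets the same data
```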
  • FIG. 14 is a block diagram showing a detailed configuration of a module of the computer 200 according to an embodiment.
  • the control module 510 includes a virtual object generation module 1421, a virtual camera control module 1422, an operation object control module 1423, an avatar object control module 1424, a motion detection module 1425, a collision detection module 1426, and a virtual object control module 1427.
  • the virtual object generation module 1421 generates various virtual objects in the virtual space 11.
  • the virtual objects may include, for example, landscapes including forests and mountains, animals, and the like that are arranged as the story of the game progresses.
  • the virtual objects may include an avatar object, an operation object, a stage object, and a UI (User Interface) object.
  • the virtual camera control module 1422 controls the behavior of the virtual camera 14 in the virtual space 11.
  • the virtual camera control module 1422 controls, for example, the arrangement position of the virtual camera 14 in the virtual space 11 and the orientation (tilt) of the virtual camera 14.
  • the operation object control module 1423 controls the operation object for accepting the operation of the user 5 in the virtual space 11.
  • the user 5 operates, for example, a virtual object arranged in the virtual space 11.
  • the operation object may include, for example, a hand object (virtual hand) corresponding to the hand of the user 5 wearing the HMD 120.
  • the operation object may correspond to the hand portion of the avatar object described below.
  • the avatar object control module 1424 reflects the movement of the HMD 120 detected by the HMD sensor 410 in the avatar object. For example, the avatar object control module 1424 detects that the HMD 120 is tilted and generates data for tilting and arranging the avatar object. In one aspect, the avatar object control module 1424 reflects the movement of the controller 300 on the avatar object. In this case, the controller 300 includes a motion sensor for detecting the movement of the controller 300, an acceleration sensor, a plurality of light emitting elements (for example, infrared LEDs), and the like.
  • the avatar object control module 1424 reflects the movement of the hand detected by the motion detection module 1425 on the movement of the operation object (virtual hand that is a part of the hand of the avatar object) arranged in the virtual space 11.
  • the avatar object control module 1424 reflects the movement of the facial organ detected by the motion detection module 1425 on the face of the avatar object arranged in the virtual space 11. That is, the avatar object control module 1424 reflects the movement of the face of the user 5 on the avatar object.
  • the motion detection module 1425 detects the motion of the user 5.
  • the motion detection module 1425 detects the motion of the hand of the user 5, for example, according to the output of the controller 300.
  • the motion detection module 1425 detects the movement of the user 5's body according to the output of the motion sensor attached to the user's body, for example.
  • the motion detection module 1425 can also detect the motion of the facial organs of the user 5.
  • the collision detection module 1426 detects a collision when each of the virtual objects arranged in the virtual space 11 collides with another virtual object.
  • the collision detection module 1426 can detect, for example, the timing at which one virtual object and another virtual object touch each other.
  • the collision detection module 1426 can detect the timing when one virtual object and another virtual object are separated from the touching state.
  • the collision detection module 1426 can also detect that a virtual object is in contact with another virtual object.
  • for example, when the operation object touches another virtual object, the collision detection module 1426 detects that the operation object and the other object have touched each other.
  • the collision detection module 1426 executes predetermined processing based on these detection results.
  • the virtual object control module 1427 controls the behavior of virtual objects other than the avatar object in the virtual space 11. As an example, the virtual object control module 1427 transforms a virtual object. As another example, the virtual object control module 1427 changes the placement position of the virtual object. As another example, the virtual object control module 1427 moves a virtual object.
  • when the collision detection module 1426 detects that a virtual object and the operation object (virtual hand) are in contact and the pressing of a predetermined button of the controller 300 is detected, the virtual object control module 1427 associates the virtual object with the operation object. At this time, the avatar object control module 1424 may control the operation object (virtual hand) to perform a motion targeting the virtual object.
  • as the targeting motion, for example, a motion such as holding, grasping, placing on the palm, or pinching may be performed.
  • while the virtual object and the operation object are associated, the virtual object control module 1427 controls the virtual object so as to move together with the movement of the operation object.
  • the virtual object control module 1427 cancels the association between the virtual object and the operation object when the pressing of the predetermined button of the controller 300 is detected in the state where the virtual object and the operation object are associated with each other.
  • the avatar object control module 1424 may control the operation object (virtual hand) to release the virtual object.
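The grab-and-release behavior of the bullets above might look like the following sketch; the class and its methods are invented for illustration and assume a single held object per virtual hand.

```python
class GrabController:
    """Associate a virtual object with the virtual hand while the button toggles it."""
    def __init__(self):
        self.held = None

    def on_button_press(self, hand, collision_detector):
        if self.held is None:
            target = collision_detector.object_touching(hand)  # contact required
            if target is not None:
                self.held = target                 # associate object with hand
                hand.play_motion("grasp")          # e.g. hold, grasp, pinch
        else:
            hand.play_motion("release")            # next press cancels association
            self.held = None

    def on_hand_move(self, hand):
        if self.held is not None:
            self.held.position = hand.position     # object moves with the hand
```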
  • FIG. 15 is a diagram showing an outline of the configuration of the game system 1500 according to the present embodiment.
  • the game system 1500 is a system for providing a game to a plurality of users.
  • the game system 1500 includes a server 600, an HMD set 110B, user terminals 800A, 800C, 800D, and a network 2.
  • the HMD set 110B and each of the user terminals 800A, 800C, and 800D are configured to be communicable with the server 600 via the network 2.
  • the user terminals 800A, 800C, and 800D are collectively referred to as a user terminal 800.
  • the number of user terminals 800 constituting the game system 1500 is not limited to three, and may be two or less or four or more.
  • the HMD set 110B constitutes an embodiment of the character control device in the present invention.
  • the HMD set 110B will also be referred to as a character control device 110B.
  • At least one character control device 110B is provided in the game system 1500.
  • a plurality of character control devices 110B may be provided depending on the number of user terminals 800 that use the service provided by the server 600.
  • One character control device 110B may be provided for one user terminal 800.
  • One character control device 110B may be provided for a plurality of user terminals 800.
  • the user of the HMD set 110B is referred to as the user 5B.
  • the user of the user terminal 800A is referred to as a user 8A
  • the user of the user terminal 800C is referred to as a user 8C
  • the user of the user terminal 800D is referred to as a user 8D.
  • Users 8A, 8C, and 8D are also collectively referred to as user 8.
  • A is attached to the reference code of each component related to the user terminal 800A
  • B is attached to the reference code of each component related to the HMD set 110B
  • C is attached to the reference code of each component related to the user terminal 800C
  • D is attached to the reference code of each component related to the user terminal 800D.
  • the game system 1500 is a system for streaming and distributing a program that the avatar object 6B associated with the user 5B demonstrates in the virtual space 11B from the character control device 110B to each user terminal 800.
  • the avatar object 6B is an example of the first character and the second character in the present invention.
  • the virtual space 11B is shared between the character control device 110B and the plurality of user terminals 800.
  • the virtual spaces 11A, 11C, and 11D are synchronized with the virtual space 11B, respectively.
  • when it is not necessary to particularly distinguish the virtual spaces 11A, 11B, 11C, and 11D, they are simply described as "virtual space 11".
  • the user 5B controls the avatar object 6B in the character control device 110B to advance the program of the avatar object 6B.
  • the user 8A watches the delivered program of the avatar object 6B through the user terminal 800A.
  • the user 8C watches the delivered program of the avatar object 6B through the user terminal 800C.
  • the user 8D watches the delivered program of the avatar object 6B through the user terminal 800D.
  • the server 600 (computer, information processing device) may be a general-purpose computer such as a workstation or a personal computer.
  • the hardware configuration of the server 600 is as described with reference to FIG.
  • the character control device 110B may be a computer such as a server, a desktop personal computer, a laptop computer, or a tablet, and a computer group in which these are combined.
  • the hardware configuration of the character control device 110B is as described with reference to FIG.
  • the network 2 to which each device constituting the game system 1500 is connected is composed of the Internet and various mobile communication systems constructed by wireless base stations (not shown). Examples of the mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) that can be connected to the Internet via a predetermined access point.
  • FIG. 16 is a diagram showing a hardware configuration of the user terminal 800.
  • the user terminal 800 (computer, information processing device) may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer.
  • the user terminal 800 may be a game device suitable for game play.
  • the user terminal 800 includes a processor 810, a memory 820, a storage 830, a communication interface (IF) 850, an input / output IF 840, a touch screen 870 (display unit), a camera 860, and a distance measuring sensor 880.
  • These configurations included in the user terminal 800 are electrically connected to each other by a communication bus.
  • the user terminal 800 may be provided with an input / output IF 840 capable of connecting a display (display unit) configured separately from the user terminal 800 main body in place of or in addition to the touch screen 870.
  • the user terminal 800 may be configured to be communicable with one or more controllers 1020.
  • the controller 1020 establishes communication with the user terminal 800, for example, according to a communication standard such as Bluetooth®.
  • the controller 1020 may have one or more buttons or the like, and transmits an output value based on an input operation of the user 8 to the buttons or the like to the user terminal 800.
  • the controller 1020 may have various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of the various sensors to the user terminal 800.
  • the controller 1020 may have the camera 860 and the distance measuring sensor 880.
  • the user terminal 800 causes the user 8 who uses the controller 1020 to input user identification information such as the name or login ID of the user 8 via the controller 1020, for example, at the start of the game.
  • the user terminal 800 can associate the controller 1020 with the user 8, and can determine which user 8 the output value belongs to based on the source of the received output value (controller 1020). Can be identified.
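As a sketch of the association just described (all identifiers are illustrative assumptions), the terminal can keep a table from controller to user and resolve every received output value through it:

```python
class ControllerRegistry:
    """Map each controller 1020 to the user 8 who entered identification via it."""
    def __init__(self):
        self._user_by_controller = {}

    def register(self, controller_id, user_name):
        self._user_by_controller[controller_id] = user_name   # at game start

    def resolve(self, controller_id, output_value):
        user = self._user_by_controller.get(controller_id)    # source -> user 8
        return user, output_value
```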
  • each user 8 grips his or her own controller 1020, so that multiplayer can be realized on the single user terminal 800 without the user terminal 800 communicating with other devices such as the server 600 via the network 2.
  • each user terminal 800 may also communicate according to a wireless standard such as the wireless LAN (Local Area Network) standard (a communication connection made without going through the server 600), thereby realizing local multiplayer with a plurality of user terminals 800.
  • the user terminal 800 may further include at least a part of various functions described later described in the server 600.
  • the plurality of user terminals 800 may be provided with various functions described later described in the server 600 in a distributed manner.
  • the user terminal 800 may communicate with the server 600.
  • information indicating a play result such as a result or a win or loss in a certain game may be associated with user identification information and transmitted to the server 600.
  • the controller 1020 may be configured to be detachable from the user terminal 800.
  • a coupling portion with the controller 1020 may be provided on at least one surface of the housing of the user terminal 800.
  • the user terminal 800 may accept the attachment of a storage medium 1030 such as an external memory card via the input / output IF840. As a result, the user terminal 800 can read the program and data recorded in the storage medium 1030.
  • the program recorded on the storage medium 1030 is, for example, a game program.
  • the user terminal 800 may store, in the memory 820, the game program acquired by communicating with an external device such as the server 600, or may store, in the memory 820, the game program acquired by reading from the storage medium 1030.
  • the user terminal 800 includes a communication IF850, an input / output IF840, a touch screen 870, a camera 860, a distance measuring sensor 880, and a speaker 890 as an example of a mechanism for inputting information to the user terminal 800.
  • Each of the above-mentioned parts as an input mechanism can be regarded as an operation part configured to accept the input operation of the user 8.
  • when the operation unit is configured by at least one of the camera 860 and the distance measuring sensor 880, the operation unit detects an object 1010 in the vicinity of the user terminal 800 and identifies an input operation from the detection result of the object.
  • for example, the hand of the user 8, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result.
  • in this case, the user terminal 800 identifies and accepts a gesture (a series of movements of the hand of the user 8) detected based on the captured image as an input operation of the user 8.
  • the captured image may be a still image or a moving image.
  • the user terminal 800 identifies and accepts the operation of the user 8 performed on the input unit 8701 of the touch screen 870 as the input operation of the user 8.
  • when the operation unit is configured by the communication IF 850, the user terminal 800 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as an input operation of the user 8.
  • when the operation unit is configured by the input / output IF 840, a signal output from an input device (not shown) different from the controller 1020 connected to the input / output IF 840 is identified and accepted as an input operation of the user 8.
  • the processor 810 controls the operation of the entire user terminal 800.
  • the processor 610 controls the operation of the entire server 600.
  • the processor 210 controls the operation of the entire character control device 110B.
  • the processors 810, 610, and 210 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
  • the processor 810 reads the program from the storage 830 and expands it to the memory 820.
  • the processor 610 reads the program from the storage 630 and expands it into the memory 620.
  • the processor 210 reads a program from the storage 230 described later and expands it into the memory 220 described later.
  • Processor 810, processor 610 and processor 210 execute the expanded program.
  • the memories 820, 620 and 220 are the main storage devices.
  • the memories 820, 620, and 220 are composed of storage devices such as ROM (Read Only Memory) and RAM (Random Access Memory).
  • the memory 820 provides a work area to the processor 810 by temporarily storing the program and various data read from the storage 830 described later by the processor 810.
  • the memory 820 also temporarily stores various data generated while the processor 810 is operating according to the program.
  • the memory 620 provides a work area for the processor 610 by temporarily storing various programs and data read from the storage 630 by the processor 610.
  • the memory 620 also temporarily stores various data generated while the processor 610 is operating according to the program.
  • the memory 220 provides a work area for the processor 210 by temporarily storing various programs and data read from the storage 230 by the processor 210.
  • the memory 220 also temporarily stores various data generated while the processor 210 is operating according to the program.
  • the program may be a game program for realizing the game by the user terminal 800.
  • the program may be a game program for realizing the game in cooperation with the user terminal 800 and the server 600.
  • the program may be a game program for realizing the game in cooperation with the user terminal 800, the server 600, and the character control device 110B.
  • the game realized by the cooperation of the user terminal 800 and the server 600, and the game realized by the cooperation of the user terminal 800, the server 600, and the character control device 110B, may each be, as an example, a game that runs on a browser started on the user terminal 800.
  • the program may be a game program for realizing the game by the cooperation of a plurality of user terminals 800.
  • the various data include data related to the game such as user information and game information, and instructions or notifications to be transmitted / received between the devices of the game system 1500.
  • Storages 830, 630 and 230 are auxiliary storage devices.
  • the storages 830, 630 and 230 are composed of a storage device such as a flash memory or an HDD (Hard Disk Drive).
  • Various data related to the game are stored in the storages 830, 630, and 230.
  • the communication IF850 controls the transmission and reception of various data in the user terminal 800.
  • the communication IF 650 controls the transmission / reception of various data in the server 600.
  • the communication IF 250 controls the transmission / reception of various data in the character control device 110B.
  • the communication IFs 850, 650, and 250 control, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and communication using short-range wireless communication or the like.
  • the input / output IF840 is an interface for the user terminal 800 to accept data input, and an interface for the user terminal 800 to output data.
  • the input / output IF840 may input / output data via USB (Universal Serial Bus) or the like.
  • the input / output IF 840 may include, for example, a physical button, a camera, a microphone, a speaker, or the like of the user terminal 800.
  • the input / output IF 640 of the server 600 is an interface for the server 600 to receive data input, and an interface for the server 600 to output data.
  • the input / output IF 640 may include, for example, an input unit that is an information input device such as a mouse or a keyboard, and a display unit that is a device that displays and outputs an image.
  • the input / output IF 240 of the character control device 110B is an interface for the character control device 110B to receive data input, and an interface for the character control device 110B to output data.
  • the input / output IF 240 may include, for example, a connection portion for transmitting and receiving data to and from information input devices such as a mouse, keyboard, stick, or lever, a display 430 such as a liquid crystal display that displays and outputs images, and peripheral devices (the HMD 120, the HMD sensor 410, the motion sensor 420, the controller 300, and the like).
  • the touch screen 870 of the user terminal 800 is an electronic component that combines an input unit 8701 and a display unit 8702.
  • the input unit 8701 is, for example, a touch-sensitive device, and is configured by, for example, a touch pad.
  • the display unit 8702 is composed of, for example, a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
  • the input unit 8701 has a function of detecting the position where an operation of the user 8 (mainly a physical contact operation such as a touch operation, a slide operation, a swipe operation, or a tap operation) is input to the input surface and transmitting information indicating that position as an input signal.
  • the input unit 8701 may include a touch sensing unit (not shown).
  • the touch sensing unit may adopt any method such as a capacitance method or a resistance film method.
  • the display 430 of the character control device 110B may be configured by a touch screen in which an input unit and a display unit are combined, similar to the touch screen 870 of the user terminal 800.
  • the user terminal 800 may include one or more sensors for specifying the holding posture of the user terminal 800.
  • This sensor may be, for example, an acceleration sensor, an angular velocity sensor, or the like.
  • the processor 810 can also specify the holding posture of the user terminal 800 from the output of the sensor and perform processing according to the holding posture.
  • for example, when the user terminal 800 is held vertically, the processor 810 may perform a vertical screen display in which a vertically long image is displayed on the display unit 8702, and when the user terminal 800 is held horizontally, it may perform a horizontal screen display in which a horizontally long image is displayed on the display unit. In this way, the processor 810 may be able to switch between the vertical screen display and the horizontal screen display according to the holding posture of the user terminal 800.
  • the speaker 890 outputs voice based on voice data.
  • the camera 860 includes an image sensor and the like, and generates a captured image by converting the incident light incident from the lens into an electric signal.
  • the distance measuring sensor 880 is a sensor that measures the distance to the object to be measured.
  • the ranging sensor 880 includes, for example, a light source that emits pulsed light and a light receiving element that receives the light.
  • the distance measuring sensor 880 measures the distance to the object to be measured from the timing at which light is emitted from the light source and the timing at which the light receiving element receives the reflected light produced when the emitted light is reflected by the object to be measured.
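The time-of-flight principle just described reduces to halving the round-trip travel time of the light pulse; a minimal sketch follows (the sensor's actual interface is not specified in this disclosure, so the timestamps are assumed inputs):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def time_of_flight_distance(emit_time_s, receive_time_s):
    """Distance to the measured object from one emitted/received light pulse."""
    round_trip_s = receive_time_s - emit_time_s   # light travels out and back
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# e.g. a 20 ns round trip corresponds to roughly 3 m
print(time_of_flight_distance(0.0, 20e-9))
```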
  • the ranging sensor 880 may have a light source that emits light having directivity.
  • the camera 860 and the distance measuring sensor 880 may be provided, for example, on the side surface of the housing of the user terminal 800.
  • a ranging sensor 880 may be provided in the vicinity of the camera 860.
  • as the camera 860, for example, an infrared camera can be used.
  • the camera 860 may be provided with a lighting device that irradiates infrared rays, a filter that blocks visible light, and the like. This makes it possible to further improve the detection accuracy of the object based on the captured image of the camera 860 regardless of whether it is outdoors or indoors.
  • the processor 810 may perform one or more of the processes shown in the following (1) to (5), for example, on the captured image of the camera 860.
  • (1) the processor 810 performs image recognition processing on the captured image of the camera 860 to identify whether or not the captured image includes the hand of the user 8.
  • the processor 810 may use, for example, a technique such as pattern matching as the analysis technique adopted in the above-mentioned image recognition process.
  • (2) the processor 810 detects the gesture of the user 8 from the shape of the hand of the user 8.
  • the processor 810 specifies, for example, the number of fingers of the user 8 (the number of extended fingers) from the shape of the hand of the user 8 detected from the captured image.
  • the processor 810 further identifies the gesture performed by the user 8 from the number of identified fingers.
  • for example, the processor 810 determines that the user 8 has made a "par" (paper) gesture when the number of fingers is five. When the number of fingers is zero (no finger is detected), the processor 810 determines that the user 8 has made a "goo" (rock) gesture. When the number of fingers is two, the processor 810 determines that the user 8 has made a "choki" (scissors) gesture. (3) The processor 810 performs image recognition processing on the captured image of the camera 860 to detect whether only the index finger of the user 8 is raised and whether the raised finger is flicked.
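A direct transcription of the finger-count rule above into code, as a sketch; the image-recognition step that yields the finger count is assumed to exist elsewhere.

```python
def classify_gesture(extended_fingers):
    """Rock-paper-scissors: "goo" = rock, "choki" = scissors, "par" = paper."""
    if extended_fingers == 0:
        return "goo"      # no fingers detected
    if extended_fingers == 2:
        return "choki"
    if extended_fingers == 5:
        return "par"
    return None           # no gesture recognized for other counts
```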
  • (4) the processor 810 detects an object 1010 (such as the hand of the user 8) in the vicinity of the user terminal 800 and the distance between the object 1010 and the user terminal 800, based on at least one of the image recognition result of the captured image of the camera 860 and the output value of the distance measuring sensor 880.
  • for example, depending on the size of the hand of the user 8 identified from the captured image of the camera 860, the processor 810 detects whether the hand of the user 8 is near the user terminal 800 (for example, at a distance less than a predetermined value) or far from it (for example, at a distance greater than or equal to a predetermined value).
  • the processor 810 may also detect whether the hand of the user 8 is approaching or moving away from the user terminal 800. (5) When, based on the image recognition result of the captured image of the camera 860, the distance between the user terminal 800 and the hand of the user 8 is found to change while the hand is being detected, the processor 810 recognizes that the user 8 is waving the hand in the shooting direction of the camera 860. When an object is repeatedly detected and not detected by the distance measuring sensor 880, whose directivity is stronger than the shooting range of the camera 860, the processor 810 recognizes that the user 8 is waving the hand in a direction orthogonal to the shooting direction of the camera.
  • by image recognition on the captured image of the camera 860, the processor 810 determines whether or not the user 8 is closing the hand (that is, whether the gesture is "goo" or another gesture such as "par"). The processor 810 also detects the shape of the hand of the user 8, how the user 8 is moving the hand, and whether the hand is approaching or moving away from the user terminal 800. Such operations can correspond to operations using a pointing device such as a mouse or a touch panel. For example, the user terminal 800 moves a pointer on the touch screen 870 in response to the movement of the hand of the user 8 and detects the "goo" gesture of the user 8.
  • in this case, the user terminal 800 recognizes that the user 8 is continuing a selection operation.
  • the continuation of the selection operation corresponds to, for example, the state in which the mouse button is kept pressed after being clicked, or the state in which the touch panel continues to be touched after a touch-down operation.
  • when the user 8 moves the hand while continuing the selection operation, the user terminal 800 can also recognize such a series of gestures as an operation corresponding to a swipe operation (or a drag operation).
  • when the user terminal 800 detects a gesture of the user 8 flicking a finger based on the detection result of the hand of the user 8 in the captured image of the camera 860, the user terminal 800 may recognize the gesture as an operation corresponding to a click of the mouse or a tap operation on the touch panel.
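The mapping from hand gestures to pointing-device semantics sketched in the preceding bullets could be expressed as below; the event names and state handling are illustrative assumptions, not from this disclosure.

```python
def gesture_to_pointer_event(prev_gesture, gesture, hand_position):
    """Translate camera-detected gestures into mouse/touch-panel style events."""
    if prev_gesture != "goo" and gesture == "goo":
        return ("press", hand_position)    # closing the hand starts a selection
    if prev_gesture == "goo" and gesture == "goo":
        return ("drag", hand_position)     # moving while closed = swipe / drag
    if prev_gesture == "goo" and gesture != "goo":
        return ("release", hand_position)  # opening the hand ends the selection
    return ("move", hand_position)         # otherwise just track the pointer
```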
  • the game executed by the game system 1500 according to the first embodiment includes a game part in which one or more characters appear and at least one of the characters is operated.
  • the character appearing in the game may be a player character (hereinafter, PC) or a non-player character (hereinafter, NPC).
  • the PC is a character that can be directly operated by the user 8.
  • An NPC is a character that operates according to a game program and operation instruction data, that is, a character that cannot be directly operated by the user 8. In the following, when it is not necessary to distinguish between the two, "character" is used as a generic term.
  • This game includes a live distribution part in which the virtual space 11 is shared between the character control device 110B and the plurality of user terminals 800, and the avatar object 6B operates in response to the operation of the user 5B of the character control device 110B.
  • the operation of the user 5B includes information input via the HMD 120, the HMD sensor 410, the motion sensor 420, and the controller 300 according to the movement of the user 5B.
  • the operation of the user 5B includes an operation by the user 5B for a button of the controller 300.
  • the operation of the user 5B may include an operation input via various input devices (not shown) connected to the computer 200.
  • a first virtual space 11_1 and a second virtual space 11_2 different from the first virtual space 11_1 are provided. Details of the first virtual space 11_1 and the second virtual space 11_2 will be described later.
  • operation instruction data representing the operation of the character is generated by the character control device 110B in response to the operation of the user 5B. Further, the operation instruction data is supplied to the user terminal 800 running the game from a device other than the user terminal 800 at an arbitrary timing. The operation instruction data may be supplied from the character control device 110B to the user terminal 800 via the server 600, for example.
  • the user terminal 800 analyzes (renders) the operation instruction data by using the reception of the operation instruction data as a trigger.
  • the live distribution part is a part in which the user terminal 800 presents to the user 8 a character that operates according to the above-mentioned analyzed operation instruction data in real time. As a result, the user 8 can feel the reality as if the character really exists, and can further immerse himself in the game world and enjoy the game.
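The reception-triggered flow of the live distribution part can be pictured, under the assumption of a hypothetical terminal API, as:

```python
def live_distribution_loop(terminal):
    """Render the character in real time, triggered by operation instruction data."""
    while terminal.in_live_part():
        data = terminal.receive_operation_instruction()  # arrival is the trigger
        motion = terminal.analyze(data)                  # the rendering/analysis step
        terminal.character.apply(motion)                 # character follows user 5B
        terminal.play_voice(data.voice)                  # voice of the performer
        terminal.draw_frame()
```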
  • This game may be composed of a plurality of play parts.
  • the character properties may differ from part to part, such as one character being a PC in one part and an NPC in another part.
  • the genre of the game executed in the game system 1500 is not limited to a specific genre.
  • the game system 1500 can execute games of any genre, for example, sports-themed games such as tennis, table tennis, dodgeball, baseball, soccer, and hockey, as well as puzzle games, quiz games, RPGs (Role-Playing Games), adventure games, shooting games, simulation games, training games, and action games.
  • the play mode of this game is not limited to a specific play mode.
  • the game system 1500 may execute a game of any play form, for example, a single-play game by a single user 8, a multi-play game by a plurality of users 8, and, among multi-play games, a battle game in which a plurality of users 8 play against each other or a cooperative play game in which a plurality of users 8 cooperate.
  • the distribution terminal, the first distribution terminal, and the second distribution terminal in the present invention are also described as a character control device, a first character control device, and a second character control device as an example.
  • the user 5B is applied as an example of the distributor in the present invention
  • the first distributor and the second distributor in the present invention are described as the first user and the second user as an example.
  • the viewing terminal in the present invention is also described as a user terminal as an example.
  • the user 8 is applied as an example of the viewer in the present invention.
  • the avatar object, the first avatar object, and the second avatar object in the present invention are described as a character, a first character, and a second character as an example.
  • the first game part in the present invention is also described as a live game progress part as an example.
  • the second game part in the present invention is also described as a third play part as an example.
  • the third game part in the present invention is also described as a live distribution part as an example.
  • the operation information in the present invention is also described as operation instruction data as an example.
  • as an example of the operation information, the operation instruction data transmitted / received in the live game progress part is applied.
  • the operation indicated by the first operation information in the present invention is also described as the first operation or the second operation as an example.
  • as another example of the operation information, the operation instruction data transmitted / received in the live distribution part is applied.
  • as examples of the condition in the present invention, the first condition in the first mini-game and the second condition in the second mini-game are applied.
  • as another example, the third problem in the third play part is applied.
  • as the first condition in the present invention, a condition involving the combination of the first UI objects in a state where it can be recognized that the selection object has been selected is applied, as an example.
  • as the second condition, the condition that bingo is established in the bingo game, which is an example of the second mini-game, is applied, as an example; this is a condition involving the combination of the designated choice object and the first UI object selected by the viewer.
  • the first object in the present invention is also described as a predetermined object or a progress button object as an example.
  • the second object in the present invention is also described as "the first object according to the first state” or "the second object according to the second state” as an example.
  • the third object in the present invention is also described as a third object or a fourth object as an example.
  • FIG. 17 is a block diagram showing a functional configuration of a user terminal 800, a server 600, and a character control device 110B included in the game system 1500.
  • each of the user terminal 800, the server 600, and the character control device 110B may also include a functional configuration, not shown, necessary for functioning as a general computer and a functional configuration necessary for realizing known functions in a game.
  • the user terminal 800 has a function as an input device for receiving an input operation of the user 8 and a function as an output device for outputting a game image or sound.
  • the user terminal 800 functions as a control unit 8100 and a storage unit 8200 by the cooperation of the processor 810, the memory 820, the storage 830, the communication IF850, the input / output IF840, and the like.
  • the server 600 has a function of communicating with each user terminal 800 and supporting the user terminal 800 in advancing the game. For example, when the user terminal 800 downloads the application related to the game for the first time, the server 600 provides the user terminal 800 with the data to be stored in the user terminal 800 at the start of the first game. For example, the server 600 transmits the operation instruction data for operating the character to the user terminal 800.
  • the operation instruction data may include motion capture data that captures the movement of an actor such as a model in advance, may include voice data that records the voice of an actor such as a voice actor, may include operation history data indicating the history of input operations for causing the character to operate, or may include a motion command group in which commands associated with the above-mentioned series of input operations are arranged in chronological order.
  • the server 600 may have a function of communicating with each user terminal 800 participating in the game and mediating an exchange between the user terminals 800 and a synchronization control function. Further, the server 600 has a function of mediating between the user terminal 800 and the character control device 110B. As a result, the character control device 110B can supply the operation instruction data to the user terminal 800 or a group of a plurality of user terminals 800 in a timely manner without making a mistake in the destination.
  • the server 600 functions as a control unit 6100 and a storage unit 6200 in cooperation with a processor 610, a memory 620, a storage 630, a communication IF 650, an input / output IF 640, and the like.
  • the character control device 110B has a function of generating operation instruction data for instructing the operation of a character in the user terminal 800 and supplying the operation instruction data to the user terminal 800.
  • the character control device 110B serves as a control unit 2100 (control module 510, rendering module 520) and a storage unit 2200 (memory module 530) in cooperation with a processor 210, a memory 220, a storage 230, a communication IF 250, an input / output IF 240, and the like. Function.
  • the storage unit 8200 stores the game program 831.
  • the storage unit 6200 stores the game program 631.
  • the storage unit 2200 stores the game program 231.
  • the game program 831 is a game program executed by the user terminal 800.
  • the game program 631 is a game program executed by the server 600.
  • the game program 231 is a game program executed by the character control device 110B. By operating each device in cooperation based on the game programs 831, 631, and 231, the game related to the game system 1500 is realized.
  • the game programs 831 and 231 may be stored in the storage unit 6200 and downloaded to the storage units 8200 and 2200, respectively.
  • the storage units 8200, 6200, and 2200 store the game information 132 and the user information 133, respectively.
  • the game information 132 is data that the control units 8100, 6100, and 2100 refer to when executing the game programs 831, 631, and 231, respectively.
  • the user information 133 is data related to the account of the user 8. In the storage units 6200 and 2200, the game information 132 and the user information 133 are stored for each user terminal 800.
  • the control unit 6100 comprehensively controls the server 600 by executing the game program 631 stored in the storage unit 6200. For example, the control unit 6100 transmits various data, programs, and the like to the user terminal 800. The control unit 6100 receives a part or all of the game information or the user information from the user terminal 800. When the game is a multiplayer game, the control unit 6100 may receive a request for synchronization of multiplayer from the user terminal 800 and transmit data for synchronization to the user terminal 800. Further, the control unit 6100 communicates with the user terminal 800 and the character control device 110B as necessary to transmit and receive information.
  • the control unit 6100 functions as a progress support unit 6101 and a sharing support unit 6102 according to the description of the game program 631.
  • the control unit 6100 can also function as other functional blocks (not shown) in order to support the progress of the game on the user terminal 800, depending on the nature of the game to be executed.
  • the progress support unit 6101 communicates with the user terminal 800 and supports the user terminal 800 in advancing the various parts included in this game. For example, when the user terminal 800 advances the game, the progress support unit 6101 provides the user terminal 800 with information necessary for advancing the game.
  • the sharing support unit 6102 communicates with a plurality of user terminals 800 and the character control device 110B, and supports the progress of the game and the sharing of the virtual space 11 among these devices. Further, the sharing support unit 6102 may have a function of matching the online user terminal 800 with the character control device 110B. As a result, information can be smoothly transmitted and received between the user terminal 800 and the character control device 110B.
  • the control unit 8100 comprehensively controls the user terminal 800 by executing the game program 831 stored in the storage unit 8200. For example, the control unit 8100 advances the game according to the operations of the game program 831 and the user 8. Further, the control unit 8100 communicates with the server 600 and the character control device 110B as necessary to transmit and receive information while the game is in progress.
  • the control unit 8100 functions as an operation reception unit 8101, a display control unit 8102, a user interface (hereinafter, UI) control unit 8103, an animation generation unit 8104, a game progress unit 8105, a virtual space control unit 8106, and a progress information generation unit 8107 according to the description of the game program 831.
  • the control unit 8100 can also function as other functional blocks (not shown) in order to advance the game, depending on the nature of the game to be executed.
  • the operation reception unit 8101 detects and accepts the input operation of the user 8 to the input unit 8701.
  • the operation reception unit 8101 determines what kind of input operation has been performed from the action exerted by the user 8 on the console via the touch screen 870 and other input / output IFs, and outputs the result to each element of the control unit 8100.
  • the operation receiving unit 8101 receives an input operation for the input unit 8701, detects the coordinates of the input position of the input operation, and specifies the type of the input operation.
  • the operation reception unit 8101 specifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as types of input operations. Further, the operation reception unit 8101 detects that the contact input is canceled from the touch screen 870 when the continuously detected input is interrupted.
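The classification described above could be sketched as follows. This is a minimal illustration; the thresholds used to distinguish taps, swipes, and slides are assumptions, since the source does not specify them.

```python
# Hypothetical input-type classification in the operation reception unit 8101.
TAP_MAX_MS = 200          # presses shorter than this count as taps
SWIPE_MIN_SPEED = 1.0     # px/ms; faster drags count as swipes

def classify(down_xy, up_xy, duration_ms, released):
    """Classify one touch sequence and report its representative coordinates."""
    dx = up_xy[0] - down_xy[0]
    dy = up_xy[1] - down_xy[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if not released:
        return "touch", down_xy           # contact is still ongoing
    if duration_ms <= TAP_MAX_MS and distance < 10:
        return "tap", down_xy             # short press, little movement
    if distance / max(duration_ms, 1) >= SWIPE_MIN_SPEED:
        return "swipe", up_xy             # fast drag
    return "slide", up_xy                 # slower continuous drag
```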
  • the UI control unit 8103 controls the UI object to be displayed on the display unit 8702 in order to construct the UI.
  • the UI object is a tool for the user 8 to make an input necessary for the progress of the game to the user terminal 800, or a tool for obtaining information output during the progress of the game from the user terminal 800.
  • UI objects are, but are not limited to, icons, buttons, lists, menu screens, and the like.
  • the animation generation unit 8104 generates an animation showing the motion of various objects based on the control mode of various objects. For example, the animation generation unit 8104 may generate an animation or the like that expresses how the character moves as if it were there, moves the mouth, or changes the facial expression.
  • the display control unit 8102 outputs a game screen reflecting the processing result executed by each of the above-mentioned elements to the display unit 8702 of the touch screen 870.
  • the display control unit 8102 may display the game screen including the animation generated by the animation generation unit 8104 on the display unit 8702. Further, the display control unit 8102 may superimpose and draw the above-mentioned UI object controlled by the UI control unit 8103 on the game screen.
  • the virtual space control unit 8106 performs various controls for sharing the virtual space 11 with the character control device 110B.
  • FIG. 18 is a diagram showing a detailed configuration of the virtual space control unit 8106. As shown in FIG. 18, the virtual space control unit 8106 includes a virtual object generation module 8121, a virtual camera control module 8122, an avatar object control module 8124, and a virtual object control module 8127.
  • the virtual object generation module 8121 is configured in the same manner as the virtual object generation module 1421 described with reference to FIG. 14.
  • the virtual object generated in the user terminal 800 synchronizes with the virtual object generated in the character control device 110B.
  • whether or not to synchronize each virtual object may be determined for each virtual object.
  • some virtual objects may not be generated by the user terminal 800.
  • some virtual objects may be generated in a different display form on the user terminal 800.
  • the display form may be, for example, color, transparency, or the like, but is not limited thereto.
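A minimal sketch of such per-object sharing settings is shown below. The policy fields are illustrative assumptions, since the source states only that synchronization and display form (for example, color or transparency) may be decided for each virtual object.

```python
from dataclasses import dataclass

# Hypothetical per-object policy for sharing the virtual space with the
# user terminal 800; all field names are assumptions.

@dataclass
class VirtualObjectPolicy:
    synchronize: bool = True            # mirror this object's behavior on the terminal?
    generate_on_terminal: bool = True   # some objects are never generated there
    color_override: str | None = None   # display-form change, e.g. "#808080"
    alpha: float = 1.0                  # transparency on the terminal side

def terminal_view(objects: dict[str, VirtualObjectPolicy]) -> list[str]:
    """Return the ids of objects the user terminal 800 should generate."""
    return [oid for oid, policy in objects.items() if policy.generate_on_terminal]
```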
  • the virtual camera control module 8122 is configured in the same manner as the virtual camera control module 1422 described with reference to FIG. 14.
  • the virtual cameras 14A, 14C, 14D arranged in the virtual spaces 11A, 11C, 11D realized in the user terminals 800A, 800C, 800D differ from the virtual camera 14B arranged in the virtual space 11B realized in the character control device 110B. That is, the visual field image of the virtual space 11 shared between the character control device 110B and the user terminal 800 may differ between the character control device 110B and the user terminal 800.
  • the positions and directions of the virtual cameras 14A, 14C, 14D may be operable or inoperable by the users 8A, 8C, 8D. If the virtual cameras 14A, 14C, and 14D are inoperable, the positions and directions of the virtual cameras 14A, 14C, and 14D may be the same. In this case, the plurality of users 8 view a common field-of-view image through each user terminal 800. That is, in this case, the plurality of users 8 can view the field image (live image) taken by the common virtual camera. In this embodiment, an example in which the positions and directions of the virtual cameras 14A, 14C, 14D cannot be operated by the users 8A, 8C, 8D and a plurality of users 8 view a common field image will be mainly described.
  • the virtual camera 14 arranged in the first virtual space 11_1 in the character control device 110B will be referred to as a virtual camera 14_B1
  • the virtual camera 14 arranged in the second virtual space 11_2 will be referred to as a virtual camera 14_B2.
  • the virtual camera 14 arranged in the first virtual space 11_1 in the user terminals 800A, 800C, 800D is described as a virtual camera 14_A1
  • the virtual camera 14 arranged in the second virtual space 11_2 is described as a virtual camera 14_A2.
  • the field image based on each of the virtual cameras 14_B1, 14_B2, 14_A1 and 14_A2 is described as the field image 1400_B1, 1400_B2, 1400_A1 and 1400_A2.
  • the avatar object control module 8124 is configured in the same manner as the avatar object control module 1424 described with reference to FIG. 14, except that it operates according to the operation instruction data instead of operating according to the movement of the HMD 120 or the movement of the controller 300.
  • the operation instruction data is generated in response to the movement of the HMD 120 or the movement of the controller 300 in the character control device 110B, and is distributed to the user terminal 800.
  • the virtual object control module 8127 is configured in the same manner as the virtual object control module 1427 described with reference to FIG. 14, except that it operates according to the operation instruction data.
  • the operation instruction data is generated in the character control device 110B so as to represent the behavior of the virtual object in the character control device 110B, and is distributed to the user terminal 800.
  • the virtual space control unit 8106 analyzes (renders) the operation instruction data and operates the avatar object 6B (first character, second character) or the virtual object based on the analysis result.
  • the virtual space control unit 8106, triggered by receiving via the communication IF 33 the operation instruction data supplied by the character control device 110B, operates the avatar object 6B or the virtual object based on that data. This makes it possible to show the user 8 a character and a virtual object that operate in real time.
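The receive-and-apply behavior described here might look like the following sketch, in which arrival of the data is itself the trigger. The JSON payload and the `apply` methods are hypothetical; the source does not define the wire format.

```python
import json

def on_instruction_received(raw: bytes, virtual_space) -> None:
    """Apply operation instruction data the moment it arrives; receipt is the trigger."""
    data = json.loads(raw)                    # analyze ("render") the received data
    target = data.get("target", "avatar")
    if target == "avatar":
        virtual_space.avatar_6b.apply(data["motion"])        # avatar object 6B
    else:
        virtual_space.objects[target].apply(data["motion"])  # some virtual object
```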
  • the virtual space control unit 8106 arranges, in the first virtual space 11_1 shared between the plurality of user terminals 800 including the user terminal 800 and the character control device 110B, the avatar object 6B, the virtual camera 14_A1, and a virtual object representing a plurality of choices.
  • the virtual space control unit 8106 arranges the avatar object 6B and the virtual camera 14_A2 in the second virtual space 11_2 shared between the character control device 110B and those user terminals 800, among the plurality of user terminals 800, that satisfy a predetermined condition.
  • as the second character arranged in the second virtual space 11_2 in the present invention, an example will be described in which the same avatar object 6B as the first character arranged in the first virtual space is applied.
  • the user terminal 800 shares the second virtual space 11_2 when the predetermined condition is satisfied in the user terminal 800. The details of the predetermined conditions will be described later.
  • the virtual space control unit 8106 displays the field of view image 1400_A1 (first field of view image) according to the setting of the virtual camera 14_A1 arranged in the first virtual space 11_1 on the display unit 8702.
  • the virtual space control unit 8106 displays the field of view image 1400_A2 (second field of view image) according to the setting of the virtual camera 14_A2 arranged in the second virtual space 11_2 on the display unit 8702.
  • the second visual field image is displayed on the user terminal 800 when a predetermined condition is satisfied on the user terminal 800.
  • the first visual field image of the first virtual space 11_1 can be viewed by the users 8 of the plurality of user terminals 800, whereas the second visual field image of the second virtual space 11_2 can be viewed only by the users 8 of the user terminals 800 that satisfy the predetermined condition.
  • viewing the first visual field image of the first virtual space 11_1 is also described as participating in the first virtual space 11_1. Similarly, viewing the second visual field image of the second virtual space 11_2 is also described as participating in the second virtual space 11_2.
  • the virtual space control unit 8106 causes the avatar object 6B to perform an operation of selecting one of the plurality of options based on the operation of the user 5B of the character control device 110B.
  • the game progress unit 8105 advances the game according to either or both of the input operation of the user 8 input via the operation reception unit 8101 and the input operation by the user 5B of the character control device 110B.
  • the game progress unit 8105 causes one or more characters to appear and operates the characters while the game is in progress.
  • the game progress unit 8105 may operate the characters according to the game program 831, according to the input operation of the user 8, or according to the operation instruction data generated in response to the operation of the user 5B of the character control device 110B.
  • the game progress unit 8105 advances the game in the first state based on the first operation performed by the avatar object 6B in the first virtual space 11_1 according to the operation of the user 5B of the character control device 110B. Details of the first operation and the first state will be described later.
  • the game progress unit 8105 transitions the progress of the game from the first state to the second state based on a predetermined operation performed by the avatar object 6B in the first virtual space 11_1 according to the operation of the user 5B. The details of the second state will be described later.
  • the predetermined operation may be, for example, an operation targeting a predetermined virtual object.
  • the targeting operation may be, for example, pushing, holding, pointing at, turning, or stepping on the object, but is not limited to these.
  • the predetermined virtual object may be, for example, a button-type object that is a target of a pushing operation, a dial-type object that is a target of a turning operation, or the like, but is not limited thereto.
  • the game progress unit 8105 advances the game in the second state based on the second operation performed by the avatar object 6B in the first virtual space 11_1 according to the operation of the user 5B. The details of the second operation will be described later.
  • the first state and the second state may be, for example, states according to the type of the game in progress.
  • the first state and the second state may include the state of the virtual space 11, the state of the display unit 8702, the application state of the game processing to be executed, and the like.
  • for example, the virtual space 11 can be in a state in which virtual objects corresponding to the type of the game are arranged, and the display unit 8702 can be in a state in which UI objects corresponding to the type of the game are displayed. Further, there can be a state in which game processing corresponding to the type of the game is applied in response to the operation of the avatar object 6B or an operation on a UI object.
  • the first state and the second state may include other states related to the progress of the game.
  • the game progressing in the first state is also described as the first mini game
  • the game progressing in the second state is also described as the second mini game.
  • the types of states in which this game can be advanced are not limited to two, and may be three or more.
  • the present game may include a mini game in addition to the first mini game and the second mini game.
  • the second mini-game corresponds to an example of the first play part in the present invention.
  • the first mini-game corresponds to an example of the second play part in the present invention.
  • the first condition that can be satisfied by the play of the user 8 is set.
  • the first condition corresponds to an example of the second problem in the present invention.
  • a second condition that can be satisfied by the play of the user 8 is set.
  • the second condition corresponds to an example of the first problem in the present invention.
  • the first operation and the second operation may be, for example, the operation of the avatar object 6B according to the type of the game in progress.
  • for example, when the first mini-game is a rock-paper-scissors game, the first action may be an action in which the avatar object 6B selects a virtual object corresponding to rock, scissors, or paper.
  • when the second mini-game is a bingo game, the second operation may be an operation in which the avatar object 6B selects a virtual object corresponding to a bingo ball.
  • in the first state, there may be a plurality of types of the first operation according to the type of the game.
  • for example, the operation of the avatar object 6B selecting a virtual object and the operation of the avatar object 6B pressing a virtual button for determining victory or defeat may each be regarded as a first operation.
  • likewise, in the second state, there may be a plurality of types of the second operation according to the type of the game.
  • any of the first operations and any of the second operations may be the same operation or may be different operations.
  • for example, in the first state there may be a first operation of pressing a virtual button for determining the victory or defeat of the game, and in the second state there may be a second operation of pressing a virtual button for determining the victory or defeat of the game.
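The transition between the first state and the second state, driven by the avatar object 6B's predetermined operation (such as pressing the progress button object), could be sketched as a small state machine. All names below are illustrative assumptions.

```python
class LiveGamePart:
    """First state = first mini-game; second state = second mini-game."""

    def __init__(self) -> None:
        self.state = "first"

    def on_avatar_action(self, action: str, target: str | None = None) -> None:
        # The predetermined operation targeting the progress button object
        # transitions the progress from the first state to the second state.
        if self.state == "first" and action == "press" and target == "progress_button":
            self.state = "second"
        elif self.state == "first":
            self.handle_first_operation(action)   # e.g. selecting rock/scissors/paper
        else:
            self.handle_second_operation(action)  # e.g. selecting a bingo ball

    def handle_first_operation(self, action: str) -> None: ...
    def handle_second_operation(self, action: str) -> None: ...
```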
  • the game progress unit 8105 executes a game process according to the result of selection by the selected action according to the action of the avatar object 6B selecting any of a plurality of options.
  • in the first virtual space 11_1, virtual objects representing the plurality of options are arranged.
  • when the above-mentioned predetermined condition is not satisfied, the game progress unit 8105 displays, on the display unit 8702 together with the first visual field image, the information input by the user 8 of the user terminal 800 and the information input by the users 8 of the other user terminals 800 that do not satisfy the predetermined condition.
  • the part in which the information input by each user 8 of the user terminals 800 that do not satisfy the predetermined condition is displayed is hereinafter also referred to as a chat part.
  • the progress information generation unit 8107 generates progress information indicating the progress of the game being executed by the game progress unit 8105, and sends it to the server 600 or the character control device 110B in a timely manner.
  • the progress information may include, for example, information that specifies the currently displayed game screen, or may include a progress log indicating the progress of the game in chronological order by characters, symbols, and the like.
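A minimal sketch of progress information carrying both a current-screen identifier and a chronological progress log, as described above, might look like this; the names are hypothetical.

```python
import time

class ProgressInfoGenerator:
    """Hypothetical counterpart of the progress information generation unit 8107."""

    def __init__(self) -> None:
        self.log: list[tuple[float, str]] = []

    def record(self, event: str) -> None:
        self.log.append((time.time(), event))     # chronological progress log

    def snapshot(self, current_screen_id: str) -> dict:
        # Sent in a timely manner to the server 600 or character control device 110B.
        return {
            "screen": current_screen_id,           # identifies the displayed screen
            "log": [{"t": t, "event": e} for t, e in self.log],
        }
```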
  • the progress information generation unit 8107 may be omitted.
  • the control unit 2100 comprehensively controls the character control device 110B by executing the game program 231 stored in the storage unit 2200. For example, the control unit 2100 generates operation instruction data according to the operations of the game program 231 and the user 5B, and supplies the operation instruction data to the user terminal 800. The control unit 2100 may further execute the game program 231 as needed. Further, the control unit 2100 communicates with the server 600 and the user terminal 800 running the game to send and receive information.
  • the control unit 2100 functions as an operation reception unit 2101, a display control unit 2102, a UI control unit 2103, an animation generation unit 2104, a game progress unit 2105, a virtual space control unit 2106, and a reaction processing unit 2107 according to the description of the game program 231.
  • the control unit 2100 can also function as other functional blocks (not shown) in order to control the characters appearing in the game according to the nature of the game executed in the game system 1500.
  • the operation reception unit 2101 detects and accepts the input operation of the user 5B.
  • input operations are input via the input / output IF 34 from various input devices such as the HMD 120, the motion sensor 420, an input unit formed integrally with the display unit when the display 430 is a touch screen, a mouse (not shown), and a keyboard (not shown).
  • the operation reception unit 2101 determines what kind of input operation has been performed from the action exerted by the user 5B, and outputs the result to each element of the control unit 2100.
  • the details of the function of the operation reception unit 2101 are almost the same as those of the operation reception unit 8101 in the user terminal 800.
  • the UI control unit 2103 controls the UI object to be displayed on the HMD 120 or the display 430.
  • the animation generation unit 2104 generates an animation showing the motion of various objects based on the control mode of various objects.
  • the animation generation unit 2104 may generate an animation or the like that reproduces the game screen actually displayed on the user terminal 800 that is the communication partner.
  • the display control unit 2102 outputs an image reflecting the processing result executed by each of the above-mentioned elements to the HMD 120 or the display 430.
  • the game progress unit 2105 progresses the game based on the avatar object 6B corresponding to the input operation by the user 5B and the operation of the user 8 of the user terminal 800. For example, the game progress unit 2105 performs a predetermined game process when the avatar object 6B performs a predetermined operation. Further, for example, the game progress unit 2105 receives information representing the operation of the user 8 in the user terminal 800, and performs the game processing based on the operation of the user 8.
  • the game progress unit 2105 grasps the progress of the game on the user terminal 800 based on the progress information indicating the progress of the game received from the user terminal 800. Then, the game progress unit 2105 presents the progress of the user terminal 800 to the user 5B by simulating the behavior of the user terminal 800 on the character control device 110B.
  • the game progress unit 2105 may display a reproduction of the game screen displayed on the user terminal 800 on the display unit 352 of its own device. Further, the game progress unit 2105 may display the progress of the game on the display unit 352 as the above-mentioned progress log on the user terminal 800.
  • the game progress unit 2105 grasps the progress of the game on the user terminal 800 based on the progress information. Then, the game progress unit 2105 may reproduce, completely or in a simplified form, the game screen currently displayed on the user terminal 800 on the display unit 352 of its own device based on the game program 831. Alternatively, the game progress unit 2105 may grasp the progress of the game at the present time, predict the game progress after the present time based on the game program 231, and output the prediction result to the display unit 352.
  • the virtual space control unit 2106 includes the modules 1421 to 1427 of the control module 510 described with reference to FIG. 14. These details have already been described.
  • the virtual space control unit 2106 performs various controls for sharing the virtual space 11 with the user terminal 800.
  • FIG. 19 is a diagram showing a detailed configuration of the virtual space control unit 2106.
  • as shown in FIG. 19, the virtual space control unit 2106 includes the modules 1421 to 1427 described with reference to FIG. 14.
  • the virtual space control unit 2106 operates the avatar object 6B and controls the behavior of the virtual objects in the character control device 110B by the operation of these modules 1421 to 1427. Further, based on the operations of the modules 1421 to 1427, the virtual space control unit 2106 generates operation instruction data for operating the avatar object 6B on the user terminal 800 and operation instruction data for controlling the behavior of each virtual object. These operation instruction data are supplied to the user terminal 800 via the server 600.
  • the virtual space control unit 2106 generates operation instruction data instructing that the controlled character speak, based on voice data input by the user 5B (a voice actor or the like) via the microphone 170.
  • the operation instruction data generated in this way includes at least the above-mentioned voice data.
  • likewise, the virtual space control unit 2106 generates operation instruction data instructing that the controlled character move, based on motion capture data obtained by capturing the movement of the user 5B (a model or the like).
  • the motion instruction data generated in this way includes at least the above-mentioned motion capture data.
  • the virtual space control unit 2106 generates operation instruction data instructing that the controlled character operate based on the history of the input operations input by the user 5B via an input mechanism such as the controller 300 or an operation unit such as a touch screen, a mouse, or a keyboard, that is, based on the operation history data.
  • the operation instruction data generated in this way includes at least the above-mentioned operation history data.
  • the operation history data is, for example, information in which operation logs, each indicating which button of the controller 300 the user 5B pressed at which timing while which screen was displayed on the HMD 120, are organized in chronological order.
  • the virtual space control unit 2106 identifies a command instructing an operation of the character associated with an input operation input by the user 5B via the above-mentioned input mechanism or operation unit. Then, the virtual space control unit 2106 may arrange the commands in the order in which they are input to generate a motion command group indicating a series of motions of the character, and may generate operation instruction data instructing the character to operate according to the motion command group.
  • the motion instruction data generated in this way includes at least the above-mentioned motion command group.
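The conversion from an operation history to a time-ordered motion command group could be sketched as follows. The command table and log record format are assumptions for illustration.

```python
# Hypothetical mapping from (screen, button) input operations to character
# commands, and conversion of an operation log into a chronological
# motion command group.

COMMAND_TABLE = {
    ("game_screen", "button_a"): "wave_hand",
    ("game_screen", "button_b"): "jump",
}

def to_motion_commands(operation_log: list[dict]) -> list[dict]:
    """operation_log entries: {"t": ms, "screen": str, "button": str}."""
    ordered = sorted(operation_log, key=lambda e: e["t"])   # chronological order
    return [
        {"t": e["t"], "command": COMMAND_TABLE[(e["screen"], e["button"])]}
        for e in ordered
        if (e["screen"], e["button"]) in COMMAND_TABLE      # skip unmapped inputs
    ]
```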
  • the reaction processing unit 2107 receives feedback from the user terminal 800 regarding the reaction of the user 8 and outputs this to the user 5B of the character control device 110B.
  • the user terminal 800 allows the user 8 to create a comment addressed to the character while the character is operated according to the above-mentioned operation instruction data.
  • the reaction processing unit 2107 receives the comment data of the comment and outputs it.
  • the reaction processing unit 2107 may display the text data corresponding to the comment of the user 8 on the display unit 352, or may output the voice data corresponding to the comment of the user 8 from a speaker (not shown).
  • the functions of the user terminal 800, the server 600, and the character control device 110B shown in FIG. 17 are merely examples. Each of the user terminal 800, the server 600, and the character control device 110B may have at least a part of the functions of the other devices. Further, a device other than the user terminal 800, the server 600, and the character control device 110B may be used as a component of the game system 1500, and that other device may be made to execute a part of the processing in the game system 1500. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 800, the server 600, the character control device 110B, and another device, or the game may be realized by a combination of a plurality of these devices.
  • the functions of the server 600 and the user terminal 800 shown in FIG. 17 are only examples.
  • the server 600 may include at least some of the functions of the user terminal 800.
  • the user terminal 800 may have at least a part of the functions included in the server 600.
  • devices other than the user terminal 800 and the server 600 may be used as components of the game system 1500, and the other devices may be made to execute a part of the processing in the game system 1500. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 800, the server 600, and other devices, or may be realized by a combination of a plurality of these devices.
  • FIG. 20 is a flowchart showing an example of the basic game progress of this game.
  • This game includes three parts.
  • the first is a live game progress part that can be participated by a plurality of user terminals 800.
  • the second is a live distribution part that is distributed to the user terminals 800 that satisfy the predetermined conditions among the user terminals 800 that participated in the live game progress part.
  • the third is a chat part provided to the user terminal 800 that does not satisfy the predetermined condition among the user terminals 800 that participated in the live game progress part.
  • the live game progress part includes the above-mentioned first mini-game (game progressing in the first state) and second mini-game (game progressing in the second state).
  • step S1 the game system 1500 advances the live game progress part.
  • the live game progress part progresses in the first virtual space 11_1.
  • step S2 the game system 1500 advances the first mini-game in the live game progress part.
  • the first mini-game proceeds based on the operation of the avatar object 6B in the first virtual space 11_1 and the operation of the user 8 of the user terminal 800.
  • a game process according to the result of the selection is executed based on the first action in which the avatar object 6B selects any of the plurality of options represented by the virtual objects.
  • step S3 the game system 1500 ends the first mini-game and starts the second mini-game when the avatar object 6B performs a predetermined operation in the first virtual space 11_1. That is, the progress of the game transitions from the first state to the second state.
  • step S4 the game system 1500 advances the second mini-game in the live game progress part.
  • the second mini-game progresses based on the operation of the avatar object 6B in the first virtual space 11_1 and the operation of the user 8 of the user terminal 800.
  • a game process according to the result of the selection is executed based on the second action in which the avatar object 6B selects any of the plurality of options represented by the virtual objects.
  • step S5 the game system 1500 determines whether or not the predetermined condition is satisfied in the live game progress part. Specifically, the game progress unit 8105 may determine whether or not the second condition is satisfied in the second mini-game.
  • step S6 the game system 1500 advances the live distribution part in the second virtual space 11_2.
  • the avatar object 6B (second character) is arranged in the second virtual space 11_2.
  • the avatar object 6B that operates in response to the operation of the user 5B moves from the first virtual space 11_1 to the second virtual space 11_2 and operates.
  • step S7 the game system 1500 advances the chat part on the user terminal 800.
  • the chat part is a part that proceeds without the operation of the user 5B of the character control device 110B. In other words, the chat part progresses independently of the movement of the avatar object 6B.
  • the chat part may proceed in a state where the first visual field image of the first virtual space 11_1 is displayed on the user terminal 800.
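Condensing steps S1 to S7 above, the overall flow of the three parts might be sketched as follows; the function names are placeholders for the processing described in the text.

```python
def play_first_mini_game(terminal): ...               # S2: game in the first state
def play_second_mini_game(terminal): ...              # S4: game in the second state
def predetermined_condition_met(terminal) -> bool:    # S5: e.g. second condition
    return False
def run_live_distribution_part(terminal): ...         # S6: second virtual space 11_2
def run_chat_part(terminal): ...                      # S7: independent of avatar 6B

def run_live_game_part(terminal):                     # S1: live game progress part
    play_first_mini_game(terminal)
    # S3: avatar object 6B performs the predetermined operation -> transition
    play_second_mini_game(terminal)
    if predetermined_condition_met(terminal):
        run_live_distribution_part(terminal)
    else:
        run_chat_part(terminal)
```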
  • the user terminal 800 is configured to execute the following steps based on the game program 831: a step of arranging, in a first virtual space (first virtual space 11_1) shared between a first distribution terminal (character control device 110B) with which a first distributor (user 5B) performs real-time distribution and a plurality of viewing terminals including the viewing terminal (user terminal 800), a first avatar object corresponding to the first distributor and a first virtual camera; a step of displaying, while the real-time distribution is performed, a first field image showing the field-of-view area of the first virtual camera; a step of executing a first game part (live game progress part) that progresses while transitioning between a plurality of states included in the game based on the game program, in which the game in each state (first mini-game, second mini-game) is advanced based on the movement of the body of the first distributor and the operation of the viewer, and a game process corresponding to each state is executed; a step of receiving first motion information indicating the motion of the first avatar object, generated by detecting the movement of the body of the first distributor; a step of updating the first field image, in response to the first motion information being received, so that it includes the first avatar object performing the movement indicated by the first motion information; a step of determining, in response to the first motion information being received, whether or not the movement indicated by the first motion information is a motion for transitioning the state; and a step of transitioning the progress of the first game part from the current state to another state when the movement indicated by the first motion information is a motion for transitioning.
  • according to the above configuration, the viewer can recognize that the state of the first game part is switched by the action of the first avatar object that advances the game. Therefore, the sense of presence is increased and the feeling of immersion in the game is enhanced.
  • further, the user terminal 800 may execute: a step of arranging, in a virtual space shared between the distribution terminal (character control device 110B) with which the distributor (user 5B) performs real-time distribution and a plurality of viewing terminals including the viewing terminal (user terminal 800), an avatar object corresponding to the distributor and a plurality of choice objects, and displaying a UI (user interface) object on the display unit; a step of receiving motion information indicating the motion of the avatar object, generated by detecting the movement of the distributor selecting a part of the plurality of choice objects arranged in the virtual space; a step of updating the field image, in response to the motion information being received, so that it includes the behavior of the avatar object selecting a part of the plurality of choice objects arranged in the virtual space; and a step of executing a game process corresponding to the choice selected by the avatar object in response to the received motion information.
  • the user terminal 800 may execute a step of changing the state of the UI object displayed on the display unit of the viewing terminal according to the operation of the distributor.
  • changing the state of the UI object may mean, for example, switching the UI object between an operable state and an inoperable state, or deleting the displayed UI object from the display unit.
  • a UI object that has not been displayed may be newly displayed on the display unit.
  • FIG. 21 is a flowchart showing the flow of processing in the live game progress part executed by each device constituting the game system 1500.
  • the left figure shows the operation of the character control device 110B
  • the right figure shows the operation of the user terminal 800A
  • the broken line arrow connecting the left and right shows the data flow.
  • the data transmitted / received between the character control device 110B and the user terminal 800A is assumed to be performed via the server 600.
  • in the following, the character control device 110B transmitting information to the user terminal 800A via the server 600 is described simply as the character control device 110B transmitting information to the user terminal 800A. Similarly, the user terminal 800A transmitting information to the character control device 110B via the server 600 is described simply as the user terminal 800A transmitting information to the character control device 110B.
  • the description relating to the user terminals 800C and 800D is obtained from the description relating to the user terminal 800A by replacing the A at the end of each reference code with C or D.
  • step S101 the virtual space control unit 2106 of the character control device 110B defines the first virtual space 11_1B based on the first virtual space data representing the first virtual space 11_1.
  • the first virtual space 11_1B is a virtual space realized by the character control device 110B.
  • the first virtual space data is supplied from, for example, the server 600.
  • Step S101 may be executed based on an operation by the user 5B instructing the start of distribution at or after a predetermined distribution start time.
  • step S201 the virtual space control unit 8106 of the user terminal 800A defines the first virtual space 11_1A based on the first virtual space data representing the first virtual space 11_1.
  • the first virtual space 11_1A is a virtual space realized in the user terminal 800A.
  • the first virtual space data is supplied from, for example, the server 600.
  • Step S201 may be executed based on an operation by the user 8A instructing the start of viewing at the predetermined distribution start time.
  • the predetermined distribution start time may be notified in advance from the server 600 to the character control device 110B and the user terminal 800A. Further, the predetermined distribution start time may be notified from the character control device 110B to the user terminal 800A.
  • step S102 the virtual space control unit 2106 arranges the virtual camera 14_B1 in the first virtual space 11_1B.
  • the field of view image 1400_B1 by the virtual camera 14_B1 is a field of view image of the first virtual space 11_1B seen from the avatar object 6B.
  • the virtual space control unit 8106 arranges the virtual camera 14_A1 in the first virtual space 11_1A.
  • the virtual camera 14_A1 is arranged at a position and in a direction from which the region where the avatar object 6B may exist in the first virtual space 11_1A is visible. For example, the virtual camera 14_A1 may be arranged at the rear of the first virtual space 11_1, facing toward the front.
  • the visual field image 1400_A1 by the virtual camera 14_A1 is a visual field image suitable for viewing the operation of the avatar object 6B in the first virtual space 11_1A, and corresponds to an example of the first visual field image in the present invention.
  • step S103 the virtual space control unit 2106 arranges the avatar object 6B and a predetermined object (progress button object) in the first virtual space 11_1B.
  • the progress button object is an example of the predetermined virtual object described above.
  • step S104 the virtual space control unit 2106 transmits information representing the avatar object 6B and the progress button object to the user terminal 800A.
  • step S203 the virtual space control unit 8106 arranges the avatar object 6B and a predetermined virtual object in the first virtual space 11_1A based on the received information.
  • a progress button object to be pressed is arranged as a predetermined virtual object.
  • the avatar object 6B and the progress button object are sequentially arranged between the user terminal 800A and the character control device 110B.
  • step S105 the virtual space control unit 2106 displays and updates the field image 1400_B1 based on the position and direction of the virtual camera 14_B1 that changes according to the movement of the HMD 120.
  • the update of the visual field image 1400_B1 shall be continuously executed according to the movement of the HMD 120.
  • step S204 the virtual space control unit 8106 displays and updates the field image 1400_A1 based on the position and direction of the virtual camera 14_A1. Even in the subsequent steps, the update of the visual field image 1400_A1 shall be continuously executed according to the operation of the avatar object 6B and the behavior of each virtual object.
  • step S106 the virtual space control unit 2106 controls the operation of the avatar object 6B according to the movement of the user 5B.
  • step S107 the virtual space control unit 2106 transmits the operation instruction data representing the operation of the avatar object 6B to the user terminal 800A.
  • step S205 the virtual space control unit 8106 controls the operation of the avatar object 6B based on the received operation instruction data.
  • step S108 and step S206 the game progress unit 2105 and the game progress unit 8105 advance the first mini-game while transmitting and receiving necessary information to each other.
  • the details of the process flow for advancing the first mini-game will be described later.
  • step S109 the game progress unit 2105 determines whether or not the avatar object 6B has performed a predetermined operation according to the movement of the user 5B.
  • the predetermined operation is an operation targeting a virtual object; here, it is an operation of bringing the operation object into contact with the progress button object, for example, an operation of pressing the progress button object.
  • step S109 is repeated until the determination is Yes. If Yes in step S109, the process of the next step S110 is executed.
  • step S110 the virtual space control unit 2106 transmits operation instruction data representing a predetermined operation to the user terminal 800A.
  • step S207 the virtual space control unit 8106 causes the avatar object 6B to perform a predetermined operation based on the received operation instruction data.
  • the avatar object 6B performs an operation of pressing the progress button object.
  • the progress of the game transitions from the first mini-game to the second mini-game according to the predetermined operation of the avatar object 6B.
  • step S112 and S208 the game progress unit 2105 and the game progress unit 8105 advance the second mini-game while transmitting and receiving necessary information to each other.
  • the details of the process flow for advancing the second mini-game will be described later.
  • FIG. 22 is a diagram schematically showing an example of the first virtual space 11_1B and the visual field image 1400_B1 realized by executing steps S101 to S107 in the character control device 110B.
  • FIG. 22A at least the avatar object 6B, the virtual camera 14_B1, and the progress button object 2241 are arranged in the first virtual space 11_1.
  • User 5B (first user) wears the HMD 120 on his head.
  • User 5B grips the right controller 300R with the right hand (first part), which constitutes the right part of the body of user 5B, and grips the left controller 300L with the left hand (second part), which constitutes the left part of the body of user 5B.
  • the HMD 120 includes a sensor 190 that functions as a motion sensor.
  • the right controller 300RB and the left controller 300LB include a motion sensor 420.
  • User 5B is further equipped with motion sensors 1841 to 1843.
  • the motion sensor 1841 is attached to the waist of the user 5B by a belt 1844.
  • the motion sensor 1842 is attached to the instep of the right foot of the user 5B.
  • the motion sensor 1843 is attached to the instep of the left foot of the user 5B.
  • the motion sensors 1841 to 1843 are connected to the computer 200 by wire or wirelessly.
  • the motion sensor mounted on the user 5B detects the arrival time and angle of a signal (for example, an infrared laser) emitted from a base station (not shown).
  • the processor 210 of the computer 200 (hereinafter, simply the processor 210) detects the position of the motion sensor with respect to the base station based on the detection result of the motion sensor.
  • the processor 210 may further standardize the position of the motion sensor with respect to the base station with respect to a predetermined point (eg, the position of the sensor 190 mounted on the head).
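The standardization described here, re-expressing each sensor position relative to a reference point such as the head-mounted sensor 190, amounts to a simple vector subtraction, as in this sketch:

```python
def relative_to_head(sensor_pos, head_pos):
    """Re-express a base-station-relative position relative to the head sensor."""
    return tuple(s - h for s, h in zip(sensor_pos, head_pos))

# Example: waist sensor 1841 at (0.1, 1.0, 0.2), head sensor 190 at (0.0, 1.7, 0.2)
waist_rel = relative_to_head((0.1, 1.0, 0.2), (0.0, 1.7, 0.2))
# -> (0.1, -0.7, 0.0): the waist is 0.7 m below the head
```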
  • the avatar object 6B includes a virtual right hand 1831RB and a virtual left hand 1831LB.
  • the virtual right hand 1831RB is a kind of operation object, and can move in the first virtual space 11_1B according to the movement of the right hand of the user 5B.
  • the virtual left hand 1831LB is a kind of operation object, and can move in the first virtual space 11_1B according to the movement of the left hand of the user 5B.
  • the first virtual space 11_1 shown in FIG. 22 (A) is constructed by playing back the program content on the computer 200.
  • User 5B moves his body to cause the avatar object 6B to perform a performance.
  • the computer 200 detects the movement of the user 5B based on the outputs of various motion sensors mounted on the user 5B.
  • according to the detected movement of the user 5B, the avatar object 6B performs a performance that reflects the movement of the user 5B in the real space.
  • the avatar object 6B synchronously executes the same performance in the first virtual space 11_1A realized in the user terminal 800A.
  • the user 5B has a role as a distributor for distributing the live game progress part to the user 8A of the plurality of user terminals 800A as a program by the avatar object 6B.
  • the virtual camera 14_B1 is placed on the head of the avatar object 6B.
  • the virtual camera 14_B1 defines a field of view region 15B according to the position and orientation of the virtual camera 14_B1. The virtual camera 14_B1 generates a field of view image 1400_B1 corresponding to the field of view region 15B and displays it on the HMD 120 as shown in FIG. 22 (B).
  • the user 5B visually recognizes a part of the first virtual space 11_1B from the viewpoint of the avatar object 6B.
  • the view image 1400_B1 includes a progress button object 2241. Therefore, the user 5B can visually recognize the progress button object necessary for advancing the progress in the live game progress part.
  • FIG. 23 is a diagram schematically showing an example of the first virtual space 11_1A and the visual field image 1400_A1 realized by executing steps S201 to S205 in the user terminal 800.
  • the avatar object 6B, the virtual camera 14_A1, and the progress button object 2241 are arranged in the first virtual space 11_1A.
  • the first virtual space 11_1A is synchronized with the first virtual space 11_1B shown in FIG. 22.
  • the user 8A holds the user terminal 800A in his left hand.
  • the user 8 watches the live game progress part, which is a program demonstrated by the avatar object 6B, while visually observing the screen of the user terminal 800A.
  • the first virtual space 11_1A shown in FIG. 23A is constructed by playing back the program content on the user terminal 800A.
  • the avatar object 6B performs as a distributor of the live game progress part based on the movement of the user 5B.
  • the user 8A watches the performance of the avatar object 6B as a participant of the live game progress part through the screen of the user terminal 800A.
  • the users 8C and 8D watch the performance of the avatar object 6B as a participant of the live game progress part through the screens of the user terminals 800C and 800D. In this way, the live game progress part progressed by the avatar object 6B is simultaneously streamed to a plurality of users 8.
  • the virtual camera 14_A1 is arranged behind the first virtual space 11_1A.
  • the virtual camera 14_A1 defines a field of view region 15A according to the position and orientation of the virtual camera 14_A1.
  • the processor 810 generates a field of view image 1400_A1 corresponding to the field of view region 15A and displays it on the display unit 8702 of the touch screen 870 as shown in FIG. 23 (B).
  • the view image 1400_A1 includes at least the avatar object 6B that performs the performance.
  • the user 8A visually recognizes the avatar object 6B and a part of the first virtual space 11_1A in which the avatar object 6B appears. As a result, the user 8A can obtain a virtual experience as if the avatar object 6B is an actual distributor.
  • FIG. 24 is a flowchart showing the flow of processing in the first mini-game executed by each device constituting the game system 1500.
  • the flow of processing shown in FIG. 24 corresponds to the details of the processing in steps S108 and S206 shown in FIG.
  • the rock-paper-scissors game is applied as the first mini-game. Further, in the first mini-game, the first condition is set.
  • the first condition is that the user 8A wins the avatar object 6B with rock-paper-scissors.
  • the first mini-game (second play part), in which the first condition (second task) is set, is performed before the second mini-game (first play part) described later starts, and progresses based on the operations performed by the avatar object 6B in response to the operations of the user 5B of the character control device 110B.
  • step S301 the virtual space control unit 2106 of the character control device 110B arranges, in the first virtual space 11_1B, the third object in the present invention and virtual objects representing a plurality of options.
  • the third object is a virtual object that represents information about the first state.
  • the third object may be a virtual panel, a virtual display, a virtual blackboard, a virtual electric bulletin board, or the like capable of displaying the information, but is not limited thereto.
  • a panel object representing a virtual panel is arranged as the third object.
  • the panel object displays information about the first mini-game as information about the first state.
  • the information regarding the first mini-game may include, for example, information regarding a rule explanation of the first mini-game, a first condition set in the first mini-game, a reward obtained by satisfying the first condition, and the like. However, it is not limited to these.
  • as the virtual objects representing the plurality of options, a virtual object representing each of the options is arranged.
  • here, choice objects corresponding to each of rock, scissors, and paper in the rock-paper-scissors game are arranged.
  • one choice object corresponds to one choice, and the number of choices (here, three) choice objects are arranged.
  • step S302 the virtual space control unit 2106 transmits information on the panel object and the three option objects to the user terminal 800A.
  • step S401 the virtual space control unit 8106 of the user terminal 800A arranges the first object corresponding to the first state (here, the panel object and the three choice objects) in the first virtual space 11_1A based on the received information. Further, the virtual space control unit 8106 treats an operation of the avatar object 6B targeting the first object as the first operation.
  • step S402 the game progress unit 8105 displays the first UI object on the display unit 8702.
  • the first UI object is a UI object according to the first state.
  • the first UI object is a user interface object that accepts an operation from the user 8A in the first mini game.
  • here, three UI buttons corresponding to the rock, scissors, and paper options of the rock-paper-scissors game are displayed.
  • the three UI buttons have an operable state and an inoperable state, and may be in an inoperable state in the initial state.
  • the first UI object is not limited to the UI button, and may be another UI object.
  • step S303 the virtual space control unit 2106 causes the avatar object 6B to perform, in response to the movement of the user 5B, an operation prompting the user 8A to operate the first UI object.
  • step S304 the virtual space control unit 2106 transmits the operation instruction data representing the operation of the avatar object 6B to the user terminal 800A.
  • step S403 the virtual space control unit 8106 causes the avatar object 6B to perform the prompting operation based on the received operation instruction data.
  • step S404 the game progress unit 8105 executes the first process for satisfying at least a part of the first condition based on the first operation of the avatar object 6B.
  • here, in response to the prompting operation by the avatar object 6B, the game progress unit 8105 executes the first process of making the first UI object operable.
  • the game progress unit 8105 accepts the user's operation on the first UI object.
  • an operation for any of the three UI buttons is accepted.
  • step S305 the virtual space control unit 2106 causes the avatar object 6B to select one of the three choice objects according to the movement of the user 5B.
  • the selecting action may be, for example, an action of holding a choice object, pointing at it, rotating it, or a combination of various actions including these.
  • step S306 the virtual space control unit 2106 transmits the operation instruction data representing the operation of the avatar object 6B to the user terminal 800A.
  • step S405 the virtual space control unit 8106 causes the avatar object 6B to perform the selected operation based on the received operation instruction data.
  • step S406 the game progress unit 8105 executes a process of making the first UI object inoperable in response to the selecting operation by the avatar object 6B. This step is a process for not accepting a "late" input, that is, an input made after the avatar object 6B has revealed its choice.
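Putting steps S402, S404, and S406 together, the lifecycle of the first UI object could be sketched as follows; the class and method names are hypothetical.

```python
class FirstUIObject:
    """Three rock/scissors/paper buttons: displayed disabled, enabled on the
    avatar's prompt, disabled again once the avatar has selected."""

    def __init__(self) -> None:
        self.buttons = {"rock": False, "scissors": False, "paper": False}

    def set_operable(self, operable: bool) -> None:
        for name in self.buttons:
            self.buttons[name] = operable

ui = FirstUIObject()      # S402: displayed, initially inoperable
ui.set_operable(True)     # S404: first process, after the prompting operation
# ... user 8A taps one of the three buttons here ...
ui.set_operable(False)    # S406: locked to reject "late" inputs
```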
  • step S407 the game progress unit 8105 determines the result of the first mini-game.
  • the game progress unit 8105 determines the victory or defeat of rock-paper-scissors between the user 5B and the user 8.
  • here, the option indicated by the UI button operated by the user 8 and the option indicated by the choice object selected by the avatar object 6B are compared, and the win or loss is determined.
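The comparison in step S407 is an ordinary rock-paper-scissors judgment; a minimal sketch, assuming string-valued choices:

```python
BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def judge(user_choice: str, avatar_choice: str) -> str:
    """Compare the user 8's UI-button choice with avatar object 6B's choice."""
    if user_choice == avatar_choice:
        return "draw"                  # "aiko" in rock-paper-scissors
    return "win" if BEATS[user_choice] == avatar_choice else "lose"

# e.g. judge("rock", "scissors") -> "win": the first condition is satisfied
```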
  • if it is determined in step S407 that the user has lost, the process of the next step S408 is omitted, and the process of step S409 described later is executed. If it is determined in step S407 that the user has won, the process of the next step S408 is executed.
  • step S408 as a game process, the game progress unit 8105 executes a process that makes the second mini-game (first play part) more advantageous the greater the degree of achievement of the first condition (second task) in the first mini-game (second play part).
  • the game progress unit 8105 stores in the memory 820 information indicating the right to advance the second mini-game in an advantageous manner in association with the user.
  • step S409 the game progress unit 8105 transmits the result of the first mini-game to the character control device 110B.
  • the result of rock-paper-scissors is sent.
  • step S307 the game progress unit 2105 of the character control device 110B displays information representing the result of the first mini-game of each user 8A on the HMD 120.
  • the information representing the result may be superimposed and displayed on the visual field image 1400_B1, for example.
  • information representing the result of rock-paper-scissors for each user is displayed.
  • the information representing the results of rock-paper-scissors of the users may be aggregated information representing the number of winners, the number of losers, the number of draws (aiko), and the like, or may be list information of the wins and losses of the individual users.
  • the information representing the result of rock-paper-scissors may include a user name.
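The aggregation mentioned above, counting winners, losers, and draws while optionally keeping a per-user list with user names, could be sketched as follows; the record format is an assumption.

```python
from collections import Counter

# Hypothetical per-user result records, one per user terminal 800.
results = [
    {"user": "8A", "outcome": "win"},
    {"user": "8C", "outcome": "lose"},
    {"user": "8D", "outcome": "draw"},
]

totals = Counter(r["outcome"] for r in results)
# -> Counter({'win': 1, 'lose': 1, 'draw': 1}): winners, losers, draws

per_user = [f'{r["user"]}: {r["outcome"]}' for r in results]  # list with user names
```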
  • FIG. 25 is a diagram schematically showing the first virtual space 11_1 in which the first mini game is in progress.
  • FIG. 25A is a perspective view of the first virtual space 11_1B realized in the character control device 110B.
  • FIG. 25B is a top view of the first virtual space 11_1A realized in the user terminal 800A.
  • the avatar object 6B, the progress button object 2241, the panel object 2551, and the option objects 2552_1 to 2552_3 are arranged in the first virtual spaces 11_1A and 11_1B. Further, a virtual camera 14_B1 is arranged in the first virtual space 11_1B. Further, a virtual camera 14_A1 is arranged in the first virtual space 11_1A.
  • the virtual camera 14_B1 is arranged on the head of the avatar object 6B in the direction in which the avatar object 6B is facing. Since the virtual camera 14_B1 is not arranged in the first virtual space 11_1A, it is not included in the visual field image 1400_A1 displayed on the user terminal 800A.
  • the virtual camera 14_A1 is arranged behind the first virtual space 11_1A toward the front area.
  • the front is assumed to be the left direction of the paper. Since the virtual camera 14_A1 is not arranged in the first virtual space 11_1B, it is not included in the visual field image 1400_B1 displayed by the character control device 110B.
  • the avatar object 6B is arranged so as to face the same direction as the virtual camera 14_B1.
  • the direction of the virtual camera 14_B1 is the direction in which the user 8A virtually exists.
  • the direction of the virtual camera 14_B1 is also simply referred to as the direction of the user 8A.
  • The three option objects 2552_1 to 2552_3 each have an illustration representing rock, scissors, or paper (guu, choki, paa) drawn on one side of the disk portion at the tip. No illustration is drawn on the other side of these disks.
  • The three option objects 2552_1 to 2552_3 are arranged so that the surface on which no illustration is drawn faces the user 8A.
  • The panel object 2551 is arranged in the area opposite to the direction in which the avatar object 6B faces; in other words, behind the avatar object 6B.
  • The panel object 2551 has a plate shape, and information about the first mini-game is displayed on its surface facing the direction of the user 8A.
  • The progress button object 2241 is arranged in the area on the left side with respect to the direction the avatar object 6B is facing (the area toward the top of the page in the first virtual space 11_1A shown in FIG. 25(B)).
  • FIGS. 26 to 29 are diagrams showing transition examples of the visual field image and the game screen while the first mini-game is in progress.
  • In each figure, (A) represents the field image 1400_B1 displayed on the HMD 120 of the character control device 110B by executing steps S301 to S307, and (B) represents the game screen displayed on the display unit 8702 of the user terminal 800A by executing steps S401 to S410.
  • FIG. 26 is a diagram showing a field image and a game screen at the start of a rock-paper-scissors game.
  • The field image 1400_B1 in the character control device 110B includes the progress button object 2241, the option objects 2552_1 to 2552_3, the virtual right hand 1831RB and the virtual left hand 1831LB, which are the hand parts of the avatar object 6B, and the information 2651. Since the panel object 2551 is behind the avatar object 6B, it is not included in the field image 1400_B1.
  • the progress button object 2241 is arranged on the left side when viewed from the avatar object 6B.
  • The option objects 2552_1 to 2552_3 are arranged so that the surfaces on which the illustrations of rock, scissors, and paper are drawn are visible from the avatar object 6B.
  • the virtual right hand 1831RB and the virtual left hand 1831LB of the avatar object 6B itself are included in the field image 1400_B1.
  • the user 5B can easily move the virtual right hand 1831RB and the virtual left hand 1831LB to select the option objects 2552_1 to 2552_3 or the progress button object 2241.
  • Information 2651 is information displayed superimposed on the visual field image 1400_B1.
  • Information 2651 represents the number of users 8 who are currently watching the live game progress part including the first mini-game.
  • Here, the virtual space 11_1 is shared by 132 user terminals 800 including the user terminals 800A, 800C, and 800D, so the number of viewing users 8 (the number of viewers) is 132.
  • the information 2651 may be superimposed and displayed, for example, by being transmitted from the server 600.
  • When the user 5B performs an action of notifying the start of the first mini-game by speaking, the avatar object 6B is controlled to utter the voice 2671.
  • the action of notifying the start of the first mini-game is not limited to vocalization, but may be a hand gesture or the like, or may be a combination of various actions.
  • When the user 5B performs an action prompting operation of the first UI object by speaking, the avatar object 6B is controlled to utter the voice 2672.
  • the operation for prompting the operation of the first UI object is not limited to vocalization, but may be a hand gesture or the like, or may be a combination of various operations.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the visual field image 1400_A1 and the UI area 2661.
  • the field image 1400_A1 includes an avatar object 6B, a progress button object 2241, choice objects 2552_1 to 2552_3, and a panel object 2551.
  • the avatar object 6B is facing the user 8A, so it is facing the front. Further, the virtual right hand 1831RB and the virtual left hand 1831LB of the avatar object 6B are included in the field image 1400_A1. Thereby, the user 8A can pay attention to which option object the virtual right hand 1831RB and the virtual left hand 1831LB perform the operation of selecting.
  • the progress button object 2241 is arranged on the left side of the avatar object 6B, that is, on the right side of the field image 1400_A1.
  • The option objects 2552_1 to 2552_3 are arranged so that the surfaces on which the illustrations of rock, scissors, and paper are drawn cannot be seen from the user 8A.
  • the panel object 2551 is arranged behind the avatar object 6B with the surface on which the information is displayed facing the user 8A.
  • the panel object 2551 displays information representing a description of the rock-paper-scissors game as information regarding the first mini-game. Further, the information may include information representing a reward for winning the rock-paper-scissors game (here, the right to be advantageous in the second mini-game is granted).
  • the UI area 2661 includes UI buttons 2661_1 to 2661_3.
  • Each of the UI buttons 2661_1 to 2661_3 contains an illustration of rock, scissors, or paper.
  • When an operation on the UI button 2661_1 is accepted, a process of storing the selection of rock in the memory 820 is executed.
  • When an operation on the UI button 2661_2 is accepted, a process of storing the selection of paper in the memory 820 is executed.
  • When an operation on the UI button 2661_3 is accepted, a process of storing the selection of scissors in the memory 820 is executed.
  • the UI buttons 2661_1 to 2661_3 are inoperable in the initial state.
  • The avatar object 6B included in the field image 1400_A1 is controlled to utter the voices 2671 and 2672.
  • the user 8A recognizes that the rock-paper-scissors game starts in response to the action of the avatar object 6B uttering the voice 2671. Further, the user 8A can recognize the rules, rewards, etc. of the rock-paper-scissors game by visually recognizing the panel object 2551.
  • the UI buttons 2661_1 to 2661_3 change from the inoperable state to the operable state according to the operation of the avatar object 6B uttering the voice 2672.
  • the UI buttons 2661_1 to 2661_3 may be displayed in a manner indicating that they have changed to an operable state.
  • the user 8A is motivated to press any of the UI buttons 2661_1 to 2661_3 in response to the action of the avatar object 6B uttering the voice 2672.
  • In accordance with the action of the avatar object 6B, a new UI button may be displayed, or a displayed UI button may be hidden, as sketched below.
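  • One possible, purely illustrative realization of this state handling of the first UI object follows: the buttons start inoperable, become operable when the action data corresponding to the voice 2672 arrives, and are made inoperable again when the avatar object 6B selects an option (step S406). The class and event names are assumptions.
```python
# Sketch of the first UI object's state handling; names are illustrative.
class HandButton:
    def __init__(self, hand: str):
        self.hand = hand
        self.operable = False  # inoperable in the initial state

    def press(self, memory: dict, user_id: str) -> bool:
        if not self.operable:
            return False  # early or "late" presses are simply ignored
        memory[user_id] = self.hand  # store the selection (cf. memory 820)
        return True

# Button order follows the mapping 2661_1 = rock, 2661_2 = paper, 2661_3 = scissors.
buttons = [HandButton(h) for h in ("rock", "paper", "scissors")]

def on_motion_instruction(kind: str) -> None:
    if kind == "prompt_input":        # avatar utters the voice 2672
        for b in buttons:
            b.operable = True
    elif kind == "avatar_selected":   # avatar picks an option object (step S406)
        for b in buttons:
            b.operable = False

memory: dict = {}
on_motion_instruction("prompt_input")
print(buttons[0].press(memory, "user_8A"), memory)  # True {'user_8A': 'rock'}
on_motion_instruction("avatar_selected")
print(buttons[2].press(memory, "user_8A"))          # False: late input rejected
```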
  • FIG. 27 is a diagram showing a field image and a game screen after any one of the buttons 2661_1 to 2661_3 is pressed by the user 8A.
  • The field image 1400_B1 in the character control device 110B differs from the field image 1400_B1 shown in FIG. 26(A) in that the information 2751 and the information 2752 are superimposed instead of the information 2651.
  • Information 2751 represents the number of users 8 who are currently watching the live game progress part including the first mini game. Here, the number of viewers has increased by two to 134 since the start of the first mini-game. In this way, the content of the information 2751 is updated in real time based on the information transmitted from the user terminal 800A.
  • Information 2752 represents the number of users 8 who have performed an operation on the first UI object among the users 8 of the plurality of user terminals 800 sharing the first virtual space 11_1. Here, it is shown that 132 people including the user 8A performed the operation. The content of the information 2752 is updated in real time based on the information transmitted from the user terminal 800A.
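  • The real-time updating of the information 2751 and 2752 could be sketched as follows, under the assumption that the server 600 tallies events reported by the user terminals 800; the class LiveStats and its method names are hypothetical.
```python
# Illustrative server-side tally for the information 2751/2752.
class LiveStats:
    def __init__(self):
        self.viewers = set()   # users currently watching (information 2751)
        self.operated = set()  # users who operated the first UI object (2752)

    def on_join(self, user_id: str) -> None:
        self.viewers.add(user_id)

    def on_ui_operation(self, user_id: str) -> None:
        self.operated.add(user_id)

    def snapshot(self) -> dict:
        # payload to superimpose on the field image 1400_B1 in real time
        return {"viewer_count": len(self.viewers),
                "operated_count": len(self.operated)}

stats = LiveStats()
for i in range(134):
    stats.on_join(f"user_{i}")
for i in range(132):
    stats.on_ui_operation(f"user_{i}")
print(stats.snapshot())  # {'viewer_count': 134, 'operated_count': 132}
```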
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the field image 1400_A1 and the UI area 2661, similar to the game screen shown in FIG. 26 (B).
  • the mode of the UI button 2661_1 in the UI area 2661 has changed.
  • the UI button 2661_1 is displayed in a manner representing a state in which an operation for the button is accepted.
  • The mode may be, for example, a mode in which the color of the UI button is changed, but is not limited to this.
  • FIG. 28 is a diagram showing a field image and a game screen after any one of the option objects 2552_1 to 2552_3 is selected by the avatar object 6B.
  • The visual field image 1400_B1 in the character control device 110B differs from the visual field image 1400_B1 shown in FIG. 26(A) in that it includes a state in which the avatar object 6B selects the option object 2552_3, and in that the information 2852 is superimposed instead of the information 2752.
  • Here, the avatar object 6B selects the option object 2552_3 with the virtual right hand 1831RB in accordance with the movement of the user 5B. Specifically, it is assumed that the virtual space control unit 2106 determines that the virtual right hand 1831RB and the option object 2552_3 are in contact with each other, and detects that a predetermined button of the controller 300 is pressed. As a result, the virtual right hand 1831RB is associated with the option object 2552_3, and the option object 2552_3 moves together with the virtual right hand 1831RB.
  • Here, it is assumed that the avatar object 6B grasps and lifts the option object 2552_3, and rotates it so that the surface on which the illustration of "scissors", one of the options, is drawn faces the user 8.
  • the rotation of the option object 2552_3 in the gripped state may be set to be executed, for example, in response to the pressing of another predetermined button of the controller 300. Further, the rotation of the option object 2552_3 in the gripped state may be set to be executed according to a predetermined movement defined for the virtual right hand 1831RB.
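  • The association between the virtual right hand 1831RB and a grasped object, established by the contact determination together with a button press, could be sketched as follows; positions are simplified to 3-tuples and the contact radius is an arbitrary assumption.
```python
import math

# Sketch of grasping an option object: contact plus a controller button
# press associates the object with the hand, which then carries it along.
def in_contact(hand_pos, obj_pos, radius=0.1) -> bool:
    return math.dist(hand_pos, obj_pos) <= radius

class VirtualHand:
    def __init__(self):
        self.pos = (0.0, 0.0, 0.0)
        self.held = None

    def try_grab(self, obj: dict, button_pressed: bool) -> None:
        if button_pressed and in_contact(self.pos, obj["pos"]):
            self.held = obj            # object is associated with the hand

    def move(self, pos) -> None:
        self.pos = pos
        if self.held is not None:
            self.held["pos"] = pos     # the object moves with the hand

hand = VirtualHand()
option = {"name": "2552_3", "pos": (0.05, 0.0, 0.0)}
hand.try_grab(option, button_pressed=True)
hand.move((0.0, 0.5, 0.0))             # lift the grasped option object
print(option["pos"])                   # (0.0, 0.5, 0.0)
```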
  • The user 5B performs an operation for causing the avatar object 6B to select one of the option objects 2552_1 to 2552_3, and also performs an action of uttering the voice 2871 saying "rock-paper-scissors!". As a result, the avatar object 6B is controlled to select the option object 2552_3 and to utter the voice 2871.
  • Information 2852 represents the operation result for the first UI object in the plurality of user terminals 800.
  • the information 2852 represents the number of users 8 who have performed an operation on each of the UI buttons 2661_1 to 2661_3.
  • Here, 43 users 8 including the user 8A pressed the UI button 2661_1 to select rock, another 46 users 8 pressed the UI button 2661_2 to select paper, and another 45 users 8 pressed the UI button 2661_3 to select scissors. The information 2852 is desirably displayed, based on the information transmitted from the user terminal 800A, after the avatar object 6B performs the operation of selecting the option object 2552_3.
  • the details of the operation results in the plurality of user terminals 800 may be displayed.
  • the details of the operation result may be list information of the identification information of the user 8 who has selected each option.
  • the list information may be superimposed and displayed.
  • the details of the operation result are not limited to this.
  • the information 2852 is not limited to the aggregated information for each option as shown in the figure.
  • the information 2852 may include the identification information of each of the plurality of users 8 and the identification information of any of the UI buttons 2661_1 to 2661_3 pressed by the user 8 as a list.
  • the information 2852 may be displayed in a scrollable manner.
  • the information 2852 is the identification information of the user 8 and the pressed UI buttons 2661_1 to 2661_3 each time any one of the UI buttons 2661_1 to 2661_3 is pressed on any of the plurality of user terminals 800. It may be updated to additionally include any identification information. In this case, the information 2852 may be automatically scrolled and displayed.
  • the user 5B makes the avatar object 6B select the option object 2552_3, and then performs an operation according to the result of the rock-paper-scissors game on the plurality of user terminals 800.
  • For example, a voice 2872 such as "Congratulations to the 43 people who won with rock!" is uttered.
  • the avatar object 6B is controlled to utter a voice 2872.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the visual field image 1400_A1, the UI area 2661, and the information 2865.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 2871.
  • Thereby, the user 8A can pay attention to the action of the avatar object 6B while hoping that "scissors", which the "rock" selected by the user 8A beats, will be selected, so the sense of presence increases.
  • the avatar object 6B included in the visual field image 1400_A1 is controlled to perform an operation of selecting the option object 2552_3 based on the operation instruction data transmitted from the character control device 110B.
  • At this time, the surface on which "scissors" is drawn is directed toward the user 8A.
  • Thereby, the user 8A can recognize that the avatar object 6B has selected "scissors", against which the user 8A's "rock" wins.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 2872.
  • the user 8A can recognize that there are 43 people who have won the rock-paper-scissors game like himself, and the presence of participating in the rock-paper-scissors game together with the other users 8 increases.
  • Information 2865 represents the result of a rock-paper-scissors game by user 8A.
  • Here, the information 2865 is displayed in the form of a speech balloon containing text indicating a win.
  • the content and mode of the information 2865 are not limited to this.
  • FIG. 29 is a diagram showing an example of a field image and a game screen when the rock-paper-scissors game ends and the game transitions to the next mini-game.
  • The field image 1400_B1 in the character control device 110B includes the progress button object 2241, the option objects 2552_1 to 2552_3, the virtual left hand 1831LB, and the information 2751.
  • the progress button object 2241 is included in the region near the center of the field image 1400_B1.
  • the option objects 2552_1 to 2552_3 are included in the lower right region of the visual field image 1400_B1.
  • When the user 5B performs an action of notifying the transition to the second mini-game by speaking, the avatar object 6B is controlled to utter the voice 2971.
  • the operation of notifying the transition to the second mini-game is not limited to vocalization, but may be a hand gesture or the like, or may be a combination of various operations.
  • Further, when the user 5B performs an operation targeting the progress button object 2241, the avatar object 6B is controlled to perform that operation.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the visual field image 1400_A1, the UI area 2661, and the information 2981.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 2971.
  • The avatar object 6B included in the visual field image 1400_A1 is controlled to perform the operation of pressing the progress button object 2241.
  • the user 8A recognizes that the avatar object 6B transitions to the second mini-game in response to the action of uttering the voice 2971 and pressing the progress button object 2241. In this way, the user 8A can recognize with a sense of reality that the mini-game is switched according to the operation of the avatar object 6B.
  • Information 2981 indicates that the right to advance the next second mini-game in an advantageous manner has been granted by winning the rock-paper-scissors game. This allows the user to recognize that the next second mini-game has become advantageous.
  • FIG. 30 is a flowchart showing the flow of processing in the second mini-game executed by each device constituting the game system 1500.
  • the flow of processing shown in FIG. 30 corresponds to the details of the processing in steps S112 and S208 shown in FIG.
  • the bingo game is applied as the second mini game. Further, in the second mini game, the second condition is set.
  • the second condition is that Bingo is established with the bingo card given to the user 8A.
  • step S501 the virtual space control unit 2106 of the character control device 110B arranges the fourth object in the present invention and the virtual object representing a plurality of options in the first virtual space 11_1B.
  • the fourth object is a virtual object that represents information about the second state.
  • the fourth object may be a virtual panel, a virtual display, a virtual blackboard, a virtual electric bulletin board, or the like capable of displaying the information, but is not limited thereto.
  • a panel object representing a virtual panel similar to the third object is arranged as the fourth object. In other words, when switching from the first mini-game to the second mini-game, the information displayed on the panel object changes.
  • a Bingo ball box object is placed as a virtual object that represents multiple options. It is assumed that the Bingo ball box object contains a plurality of Bingo ball objects each having a number drawn on the surface thereof, and has an opening on the surface from which the Bingo ball object can be taken out.
  • step S502 the virtual space control unit 2106 transmits the information of the panel object and the box object to the user terminal 800A.
  • In step S601, the virtual space control unit 8106 of the user terminal 800A arranges, based on the received information, a second object according to the second state (here, a panel object and a box object) in the first virtual space 11_1A. Further, the virtual space control unit 8106 treats an operation of the avatar object 6B targeting the second object as the second operation. The first object corresponding to the first state is deleted from the first virtual space 11_1A.
  • step S602 the game progress unit 8105 determines whether or not the information indicating the right to advance the second mini-game in an advantageous manner is stored in the memory 820. If No in step S602, the process of step S603 is executed. If Yes in step S602, the process of step S604 is executed.
  • step S603 or S604 the game progress unit 8105 displays the second UI object on the display unit 8702.
  • the second UI object is a UI object according to the second state.
  • the second UI object is a user interface object that accepts an operation from the user 8A in the second mini game.
  • step S603 a normal Bingo card is displayed as the second UI object.
  • step S604 an advantageous bingo card is displayed on the display unit 8702 as the second UI object.
  • a normal bingo card and an advantageous bingo card are cards in which each of the cells arranged in multiple columns and multiple rows contains an arbitrary numerical value. Further, each cell has an operable state and an inoperable state, and is in an inoperable state in the initial state. Further, each cell has an open state and an unopened state, and is in an unopened state in the initial state. It is assumed that each cell is opened by being operated when it is in an operable state. Further, the establishment of Bingo, which is the second condition described above, means that a predetermined number of open cells are lined up vertically, horizontally, or diagonally.
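  • The establishment of bingo as defined above (a full line of open squares, vertically, horizontally, or diagonally) could be checked, for a 3x3 card, as in the following sketch; representing `opened` as a set of (row, column) indices is an illustrative data-structure choice, not a detail of the present disclosure.
```python
# Minimal check of the second condition on an n x n bingo card.
def bingo_established(opened: set, n: int = 3) -> bool:
    rows = any(all((r, c) in opened for c in range(n)) for r in range(n))
    cols = any(all((r, c) in opened for r in range(n)) for c in range(n))
    diag1 = all((i, i) in opened for i in range(n))
    diag2 = all((i, n - 1 - i) in opened for i in range(n))
    return rows or cols or diag1 or diag2

print(bingo_established({(0, 0), (1, 1), (2, 2)}))  # True: diagonal line
print(bingo_established({(0, 0), (0, 1)}))          # False: line incomplete
```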
  • the advantageous bingo card is a card that can advance the second mini-game more advantageously than the normal bingo card.
  • an advantageous bingo card may include a plurality of normal bingo cards.
  • Alternatively, an advantageous bingo card may be a card in which the number of numerical values contained in each cell is larger than in a normal bingo card.
  • the second UI object is not limited to the bingo card as described above, and may be another UI object.
  • the numerical value included in each cell of the bingo card displayed in this step shall be determined for each of the plurality of user terminals 800 sharing the first virtual space 11_1. For example, at the time of executing this step, the numerical value drawn in each cell may be randomly selected and determined from the numerical values in an arbitrary range.
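  • Such per-terminal card generation could be sketched as follows, where an advantageous card simply carries two random values per square (cf. FIG. 33); the value range of 1 to 100 matches the example card described later, and everything else is an assumption.
```python
import random

# Hypothetical card generation: one random value per square on a normal
# card, two on an advantageous card. Uniqueness across squares is not
# enforced, mirroring that some drawn values may coincide.
def make_card(advantageous: bool, n: int = 3, lo: int = 1, hi: int = 100):
    per_square = 2 if advantageous else 1
    return [[{"values": random.sample(range(lo, hi + 1), per_square),
              "operable": False, "open": False}
             for _ in range(n)] for _ in range(n)]

card = make_card(advantageous=True)
print(card[0][0])  # e.g. {'values': [54, 25], 'operable': False, 'open': False}
```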
  • step S504 the virtual space control unit 2106 causes the avatar object 6B to select the bingo ball object according to the movement of the user 5B.
  • Each of the plurality of bingo ball objects is associated with different identification information (number, etc.).
  • step S505 the virtual space control unit 2106 transmits the operation instruction data representing the operation of the avatar object 6B to the user terminal 800A.
  • step S605 the virtual space control unit 8106 causes the avatar object 6B to select the bingo ball object based on the received operation instruction data.
  • In step S606, the game progress unit 8105 determines whether or not the numerical value drawn on the bingo ball object selected by the avatar object 6B is a numerical value included in any of the squares of the displayed bingo card. If No in step S606, step S612, which will be described later, is executed. If Yes in step S606, the next step S607 is executed.
  • step S607 the game progress unit 8105 executes a second process for satisfying at least a part of the second condition based on the second operation of the avatar object 6B.
  • the game progress unit 8105 executes a game process for achieving at least a part of the second condition (the first task in the present invention) according to the operation selected by the avatar object 6B.
  • Specifically, the game progress unit 8105 executes, as the second process, a process of making the square containing the corresponding numerical value operable, in response to the selecting operation by the avatar object 6B.
  • step S608 the game progress unit 8105 accepts an operation on a square including the corresponding numerical value and changes the square to an open state.
  • step S609 the game progress unit 8105 notifies the character control device 110B that the square including the corresponding numerical value has been opened.
  • In step S505, the game progress unit 2105 displays, on the HMD 120, information about the user 8A for whom the square containing the corresponding numerical value has been opened.
  • step S610 the game progress unit 8105 of the user terminal 800A determines whether or not the bingo is established because the square is opened in step S608. If No in step S610, step S612, which will be described later, is executed. If Yes in step S610, the next step S611 is executed.
  • step S611 the game progress unit 8105 notifies the character control device 110B that the bingo has been established.
  • step S506 the game progress unit 2105 displays information on the user 8A for which bingo has been established on the HMD 120.
  • In step S507, the game progress unit 2105 determines whether or not the game has ended.
  • the end condition of the game may be that the Bingo ball object is selected a predetermined number of times by the avatar object 6B. Further, for example, the end condition of the game may be that a predetermined time has elapsed from the selection of the first bingo ball object. Further, for example, the end condition of the game may be that the number of users 8 for which bingo is established exceeds a predetermined number.
  • the conditions for ending the game are not limited to these.
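  • An end determination combining the example conditions above could be sketched as follows; the thresholds (30 selections, a 600-second limit, 10 bingo winners) are arbitrary assumptions for illustration.
```python
import time

# Sketch of the game-end determination (cf. step S507) under the three
# example conditions listed above; any one of them ends the bingo game.
def game_ended(selections: int, started_at: float, bingo_users: int,
               max_selections: int = 30, time_limit_s: float = 600.0,
               max_bingo: int = 10) -> bool:
    return (selections >= max_selections
            or time.monotonic() - started_at >= time_limit_s
            or bingo_users >= max_bingo)

print(game_ended(selections=30, started_at=time.monotonic(), bingo_users=9))
# -> True: the selection count has reached its limit
```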
  • If No in step S507, the process from step S503 is executed. If Yes in step S507, the process of step S508 is executed.
  • step S508 the game progress unit 2105 notifies the user terminal 800A that the game has ended.
  • step S612 the game progress unit 8105 determines whether or not the end of the game has been notified. This step is also executed when No in step S606 or S610. If No in step S612, the process from step S605 is executed. If Yes in step S612, the process of step S613 is executed.
  • step S613 the game progress unit 8105 displays information representing the result of the bingo game on the display unit 8702.
  • FIG. 31 is a diagram schematically showing the first virtual space 11_1 in which the second mini game is in progress.
  • FIG. 31A is a perspective view of the first virtual space 11_1B realized in the character control device 110B.
  • FIG. 31B is a top view of the first virtual space 11_1A realized in the user terminal 800A.
  • an avatar object 6B, a special button object 3141, a panel object 3151, and a box object 3142 are arranged in the first virtual spaces 11_1A and 11_1B. Further, a virtual camera 14_B1 is arranged in the first virtual space 11_1B. Further, a virtual camera 14_A1 is arranged in the first virtual space 11_1A. The avatar object 6B and the virtual cameras 14_A1 and 14_B1 are as described with reference to FIG. 25.
  • the box object 3142 is a rectangular parallelepiped object in which a plurality of bingo ball objects 3144 are arranged inside.
  • the top surface of the box object 3142 includes an opening 3143.
  • the avatar object 6B can perform an operation of selecting one of a plurality of bingo ball objects 3144 by inserting the virtual right hand 1831RB or the virtual left hand 1831LB into the box object 3142 from the opening 3143.
  • Each of the plurality of bingo ball objects 3144 is a spherical object, and a numerical value is drawn on the surface.
  • the numerical value is preferably one of the numerical values in the range that can be displayed on the square of the bingo card as the second UI object.
  • the numerical values drawn on the plurality of bingo ball objects may be different from each other.
  • the numerical values drawn on some of the plurality of bingo ball objects may be the same.
  • The panel object 3151 is configured in the same manner as the panel object 2551, except that information about the second mini-game, instead of information about the first mini-game, is displayed on its surface facing the direction of the user 8A.
  • the special button object 3141 is arranged in the area on the left side with respect to the direction of the avatar object 6B.
  • the “left side” is as described in FIG.
  • FIGS. 32 to 36 are diagrams showing examples of the field image and the game screen while the second mini-game is in progress.
  • In each figure, (A) represents the field image 1400_B1 displayed on the HMD 120 of the character control device 110B by executing steps S501 to S508, and (B) represents the game screen displayed on the display unit 8702 of the user terminal 800A by executing steps S601 to S613.
  • FIG. 32 is a diagram showing a field image and a game screen at the start of the bingo game.
  • the field image 1400_B1 in the character control device 110B includes a special button object 3141, a box object 3142, a virtual right hand 1831RB, a virtual left hand 1831LB, and information 2751.
  • Information 2751 is as described with reference to FIG.
  • the special button object 3141 is arranged on the left side when viewed from the avatar object 6B.
  • the box object 3142 is arranged so that the opening 3143 can be seen from the avatar object 6B.
  • the virtual right hand 1831RB and the virtual left hand 1831LB of the avatar object 6B itself are included in the field image 1400_B1. Thereby, the user 5B can easily operate the virtual right hand 1831RB and the virtual left hand 1831LB to select one of the plurality of bingo ball objects 3144 through the opening 3143 of the box object 3142.
  • When the user 5B performs an action of notifying the start of the second mini-game by speaking, the avatar object 6B is controlled to utter the voice 3271.
  • the action of notifying the start of the second mini-game is not limited to vocalization, but may be a hand gesture or the like, or may be a combination of various actions.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the visual field image 1400_A1 and the UI area 3261.
  • the field image 1400_A1 includes an avatar object 6B, a special button object 3141, a box object 3142, and a panel object 3151.
  • the avatar object 6B is facing the user 8A, so it is facing the front. Further, the virtual right hand 1831RB and the virtual left hand 1831LB of the avatar object 6B are included in the field image 1400_A1. As a result, the user 8A can pay attention to which bingo ball object 3144 the virtual right hand 1831RB and the virtual left hand 1831LB perform the operation of selecting.
  • the special button object 3141 is arranged on the left side of the avatar object 6B, that is, on the right side of the field image 1400_A1.
  • the box object 3142 is displayed in an opaque state on each side, and the internal bingo ball object is displayed so as not to be visible.
  • each of the plurality of bingo ball objects has a position in the first virtual space 11_1A. As a result, the contact determination between the bingo ball object 3144 and the virtual right hand 1831RB or the virtual left hand 1831LB becomes possible, and the bingo ball object 3144 can be selected.
  • The panel object 3151 is arranged behind the avatar object 6B with the surface on which information is displayed facing the user 8A.
  • The panel object 3151 displays information representing a description of the bingo game as the information regarding the second mini-game.
  • The information may include, but is not limited to, information indicating the rules of the bingo game, the set second condition (for example, the establishment of bingo), and the reward when bingo is established (here, being able to participate in the second virtual space).
  • the UI area 3261 includes a normal bingo card 3262.
  • The normal bingo card 3262 contains nine squares 3263 arranged in three columns and three rows. Each square 3263 contains an integer in the range of 1 to 100.
  • When an operation on a square 3263 in the operable state is accepted, a process of storing, in the memory 820, information indicating that the square 3263 is in the open state is executed. Each square 3263 is inoperable in the initial state and is not open.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 3271.
  • the user 8A recognizes that the bingo game starts in response to the action of the avatar object 6B uttering the voice 3271.
  • FIG. 33 is a diagram showing another example of the game screen at the start of the bingo game.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes an advantageous bingo card 3362 instead of the normal bingo card 3262 with respect to the game screen shown in FIG. 32 (B).
  • The advantageous bingo card 3362 contains nine squares 3363 arranged in three columns and three rows. Each square 3363 contains two integers in the range of 1 to 100.
  • The square 3363 is advantageous in that, by containing two integers, it is more likely to be opened than the square 3263.
  • In other respects, each square 3363 is configured in the same manner as each square 3263 of the normal bingo card 3262.
  • In the following description, it is assumed that the normal bingo card 3262 is displayed on the user terminal 800A.
  • FIG. 34 is a diagram showing a field image and a game screen when the avatar object 6B performs an operation of selecting one of a plurality of bingo ball objects 3144 according to the operation of the user 5B.
  • The visual field image 1400_B1 in the character control device 110B differs from the visual field image 1400_B1 shown in FIG. 32(A) in that it includes a state in which the avatar object 6B selects one bingo ball object 3144, and in that the information 3452 and 3453 are further superimposed.
  • the avatar object 6B performs an operation of selecting and extracting one of a plurality of bingo ball objects 3144 from the inside of the box object 3142 by the virtual right hand 1831RB according to the operation of the user 5B.
  • At this time, the bingo ball objects 3144 inside the box object 3142 are not displayed, so the user 5B cannot recognize which numerical value is drawn on a bingo ball object 3144 until it is taken out.
  • In the following, the bingo ball object 3144 on which "54" is drawn is also referred to as the bingo ball object 3144 "54".
  • The details of the process by which the bingo ball object 3144 "54" is selected by the virtual right hand 1831RB are the same as the process by which the option object 2552_3 is selected by the virtual right hand 1831RB in FIG. 28(A), and are therefore not repeated here.
  • When the virtual right hand 1831RB and a bingo ball object 3144 come into contact, the virtual space control unit 2106 may output information indicating the contact to an output device such as the HMD 120, and the user 5B may press a predetermined button of the controller 300 in response to that output.
  • Thereby, the user 5B can easily perform the operation of causing the virtual right hand 1831RB to select a bingo ball object 3144 even though the bingo ball objects 3144 are not displayed in the visual field image 1400_B1.
  • a plurality of bingo ball objects 3144 inside the box object 3142 may be displayed in a manner that does not include the numerical value on the surface.
  • the Bingo ball object 3144 may be displayed in a contour-only manner. In this case, since the user 5B can visually recognize the position of each bingo ball object 3144, the operation of causing the virtual right hand 1831RB to select the bingo ball object 3144 can be easily performed.
  • The user 5B performs an operation for causing the avatar object 6B to select one of the plurality of bingo ball objects 3144, and then performs an action of uttering, as the voice 3471, the numerical value "54" drawn on the selected bingo ball object. As a result, the avatar object 6B is controlled to select the bingo ball object 3144 "54" and to utter the voice 3471.
  • Information 3452 represents the progress of the second mini-game.
  • the information 3452 represents the number of times the avatar object 6B has selected the bingo ball object 3144.
  • the information 3452 is updated every time the avatar object 6B performs an operation of selecting one of the plurality of bingo ball objects 3144.
  • Information 3453 includes operation results for the second UI object in the plurality of user terminals 800.
  • the information 3453 includes the number of users 8 whose bingo card squares have been opened by the bingo ball object 3144 most recently selected by the avatar object 6B, and the number of users 8 whose bingo has been established.
  • Here, 14 users 8 have performed an operation to open the square 3263 or 3363 containing "54", and one user 8 has established bingo.
  • For example, the details of the operation result may be list information of the identification information of the users 8 for whom the square 3263 or 3363 is in the open state.
  • the details of the operation result are not limited to this.
  • the information 3453 is not limited to the aggregated information of the operation results as shown in the figure.
  • The information 3453 may include, as a list, the identification information of the users 8 of each of the plurality of user terminals 800 on which the square 3263 or 3363 is open.
  • the information 3453 may be displayed in a scrollable manner.
  • The information 3453 may be updated to additionally include the identification information of the user 8 each time a square 3263 or 3363 is opened on any of the plurality of user terminals 800. In this case, the information 3453 may be automatically scrolled and displayed.
  • the user 5B performs an operation on the avatar object 6B according to the operation result in the plurality of user terminals 800 in response to the display of the information 3453.
  • For example, a voice 3472 such as "One person has gotten bingo!" is uttered.
  • When the information 3453 includes the identification information of the user 8 who has established bingo, the voice 3472 may further include that identification information, with content such as "User 8A, bingo!".
  • the avatar object 6B is controlled to utter a voice 3472.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the visual field image 1400_A1 and the UI area 3261.
  • The avatar object 6B included in the field image 1400_A1 performs the operation of selecting the bingo ball object 3144 "54".
  • At this time, the surface on which "54" is drawn is directed toward the user 8.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 3471.
  • the user 8A can recognize that the Bingo ball object 3144 on which the numerical value "54" included in his / her bingo card is drawn is selected by the avatar object 6B.
  • The square 3263 containing "54" changes to an operable state.
  • The square 3263 that has changed to an operable state may be displayed in a mode indicating that it can be operated.
  • The user 8A can open the square 3263 by performing an operation on the square 3263 containing "54".
  • The square 3263 containing "54" is then displayed in a mode representing the open state.
  • The mode representing the open state may be, for example, a mode in which the color is changed, or a mode in which a hole is formed in the region of the square, but is not limited to these.
  • Thereby, the user 8A can pay attention to the action of the avatar object 6B while hoping that a bingo ball object 3144 on which a numerical value included in the bingo card 3262 is drawn will be selected, and the sense of presence increases.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 3472.
  • the user 8A can recognize that the other user 8 has established the bingo, and the presence of participating in the bingo game together with the other user 8 is increased.
  • When the voice 3472 includes the identification information of the user 8A as the user 8 who established bingo or who opened a square, the user 8A can feel a sense of being special among the plurality of users 8.
  • FIG. 35 is a diagram showing another example of the game screen when the avatar object 6B selects any of the plurality of bingo ball objects 3144 according to the operation of the user 5B.
  • the game screen shown in FIG. 35 is different from the game screen shown in FIG. 34 (B) in that a plurality of bingo ball objects 3144 are displayed in the field image 1400_A1.
  • the visual field image 1400_B1 displayed by the character control device 110B does not display the plurality of bingo ball objects 3144, as in FIG. 34 (A).
  • the user 8A can recognize the position of the bingo ball object 3144 on which the numerical value included in the bingo card 3262 is drawn.
  • FIG. 36 is a diagram showing a field image and a game screen when Bingo is established according to the operation selected by the avatar object 6B according to the operation of the user 5B.
  • The field image 1400_B1 in the character control device 110B differs from the field image 1400_B1 shown in FIG. 34(A) in the numerical value drawn on the bingo ball object 3144 selected by the avatar object 6B, and in that the information 3652 and 3653 is displayed instead of the information 3452 and 3453.
  • In accordance with the movement of the user 5B, the avatar object 6B selects and takes out the bingo ball object 3144 "25" from the inside of the box object 3142.
  • the details of the operation of selecting the bingo ball object 3144 “25” are the same as the operation of selecting the bingo ball object 3144 “54” described with reference to FIG. 34 (A).
  • The user 5B performs an action of uttering, as the voice 3671, the numerical value "25" drawn on the selected bingo ball object 3144.
  • As a result, the avatar object 6B is controlled to select the bingo ball object 3144 "25" and to utter the voice 3671.
  • Information 3652 is an updated version of the progress of the second mini-game indicated by information 3452. According to the information 3652, the number of selections of the bingo ball object 3144 is 30 times. Here, it is assumed that the number of selections reaches 30 as a condition for ending the bingo game.
  • Information 3653 is an updated operation result for the second UI object in the plurality of user terminals 800 shown in Information 3453. According to the information 3653, 14 users 8 open the square 3263 or 3363 of "25", and a total of 9 users 8 establish Bingo.
  • In response to the display of the information 3652 and 3653, the user 5B causes the avatar object 6B to perform an action according to the progress status of the second mini-game and the operation results on the plurality of user terminals 800.
  • For example, a voice 3672 such as "The bingo game is over! A total of nine people got bingo. Please discuss." is uttered.
  • The case where the information 3653 includes the identification information of the users 8 who have established bingo is the same as described above for the information 3453.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the visual field image 1400_A1, the UI area 3261, and the information 3665.
  • The avatar object 6B included in the field image 1400_A1 is controlled to perform the operation of selecting the bingo ball object 3144 "25" and to utter the voice 3671.
  • The square 3263 containing "25" changes to an operable state.
  • The user 8A can establish bingo by opening this square 3263 through an operation on the square 3263 containing "25".
  • Information 3665 is information indicating that bingo has been established in the user terminal 800A. The information 3665 is displayed in response to the square 3263 containing "25" changing to the open state.
  • the user 8A can recognize that he / she has established Bingo.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 3672.
  • the user 8A can recognize that nine users 8 including himself / herself have established the bingo, and can feel a special feeling among the plurality of users 8.
  • When the voice 3672 includes the identification information of the user 8A as a user who established bingo, the user 8A can feel even more special.
  • FIG. 37 is a flowchart showing the flow of processing in the live distribution part executed by each device constituting the game system 1500. The flow of processing shown in FIG. 37 is executed following the processing shown in FIG.
  • In step S701, the virtual space control unit 2106 of the character control device 110B accepts an operation of moving to the second virtual space 11_2.
  • the operation of moving to the second virtual space 11_2 may be an operation of targeting a predetermined virtual object (for example, a special button object) by the avatar object 6B.
  • the operation of moving to the second virtual space 11_2 may be the movement of the avatar object 6B within a predetermined area (for example, in the vicinity of the virtual door) in the first virtual space 11_1B.
  • the operation of moving to the second virtual space 11_2 may be an operation for a predetermined button of the controller 300.
  • the operation of moving to the second virtual space 11_2B is not limited to these.
  • step S702 the virtual space control unit 2106 defines the second virtual space 11_2B based on the second virtual space data representing the second virtual space 11_2.
  • the second virtual space data is supplied from, for example, the server 600.
  • step S703 the virtual space control unit 2106 arranges the virtual camera 14_B2 in the second virtual space 11_2B.
  • the field-of-view image taken by the virtual camera 14_B2 is a field-of-view image of the second virtual space 11_2B seen from the avatar object 6B.
  • step S704 the virtual space control unit 2106 arranges the avatar object 6B in the second virtual space 11_2B. Further, the avatar object 6B is deleted from the first virtual space 11_1B.
  • step S801 the game progress unit 8105 of the user terminal 800A determines whether or not the predetermined condition is satisfied.
  • the predetermined condition may be whether or not the second condition set in the second mini game is satisfied. If Yes in step S801, step S804, which will be described later, is executed. If No in step S801, the process of the next step S802 is executed.
  • In step S802, the game progress unit 8105 determines whether or not a user operation has been accepted from the user 8A instructing payment of a consideration, or consumption of a valuable item granted to the user 8A based on payment of a consideration.
  • the valuable item may be, for example, a virtual currency.
  • The case of No in step S802 will be described later. If Yes in step S802, the process of the next step S803 is executed.
  • step S803 the game progress unit 8105 executes a billing process or a process of consuming valuable items, and executes the process of step S804.
  • step S804 the game progress unit 8105 accepts an operation instructing participation in the second virtual space 11_2A, and determines whether or not the operation has been accepted. If No in step S804, the process in the user terminal 800A ends. If Yes in step S804, the process of the next step S805 is executed.
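  • The branching of steps S801 to S804 could be summarized as in the following sketch, where users who satisfied the second condition enter free of charge and others may pay a consideration or consume a valuable item such as virtual currency; the fee amount and the field names are assumptions for illustration.
```python
# Illustrative gating for participation in the second virtual space 11_2.
def can_join_live(user: dict, wants_to_pay: bool, fee: int = 100) -> bool:
    if user.get("bingo_established"):           # step S801: condition satisfied
        return True
    if wants_to_pay and user.get("currency", 0) >= fee:
        user["currency"] -= fee                 # step S803: billing/consumption
        return True
    return False                                # leads to the chat part instead

print(can_join_live({"bingo_established": True}, wants_to_pay=False))  # True
print(can_join_live({"currency": 150}, wants_to_pay=True))             # True
print(can_join_live({"currency": 50}, wants_to_pay=True))              # False
```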
  • step S805 the game progress unit 8105 defines the second virtual space 11_2A based on the second virtual space data representing the second virtual space 11_2.
  • the second virtual space data is supplied from, for example, the server 600.
  • step S806 the virtual space control unit 8106 arranges the virtual camera 14_A2 in the second virtual space 11_2A.
  • the position and direction of the virtual camera 14_A2 cannot be operated by the user 8A, and a plurality of users 8 view the common visual field image 1400_A2.
  • step S705 the virtual space control unit 2106 transmits information representing the avatar object 6B to the user terminal 800A.
  • step S807 the virtual space control unit 8106 arranges the avatar object 6B in the second virtual space 11_2A based on the received information. Further, the avatar object 6B is deleted from the first virtual space 11_1A.
  • the second virtual space 11_2 is shared between the user terminal 800A that satisfies the predetermined conditions and the character control device 110B. Further, the avatar object 6B is synchronously arranged between the user terminal 800A and the character control device 110B.
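  • The synchronized relocation of the avatar object 6B, which is arranged in the second virtual space and deleted from the first on every participating device, could be sketched as follows; representing each space as a set of object names is a simplification introduced here.
```python
# Sketch of moving the avatar object 6B between virtual spaces (steps
# S704/S807): add to the destination space and delete from the source.
def move_avatar(spaces: dict, avatar: str, src: str, dst: str) -> None:
    spaces[src]["objects"].discard(avatar)  # avatar disappears from 11_1
    spaces[dst]["objects"].add(avatar)      # avatar appears in 11_2

spaces = {"11_1": {"objects": {"avatar_6B", "panel_3151", "box_3142"}},
          "11_2": {"objects": set()}}
move_avatar(spaces, "avatar_6B", "11_1", "11_2")
print(spaces["11_1"]["objects"])  # panel and box remain, no avatar
print(spaces["11_2"]["objects"])  # {'avatar_6B'}
```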
  • step S706 the virtual space control unit 2106 displays and updates the field image 1400_B2 based on the position and direction of the virtual camera 14_B2 that changes according to the movement of the HMD 120.
  • step S808 the virtual space control unit 8106 displays and updates the field image 1400_A2 based on the position and direction of the virtual camera 14_A2.
  • step S707 the virtual space control unit 2106 controls the operation of the avatar object 6B according to the movement of the user 5B.
  • step S708 the virtual space control unit 2106 transmits the operation instruction data representing the operation of the avatar object 6B to the user terminal 800A.
  • step S809 the virtual space control unit 8106 controls the movement of the avatar object 6B based on the received operation instruction data.
  • the avatar object 6B operates in the second virtual space 11_2A based on the operation of the user 5B. Further, the avatar object 6B disappears from the first virtual space 11_1A.
  • the operation of the avatar object 6B may be distributed for the purpose of having the user 8 enjoy it.
  • the operation of the avatar object 6B may include, but is not limited to, singing, dancing, sports skills, and the like.
  • the live distribution part can be viewed only by the user 8 who satisfies the predetermined condition, and here, the user 8 who satisfies the second condition in the second mini game. Therefore, the motivation of the user 8 to satisfy the second condition in the second mini game is improved. Further, for the user 8 who satisfies the second condition, the special feeling of being able to watch the live distribution part increases.
  • FIGS. 38 to 40 are diagrams showing transition examples of the field image and the game screen from the movement from the first virtual space 11_1 to the second virtual space 11_2 until the live distribution part is started.
  • FIG. 38 (A) is a diagram showing an example of a field image 1400_B1 displayed on the HMD 120 of the character control device 110B when moving from the first virtual space 11_1 to the second virtual space 11_2.
  • the field image 1400_B1 in the character control device 110B includes a special button object 3141, a virtual left hand 1831LB, and information 2751.
  • Information 2751 is as described above.
  • the special button object 3141 is included in the area near the center of the field image 1400_B1. Further, the box object 3142 is not included in the field image 1400_B1.
  • When the user 5B performs an action of uttering the voice 3871, notifying that only the users 8 for whom bingo has been established can participate in the second virtual space 11_2, the avatar object 6B is controlled to utter the voice 3871.
  • Here, the voice 3871 has content such as "I invite the nine people who got bingo to a special stage".
  • the operation of notifying that it is possible to participate in the second virtual space 11_2 is not limited to vocalization, but may be a hand gesture or the like, or may be a combination of various operations.
  • When the user 5B performs an operation targeting the special button object 3141, the avatar object 6B is controlled to perform that operation. Further, the avatar object 6B is controlled so as to disappear from the first virtual space 11_1B in accordance with the operation.
  • the game screen displayed on the display unit 8702 of the user terminal 800A includes the visual field image 1400_A1 and the UI area 3861.
  • the UI area 3861 is different from the UI area 3261 in that it includes a special UI button 3864.
  • the avatar object 6B included in the field image 1400_A1 is controlled to utter the voice 3871.
  • The avatar object 6B included in the visual field image 1400_A1 is controlled to perform the operation targeting the special button object 3141 and then disappear from the first virtual space 11_1.
  • The user 8A recognizes that the avatar object 6B has moved from the first virtual space 11_1 to the second virtual space 11_2 in response to the avatar object 6B uttering the voice 3871 and pressing the special button object 3141.
  • the special UI button 3864 is an example of a UI object displayed when Bingo is established.
  • When an operation on the special UI button 3864 is accepted, the field image 1400_A2 representing the second virtual space 11_2A is displayed.
  • FIG. 39 is a diagram schematically showing the first virtual space 11_1 and the second virtual space 11_2 after the avatar object 6B has moved.
  • FIG. 39 (A) is a perspective view of the first virtual space 11_1 shared between the user terminal 800A and the character control device 110B.
  • the avatar object 6B is controlled to disappear from the first virtual space 11_1 after the operation targeting the special button object 3141 is performed. Therefore, the first virtual space 11_1 shown in FIG. 39 (A) includes the panel object 3151, the special button object 3141, and the box object 3142, but does not include the avatar object 6B.
  • FIG. 39B is a perspective view of the second virtual space 11_2 shared between the user terminal 800A and the character control device 110B.
  • the second virtual space 11_2 includes the avatar object 6B.
  • the avatar object 6B may be displayed in a manner different from that when it exists in the first virtual space 11_1.
  • the different aspects may be, for example, different clothes and belongings (here, a microphone), but are not limited to this.
  • Thereby, the sense that the live distribution part progressing in the second virtual space 11_2 is special is enhanced.
  • FIG. 40 is a diagram showing an example of the visual field image 1400_A2 displayed on the user terminal 800A.
  • the field image 1400_A2 includes an avatar object 6B that operates in the second virtual space 11_2A.
  • the field-of-view image 1400_A2 can be viewed only by the user 8 who has established Bingo.
  • FIG. 41 is a flowchart showing the flow of processing in the chat part executed by the user terminal 800A. The processing flow shown in FIG. 41 is executed when No is obtained in step S802 of FIG. 37.
  • step S901 the game progress unit 8105 displays the UI object corresponding to the chat part on the display unit 8702.
  • the UI object includes a field that accepts the input of a comment and a button that accepts an operation instructing the transmission of the comment.
  • step S902 the game progress unit 8105 acquires information representing a comment input by the user 8A.
  • step S903 the game progress unit 8105 transmits information representing the acquired comment to the server 600 in response to the operation of the user 8A instructing to transmit the comment.
  • step S904 the game progress unit 8105 superimposes and displays the information representing the acquired comment on the visual field image 1400_A1.
  • In step S905, the game progress unit 8105 acquires information representing comments input on other user terminals 800 by other users 8 who do not satisfy the predetermined condition.
  • the information can be acquired by being transmitted from the server 600.
  • step S906 the game progress unit 8105 superimposes and displays the information representing the comment input by the other user 8 on the visual field image 1400_A1.
  • step S907 the game progress unit 8105 determines whether or not to end the chat part.
  • For example, it may be determined that the chat part ends when an operation instructing the end of the chat part is accepted from the user 8A.
  • If No in step S907, the process from step S902 is executed. If Yes in step S907, the process ends.
  • the game progress unit 8105 superimposes and displays the information representing the comment input by the user 8A and the information representing the comment input by the other user 8 on the visual field image 1400_A1.
  • FIG. 42 is a diagram showing an example of a game screen in a chat part.
  • the display unit 8702 includes a field image 1400_A1, a UI area 3261, a field 4231, a transmit button 4232, and comments 4251 to 4254.
  • The field-of-view image 1400_A1 and the UI area 3261 are configured in the same manner as at the time of the transition from the first mini-game to the second mini-game (FIG. 38(B)).
  • the field 4231 and the send button 4232 are examples of UI objects corresponding to the chat part.
  • Field 4231 is a UI object that accepts comment input.
  • the send button 4232 is a UI object that receives a comment transmission instruction. The information input in the field 4231 is transmitted to the server 600 when the operation for the transmission button 4232 is accepted.
  • the comments 4251 and 4254 are information representing comments input by the other user 8 on the other user terminal 800. These information are received from another user terminal 800 via the server 600.
  • the comments 4252 and 4253 are information representing comments input by the user 8A in the user terminal 800A. This information is transmitted to another user terminal 800 via the server 600.
  • the comments 4251 and 4254 are superimposed and displayed on the visual field image 1400_A1 in the order in which they are received by the user terminal 800. Further, the information input by the other user 8 is displayed on the left side of the game screen, and the information input by the user 8A is displayed on the right side of the game screen.
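  • The comment handling of steps S901 to S907 could be sketched as follows: the local user's comments are echoed on the right, comments relayed via the server 600 appear on the left, both in arrival order. The class name ChatView is hypothetical.
```python
# Sketch of the chat part's display logic; alignment encodes the sender.
class ChatView:
    def __init__(self, me: str):
        self.me = me
        self.lines = []  # superimposed on the field image in arrival order

    def on_comment(self, sender: str, text: str) -> None:
        side = "right" if sender == self.me else "left"
        self.lines.append(f"[{side}] {sender}: {text}")

view = ChatView("user_8A")
view.on_comment("user_8C", "So close!")   # received via the server 600
view.on_comment("user_8A", "Next time!")  # the local user's own comment
print("\n".join(view.lines))
```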
  • Before the first play part (the second mini-game) is started, a third play part in which a third task is set may be advanced regardless of the operation of the first user (the user 5B).
  • The game progress unit 8105 of the user terminal 800 may then make the result of the selection by the operation selected by the avatar object 6B in the first play part (the second mini-game) more advantageous to the user, the higher the degree of achievement of the third task in the third play part.
  • The third play part may be, for example, a role-playing game, a quiz game, an action game, a game using position information, or a game of another genre.
  • The third task in the third play part is, for example, a game task set for the player in the game play, and includes the player acquiring a predetermined game item, satisfying a condition for clearing a predetermined play unit (a so-called quest or the like), achieving a given score, and other tasks.
  • In the embodiment described above, the better the result of the first mini-game (rock-paper-scissors), the more advantageous the second mini-game (bingo game) is made for the player (multiple numbers are associated with each square of the bingo card so that bingo is more likely to be established). In the same way, the better the play result of the third play part, the more advantageous the progress of the first mini-game or the second mini-game may be made for the player.
  • The player's operation for the billing process may be an operation of purchasing paid virtual currency in the game program, or an operation in which the player consumes the paid virtual currency (or a game object obtained by the billing process) to instruct that a game object be placed in the virtual space in which the avatar object 6B is placed (so-called tipping, changing the clothes of the avatar object 6B, and the like).
  • When the user 8 obtains a game object, for example, the status of the game object managed in association with the user 8 changes from unusable to usable.
  • the game object may be stored in at least one of the memories included in the system 1500 in association with the user identification information, the user terminal ID, or the like.
  • The third play part is provided as a part that progresses independently of the avatar object 6B, which operates in response to the operation of the user 5B. Further, the third play part can be played before the live game progress part including the second mini-game starts.
  • The third play part may be a single-play part that progresses based on the operation of the user 8 of one user terminal 800, or a multi-play part that progresses based on the operations of the users 8 of a plurality of user terminals 800. Further, the third play part may proceed using a character that operates without the operation of the user 5B.
  • The positions and directions of the virtual cameras 14_A1 and 14_A2 arranged in the first virtual space 11_1A and the second virtual space 11_2A realized on the user terminal 800 may be operable by the user 8.
  • the user may be able to control the positions and directions of the virtual cameras 14_A1 and 14_A2 by operating the display unit 8702 on which the visual field images 1400_A1 and 1400_A2 are displayed.
  • the virtual cameras 14_A1 and 14_A2 may be controlled so as not to be arranged in a predetermined area including the avatar object 6B.
  • the area of the virtual stage where the avatar object 6B exists is defined as the predetermined area.
  • Thereby, the user 8 can have a virtual experience of viewing the avatar object 6 while moving freely to any position in the area other than the stage.
  • the option objects in the first mini-game and the second mini-game may represent a plurality of options with one option object.
  • For example, one dice object represents six options, from 1 to 6, and one coin object represents two options, heads and tails. An operation corresponding to the option object, such as rolling the dice object or tossing the coin object, is applied.
  • The behavior of the option object includes a rolling behavior, or a rising behavior followed by a falling behavior, and the numerical value of the bingo ball object may be determined according to this behavior.
  • the determined number may be displayed on the surface of the selected bingo ball object.
  • at least one bingo ball object may be arranged in the box object 3142.
  • the degree to which the bingo game, which is the second mini-game, becomes advantageous may be determined step by step according to the degree to which the first condition is satisfied in the first mini-game. For example, in a rock-paper-scissors game, when rock-paper-scissors is played a plurality of times, a number of bingo cards corresponding to the number of times the user 8 wins may be given to the user in the bingo game.
  • the user terminal 800 does not have to have the virtual space control unit 8106.
  • The character control device 110B may generate a visual field image for the user terminal 800 in addition to the visual field image displayed on the HMD 120 of that device. In this case, the visual field image for the user terminal 800 is delivered to and reproduced on the user terminal 800.
  • the virtual camera 14_A1 is arranged in addition to the virtual camera 14_B1 in the first virtual space 11_1B realized by the character control device 110B.
  • the character control device 110B may generate a visual field image 1400_A1 based on the virtual camera 14_A1 and distribute it to the user terminal 800A.
  • the visual field image 1400_A2 corresponding to the second virtual space 11_2B can also be generated and distributed by the character control device 110B in the same manner.
  • the visual field images 1400_A1 and 1400_A2 may be generated in real time according to the operation of the user 5B of the character control device 110B and distributed in real time, respectively. Further, the visual field images 1400_A1 and 1400_A2 may be distributed as pre-generated recorded images, respectively.
  • different characters may be applied as the first character and the second character in the present invention.
  • For example, the user 8 who satisfies predetermined conditions in the live game progress part may be allowed to watch a live distribution part presented by an avatar object different from the avatar object 6B that advanced the live game progress part.
  • different character control devices may be applied as the first character control device and the second character control device in the present invention.
  • the HMD set 110C may be applied as the second character control device.
  • the live distribution part proceeds based on the operation of the user 5C (second user).
  • The comments, icons, and the like posted by each user 8 may be made viewable at any time by the user 5 who is the distributor, and the user 5 may respond, through the avatar object 6, to the comments posted by each user 8. The comments, icons, and the like posted by each user 8 may also be reproduced during the execution of the "return delivery" and the "missed delivery" described later.
  • The virtual space (VR space) in which the user is immersed by means of the HMD has been described as an example, but a transmissive HMD may be adopted instead. In that case, the user may be provided with a virtual experience in an augmented reality (AR) space or a mixed reality (MR) space.
  • In this case, the processor may specify coordinate information of the position of the user's hand in the real space and define the position of the target object in the virtual space in relation to that coordinate information. The processor can thereby grasp the positional relationship between the user's hand in the real space and the target object in the virtual space, and can execute processing corresponding to the above-mentioned collision control between the user's hand and the target object.
  • Even after the real-time live distribution in the live game progress part, the live distribution part, or the chat part (hereinafter abbreviated as the "live distribution part, etc.") has ended, the user 8 can request the progress of the completed live distribution part, etc., and advance it again based on the received operation instruction data. As a result, the user 8 can look back at the live distribution, and can still view it even if he or she missed it. In the following, a scene after the end of the live distribution time is assumed.
  • Based on the game program 831, the user terminal 800 is configured to execute the following steps in order to enhance the interest of the game.
  • Specifically, the user terminal 800 (computer) executes a step of requesting the progress of a completed live distribution part, etc. via an operation unit such as the input unit 8701; a step of receiving, from the server 600, the recorded action instruction data related to the completed live distribution part, etc.; and a step of advancing the completed live distribution part, etc. by operating the character (avatar object 6) based on the recorded action instruction data.
  • the recorded action instruction data includes motion data and voice data input by the user 5 associated with the character (avatar object 6).
  • the recorded operation instruction data is preferably stored in the storage unit 6200 of the server 600, and is preferably delivered to the user terminal 800 again in response to a request from the user terminal 800.
  • The progress of the completed live distribution part, etc. based on the recorded operation instruction data may differ depending on whether or not the user 8 has advanced the live distribution part, etc. in real time. Specifically, when it is determined that the user 8 has a track record of advancing the live distribution part, etc. in real time, it is preferable to advance again a live distribution part, etc. similar to the one the user 8 advanced in real time (return delivery). In the return delivery, it is preferable to execute a selective progress of the live distribution part, etc.
  • On the other hand, when it is determined that the user 8 has no record of advancing the live distribution part, etc. in real time, it is preferable to advance a live distribution part, etc. whose progress mode differs from that of the real-time progress (missed delivery). This includes cases where the user could have advanced the real-time live distribution part, etc. but did not actually do so. In the missed delivery, it is preferable to execute a limited progress of the live distribution part, etc.
  • When it is determined that the user 8 has a track record of advancing the live distribution part, etc. in real time, the game progress unit 8105 further receives and analyzes the user behavior history information for that live distribution part, etc. The user action history information is a data set, separate from the contents of the recorded action instruction data, of the actions of the user 8 recorded through input operations during the progress of the live distribution part, etc. The user action history information is associated with the recorded action instruction data, and is preferably stored in the storage unit 6200 of the server 600 or the storage unit 2200 of the character control device 110B; in addition to or instead of this, it may be stored in the storage unit 8200 of the user terminal 800.
  • FIG. 43 is a diagram showing an example of a data structure of operation instruction data processed by the game system 1500 according to the present embodiment.
  • The action instruction data is composed of the items "destination" and "creator", which are meta information, and the items "character ID", "voice", and "movement", which are the contents of the data.
  • the destination designation information is stored in the item "destination".
  • the destination designation information is information indicating to which device the operation instruction data is transmitted.
  • the destination designation information may be, for example, an address unique to the user terminal 800, or may be identification information of the group to which the user terminal 800 belongs. It may be a symbol (for example, "ALL") indicating that the destination is all user terminals 800 satisfying a certain condition.
  • the creation source information is stored in the item "creation source".
  • the creation source information is information indicating which device created the operation instruction data.
  • the creation source information is information related to a user (hereinafter referred to as user-related information) that can identify a specific user, such as a user ID, a user terminal ID, and a unique address of the user terminal.
  • The creation source information may be an ID or an address indicating the server 600 or the character control device 110B; if the creation source is the server 600 or the character control device 110B, the value of the item may be left empty, or the item itself may be omitted from the operation instruction data.
  • In the item "character ID", a character ID for uniquely identifying a character (including the avatar object 6) appearing in this game is stored.
  • the character ID stored here represents which character's action is indicated by the action instruction data.
  • the item "voice” stores voice data to be expressed in the character.
  • Motion data that specifies the movement of the character is stored in the item "movement".
  • the motion data may be motion capture data acquired by the character control device 110B via the motion sensor 420 or the like.
  • the motion capture data may be data that tracks the movement of the entire body of the actor (including user 5), data that tracks the facial expression and mouth movement of the actor, or both. It may be.
  • the motion data may be a motion command group instructing a series of movements of the character specified by an operation input by the user 5 using the controller 300.
  • For example, suppose the commands "raise right hand", "raise left hand", "walk", and "run" are assigned to the buttons A, B, C, and D of the controller 300, respectively. When the operator presses the button A, the button B, the button C, and the button D in succession, a group of motion commands in which the commands "raise right hand", "raise left hand", "walk", and "run" are arranged in that order is stored in the "movement" item as motion data.
  • the voice data and the motion data are included in the operation instruction data in a synchronized state.
  • Thereby, the game progress unit 8105 of the user terminal 800 can operate the character appearing in the game as intended by the creator of the operation instruction data. Specifically, when the operation instruction data includes voice data, the game progress unit 8105 causes the character to speak based on the voice data. When the operation instruction data includes motion data, the game progress unit 8105 moves the character based on the motion data, that is, generates an animation in which the character moves as the motion data specifies. Needless to say, the structure of the operation instruction data described above is also applicable to the first embodiment, and this structure is merely an example and is not limiting. A minimal sketch of such a data structure follows.
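  • The following minimal Python sketch mirrors the items listed for FIG. 43 (destination, creator, character ID, voice, movement). The concrete types, and the `speak`/`animate` calls on the character, are assumptions for illustration, not part of the disclosed system.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ActionInstructionData:
    destination: Optional[str]  # terminal address, group ID, or "ALL" (meta)
    creator: Optional[str]      # user-related info; may be empty for the server (meta)
    character_id: str           # which character the instruction animates (content)
    voice: bytes = b""          # voice data to be uttered by the character (content)
    motion: list = field(default_factory=list)  # capture data or motion commands (content)

def apply_instruction(data: ActionInstructionData, character):
    """Sketch of how a game progress unit might consume the data."""
    if data.voice:
        character.speak(data.voice)  # utter the synchronized voice data
    for command in data.motion:
        character.animate(command)   # e.g. "raise right hand", "walk"
```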
  • FIG. 44 is a diagram showing an example of a data structure of user behavior history information.
  • The user action history information includes the items "action time", "action type", and "action details" for the actions the user performed in the live distribution part, etc., and is associated with the user ID that identifies the user 8. The item "action time" is the time at which the user 8 performed an action in the live distribution part, etc.; the item "action type" indicates the type of the user's action; and the item "action details" is the concrete content of the action of the user 8.
  • In one example, such actions may include the consumption of valuable data by an input operation of the user 8 (the input of a tipped item, billing by purchasing an item, etc.), comment input (for example, inputting a comment in a chat part), and changing an item such as the character's clothing (so-called dress-up).
  • Such actions may also include selecting a time for later playing back a specific progress portion of the live distribution part, etc. (for example, a recording operation for that specific progress portion).
  • such actions may include the acquisition of rewards, points, etc. in the live distribution part or the like.
  • The user action history information is preferably associated with both the data structure of the action instruction data and the data structure of the game information described later with reference to FIG. 45. It should be understood by those skilled in the art that these data structures are merely examples and are not limiting; one possible shape is sketched below.
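  • One possible shape of the user action history information of FIG. 44, sketched in Python; the field names follow the items above, while everything else is an assumption for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UserAction:
    action_time: float  # seconds from the start of the live distribution part, etc.
    action_type: str    # e.g. "tip_item", "purchase", "comment", "dress_up"
    detail: dict        # concrete contents of the action

@dataclass
class UserActionHistory:
    user_id: str               # identifies the user 8
    actions: List[UserAction]  # kept alongside the recorded instruction data
```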
  • FIG. 45 is a diagram showing an example of the data structure of the game information 132 processed by the system 1500 according to the present embodiment.
  • the items provided in the game information 132 are appropriately determined according to the genre, nature, content, etc. of the game, and the exemplary items do not limit the scope of the present invention.
  • the game information 132 is configured to include each item of "play history”, “item”, “delivery history”, and "game object". Each of these items is appropriately referred to when the game progress unit 8105 advances the game.
  • the play history of the user 8 is stored in the item "play history".
  • the play history is information indicating whether or not the play of the user 8 is completed for each live distribution part or the like stored in the storage unit 8200.
  • The play history may include a list of live distribution parts, etc., with statuses such as "played", "unplayed", "playable", and "unplayable" associated with each live distribution part.
  • the item "item” stores a list of items owned by the user 8 as a game medium.
  • the item may be a clothing item worn by a character.
  • the user 8 can customize the appearance of the character by having the character wear the clothing item obtained while participating in the live distribution part or the like.
  • the item "Distribution history” stores a list of videos, so-called back numbers, that have been live-distributed by the user 5 who is the distributor in the past in the live distribution part or the like.
  • the video that is PUSH-distributed in real time can be viewed only at that time.
  • The moving images of past distributions are recorded on the server 600, and can be PULL-distributed in response to a request from the user terminal 800.
  • the back number may be made available for download by the user 8 for a fee.
  • the item "game object” stores data of various objects such as an avatar object 6, a character object, and an operation object such as a virtual hand operated by the controller 300.
  • FIG. 46 is a flowchart showing an example of the basic game progress of a game executed based on the game program according to the present embodiment.
  • the processing flow is applied to the scenes after the end of the live distribution time, for which the real-time live distribution part and the like have already been completed.
  • In step S2001, the input unit 8701 of the user terminal 800 newly requests the progress of a completed live distribution part, etc. In response to this request, the user terminal 800 receives, from the server 600, the recorded operation instruction data related to the completed live distribution part, etc.
  • the recorded action instruction data includes motion data and voice data input by the user 5 associated with the character (avatar object 6).
  • the user terminal 800 may receive various progress record data acquired and recorded along with the movement of the character during the progress of the real-time live distribution part or the like.
  • the progress record data may include viewer behavior data in which the user 8 who participated in the real-time live distribution part or the like behaves in accordance with the movement of the character.
  • The viewer behavior data includes a record of the in-live behavior of all the users 8 who advanced the real-time live distribution part, etc. in real time (that is, the viewers who participated in the live).
  • The viewer behavior data may include messaging content, such as text messages and icons, transmitted by the viewers (users 8) to the character in real time during the live performance.
  • the recorded operation instruction data and progress record data may be received by the user terminal 800 as separate data, and each may be analyzed (rendered).
  • Alternatively, the server 600 or the character control device 110B may combine the pre-recorded operation instruction data and the viewer behavior data so that the user terminal 800 receives the combined data set at one time, which reduces the load of subsequent data analysis (rendering) on the user terminal 800. In the following, it is assumed that the progress record data is combined with the recorded action instruction data (that is, the progress record data is included in the recorded action instruction data), as in the sketch below.
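  • A sketch of the server-side combination of the two data sets; the dictionary layout is an assumption for illustration.

```python
def combine_for_delivery(instruction_data: dict, progress_record: dict) -> dict:
    """Merge progress record data into the recorded operation instruction
    data so the user terminal 800 receives one data set at a time,
    reducing its subsequent analysis (rendering) load."""
    merged = dict(instruction_data)  # copy the recorded instruction data
    merged["viewer_behavior"] = progress_record.get("viewer_behavior", [])
    return merged
```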
  • In step S2003, the game progress unit 8105 determines whether or not the user has a track record of advancing the live distribution part, etc. in real time. The determination may be performed, for example, based on whether there is a record of the operation instruction data having been transmitted to the user terminal 800. Alternatively, it may be executed based on whether the status is "played", with reference to the item "play history" shown in FIG. 45, or based on whether there is a record of past live distribution from the character, with reference to the item "distribution history". Further, when the recorded operation instruction data is already stored in the storage unit 8200 of the user terminal 800, it may be determined that the live distribution part, etc. has already been advanced in real time. The determination may also combine these methods, or use any other method; a sketch of such a combined check follows.
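  • A minimal sketch of the step S2003 determination, combining the cues listed above; the `server` and `local_store` interfaces are assumptions for illustration.

```python
def has_realtime_play_record(user_id, live_id, server, local_store):
    """Combine the three cues the text lists for the step S2003 check."""
    # 1. Was operation instruction data ever transmitted to this terminal?
    if server.was_instruction_data_sent(user_id, live_id):
        return True
    # 2. Is the status "played" in the item "play history" (FIG. 45)?
    if server.play_history(user_id).get(live_id) == "played":
        return True
    # 3. Is recorded operation instruction data already in local storage?
    return local_store.has_recorded_data(live_id)
```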
  • If it is determined in step S2003 that the user 8 has a track record of advancing the live distribution part, etc. in real time (YES), the progress of the completed live distribution part, etc. becomes a "return delivery"; if it is determined that there is no such record (NO), the progress becomes a "missed delivery". As mentioned above, the user experience differs between the return delivery and the missed delivery. When the determination is YES, the processing flow proceeds from step S2003 to step S2004.
  • In step S2004, the game progress unit 8105 acquires and analyzes the user behavior history information for the live distribution part, etc. shown in FIG. 44.
  • the user action history information may be acquired from the server 600, or may be used directly when it is already stored in the storage unit 8200 of the user terminal 800.
  • In step S2005, the game progress unit 8105 executes the re-progress of the completed live distribution part, etc. (that is, the above-mentioned "return delivery"). Here, the recorded operation instruction data and the user action history information analyzed in step S2004 are used to advance the live distribution part, etc. again.
  • In the return delivery, a tipped item input during the real-time live distribution part, etc. may be reflected in the operation mode of the character. For example, when the user 8 has acquired a clothing item to be worn by the character (avatar object), the live distribution part, etc. may be advanced again with the character operating based on that item (that is, wearing the clothing item). In other words, the re-progress reflects the user behavior history information and the reward information, is otherwise the same as the live distribution part, etc. advanced in real time, and is unique to the user.
  • Further, the user 8 can specify a specific action time by using the "action time" data included in the user action history information described with reference to FIG. 44, and selectively advance the live distribution part, etc. from that point. Such re-progress should be feasible based on the "action time" recorded for actions such as the consumption of valuable data by the input operation of the user 8 and the change of items such as the character's clothing. For example, if the user 8 selects the period from 2 minutes 45 seconds to 5 minutes 10 seconds after the start of the live distribution part, etc., the live distribution part, etc. can be advanced again over that period, as in the sketch below.
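  • A sketch of the selective re-progress over an action-time window; function and attribute names are assumptions for illustration.

```python
def replay_window(recorded_actions, start_s, end_s):
    """Return only the actions whose action time falls in the selected span."""
    return [a for a in recorded_actions
            if start_s <= a.action_time <= end_s]

# The example in the text: 2 min 45 s to 5 min 10 s from the start of the part.
# selected = replay_window(history.actions, 165, 310)
```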
  • If it is determined in step S2003 that the user has no record of advancing the live distribution part, etc. in real time, the processing flow proceeds from NO in step S2003 to step S2006. In step S2006, the game progress unit 8105 executes a limited progress of the completed live distribution part, etc. (that is, the above-mentioned "missed delivery").
  • The reason the missed delivery is restricted is that the user 8 can be considered to have waived the right to receive the live distribution even though he or she had it, so it is not necessary to reproduce and present all of the live distribution to the user 8. Even so, the progress of the live distribution part, etc. is executed using the recorded operation instruction data.
  • For example, suppose the user 8 has acquired a clothing item (for example, a necklace) that can be worn by the character (avatar object 6). In the live distribution part, etc. that progressed in real time, the image was synthesized so that the character operated while wearing the item; that is, the movement mode of the character was associated with the clothing item, and the character could be dressed in clothing items such as the necklace. In the missed delivery, by contrast, the movement mode of the character is not associated with the clothing item, etc.; that is, the image composition process in which the character operates while wearing the item is not performed. In this respect, the progress of the completed live distribution part, etc. is limited: it does not reflect clothing items, etc. and is not unique to the user 8.
  • Similarly, in the real-time live distribution part, etc., a tipped item input by the user can be reflected in the operation mode of the character, but there is no such function in the missed delivery (however, a tipped item thrown by another viewer user 8 who was watching the live distribution in real time may be reflected in the operation mode of the character in the missed delivery). In contrast, the return delivery looks back on a real-time live distribution performed in the past, and such items can be reflected as described above.
  • In the missed delivery, unlike the live distribution part that progressed in real time, it is preferable to limit the actions of the user 8 that can be accepted. In one example, the consumption of valuable data by an input operation of the user 8 (tipping, billing by purchasing an item, etc.) may be restricted so as not to be accepted.
  • In the real-time live distribution part, etc., a user interface (UI) including buttons and screens for executing the consumption of valuable data is displayed on the display unit 8702 (not shown), and the user 8 can consume valuable data through input operations on such a UI. In the missed delivery, it is preferable that such a UI be hidden so that the input operation by the user 8 cannot be performed; as a result, the user 8 cannot input a tipped item to support the character. A sketch of this gating follows.
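  • A sketch of the UI gating described above; the `screen` methods and the mode names are assumptions for illustration.

```python
def build_playback_ui(screen, mode):
    """Show the valuable-data UI only outside the missed delivery.

    `mode` is one of "live", "return", "missed" (hypothetical names).
    """
    screen.show_comment_feed()
    if mode != "missed":
        # In real-time viewing and in the return delivery, the user 8 can
        # still consume valuable data (tipping, item purchase).
        screen.show_tip_button()
        screen.show_purchase_button()
    # In the missed delivery the UI stays hidden, so the input operation
    # for consuming valuable data cannot be performed.
```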
  • The live distribution part, etc. includes (but is not limited to) a user participation type event as described in the first embodiment, and provides the user 8 with an interactive experience with the avatar object 6. As examples of the user participation type event, a rock-paper-scissors game, a bingo game, etc. were explained in the first embodiment, but a questionnaire provided by a character, a quiz given by an avatar object, etc. may also be included.
  • Even in the missed delivery, the participation result of such a user participation type event may be fed back to the user 8; for example, the result of a correctness determination is fed back to the user 8. (However, when a user 8 who did not participate in the live in real time answers a questionnaire, a quiz, or the like, the program may automatically make only a simple judgment, such as a correctness judgment, and give feedback.) In the return delivery, if the user 8 gives an answer different from the one given while participating in the live, a display such as "The answer is different from the one during the live" may be output to the user terminal 800 by comparing it with the answer given during the live.
  • Further, the user 8 may be restricted from acquiring predetermined game points in connection with the above feedback. Specifically, as a result of the user 8 participating in the live distribution part, etc. advanced in real time, predetermined game points may be associated with the user 8 and added to the points he or she owns, whereas in the progress of the completed live distribution part, etc., such points are not associated with the user 8. Because no points are added, in a game in which a plurality of users 8 who are game players are ranked based on points, for example, a user 8 advancing the completed live distribution part, etc. does not affect such a ranking. A sketch of this conditional association follows.
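  • A sketch of the conditional point association; the `user` object is an assumption for illustration.

```python
def award_participation_points(user, participated_in_realtime: bool, points: int):
    """Associate points only with real-time participation."""
    if participated_in_realtime:
        user.points += points  # affects rankings based on owned points
    # Advancing a completed live distribution part, etc. adds nothing,
    # so it does not affect such a ranking.
```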
  • the user terminal 800 may request the progress of the completed live distribution part or the like again. That is, it is preferable that the return delivery or the missed delivery can be repeatedly executed a plurality of times. In this case, the processing flow returns to step S2001. Further, in the case of the first embodiment, there are three parts, a live game progress part, a live distribution part, and a chat part. Therefore, the return delivery and the missed delivery may be executed through all of these parts, or may be executed only for a part of them.
  • the user 8 can proceed with the live distribution part or the like again in various modes.
  • The user 8 becomes more attached to the character through the experience of interacting with the character (avatar object 6) with a rich sense of reality, and can therefore play the other parts in which the character appears with greater interest.
  • it has the effect of increasing the immersive feeling of the game and improving the interest of the game.
  • <Modification 1> In the embodiment described above, whether the progress of the completed live distribution part, etc. is a return delivery or a missed delivery is determined based on whether or not the user 8 has a track record of advancing the live distribution part, etc. in real time. Instead, the user 8 may be allowed to select either the return delivery or the missed delivery. Alternatively, regardless of the presence or absence of such a track record, only the missed delivery may be provided to the user.
  • <Modification 2> In the second embodiment, after the end of the return delivery (step S2005 in FIG. 46) or the missed delivery (step S2006 in FIG. 46), the progress of the completed live distribution part, etc. may be requested again; that is, the return delivery or the missed delivery can be repeatedly executed a plurality of times. In the second modification, it is preferable that the second and subsequent return or missed deliveries reflect the record of the previous return or missed delivery.
  • Specifically, the history data of the first delivery is stored in the storage unit 6200 of the server 600. After that, when the recorded operation instruction data related to the completed live distribution part, etc. is requested again from the user terminal 800, the first delivery history data is distributed from the server 600 together with the recorded operation instruction data. The user terminal 800 refers to the received first delivery history data and, if the first return delivery or missed delivery was stopped partway through, resumes the progress of the second return delivery or missed delivery from that continuation point. As a result, the user 8 can execute the return delivery or the missed delivery efficiently.
  • If the first delivery was a return delivery, the return delivery should be executed from the second time onward, and if the first delivery was a missed delivery, the missed delivery should be executed from the second time onward. Further, when the recorded operation instruction data already exists on the user terminal 800, the user terminal 800 need not receive it again, which saves the amount of data received by the user terminal 800.
  • Similarly, when it is determined in step S2003 of FIG. 46 that the user 8 advanced the live distribution part, etc. only partway in real time, it is preferable to restart the progress of the completed live distribution part, etc. from that continuation point.
  • The record of how far the user 8 advanced the live distribution part, etc. in real time can be determined from the user behavior history information described above with reference to FIG. 44; that is, the user behavior history information records up to what time the user 8 progressed in a specific live distribution part, etc., so the resumption point can be derived as sketched below.
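  • A sketch of resuming from the previous stopping point; the history record layout is an assumption for illustration.

```python
def resume_position(delivery_history, live_id):
    """Return the playback position (seconds) at which to restart."""
    record = delivery_history.get(live_id)
    if record and not record["finished"]:
        return record["stopped_at"]  # resume a partially watched delivery
    return 0.0                       # otherwise start from the beginning
```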
  • FIG. 47 shows an example of a screen displayed on the display unit 8702 of the user terminal 800 based on the game program according to the present embodiment, and an example of a transition between these screens.
  • screens include a home screen 8000A, a live selection screen 8000B for live distribution, and a missed selection screen 8000C for missed distribution.
  • A transition from the home screen 8000A to the live selection screen 8000B is possible. From the live selection screen 8000B, transitions to the home screen 8000A and to the missed selection screen 8000C are possible. The actual distribution screen (not shown) is reached from the live selection screen 8000B and from the missed selection screen 8000C.
  • the home screen 8000A displays various menus for advancing the live distribution part or the like on the display unit 8702 of the user terminal 800.
  • When the game progress unit 8105 receives an input operation for starting the game, it first displays the home screen 8000A.
  • the home screen 8000A includes a "live" icon 8020 for transitioning to the live selection screen 8000B.
  • Upon receiving an input operation on the "live" icon 8020 of the home screen 8000A, the game progress unit 8105 causes the display unit 8702 to display the live selection screen 8000B.
  • the live selection screen 8000B presents the live information that can be distributed to the user 8.
  • the notification information regarding one or more live performances for notifying the user 8 of the live distribution time and the like in advance is displayed in a list.
  • the live announcement information includes at least the live delivery date and time.
  • the live announcement information may include free / paid live information, an advertisement image including an image of a character appearing in the live, and the like.
  • The live selection screen 8000B may display the announcement information regarding a live distribution to be distributed in the near future on a pop-up screen 8060.
  • the server 600 searches for one or more user terminals 800 having the right to receive the live distribution.
  • the right to receive livestreaming is conditioned on the fact that the consideration for receiving livestreaming has been paid (for example, holding a ticket).
  • the corresponding live notification information will be displayed on the user terminal 800 that has the right to receive the live distribution.
  • The user terminal 800 accepts a live playback operation, for example a selection operation on the live selection screen 8000B for a live that has reached its distribution time (more specifically, a touch operation on the live image). Upon accepting it, the game progress unit 8105 shifts the display unit 8702 to the actual distribution screen (not shown). As a result, the user terminal 800 can advance the live distribution part, etc. and carry out the live viewing process in real time. When the live viewing process is executed, the game progress unit 8105 operates the character in the live distribution part, etc. based on the received operation instruction data, generates a moving-image reproduction screen including the character (avatar object 6) operating based on that data, and displays it on the display unit 8702.
  • The live selection screen 8000B may display, on the display unit 8702, a "return (x)" icon 8080 for transitioning to the screen displayed immediately before, and a "missed delivery" icon 8100 for transitioning to the missed selection screen 8000C.
  • In response to an input operation on the "return (x)" icon 8080 of the live selection screen 8000B, the game progress unit 8105 shifts from the screen 8000B to the home screen 8000A. In response to an input operation on the "missed delivery" icon 8100 of the live selection screen 8000B, the game progress unit 8105 shifts from the screen 8000B to the missed selection screen 8000C.
  • The missed selection screen 8000C displays, among the delivered information related to one or more lives delivered in the past, the delivered information for which the user 8 did not advance the live distribution part, etc. in real time. When the input unit 8701 of the user terminal 800 receives an input operation on the delivered live information displayed on the missed selection screen 8000C, for example on the image 8300 including the character appearing in the live, the game progress unit 8105 can advance the completed live distribution part, etc. again after the live distribution part has ended.
  • The delivered information about each live may further include the playback time 8120 of each delivered live, the period until the end of delivery (days, etc.) 8140, information 8160 indicating how many days before the present the delivery was performed, the past delivery date and time, and the like.
  • The missed selection screen 8000C includes a "return (<)" icon 8180 for transitioning to the live selection screen 8000B. In response to an input operation on the "return (<)" icon 8180, the game progress unit 8105 transitions to the live selection screen 8000B.
  • Although not limited to this, it is preferable that the missed selection screen 8000C be reachable only from the live selection screen 8000B and not directly from the home screen 8000A. This is because the missed delivery is provided for the user 8 who missed the live distribution, and is only a function accompanying the live distribution function. One of the purposes of this game is to enhance its fun by allowing the user 8 to watch the real-time live distribution, support the character in real time, and deepen the interaction with the character. Therefore, in order to guide the viewer user 8 to watch the live distribution in real time rather than the missed delivery, in which real-time interaction with the character (distributor-side user 5) is not possible, it is preferable that the missed selection screen 8000C not be directly reachable from the home screen 8000A, as in the transition map sketched below.
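  • A sketch of the screen transitions of FIG. 47 as a transition map; the identifiers are assumptions for illustration.

```python
# Allowed transitions between the screens of FIG. 47 (identifiers assumed).
TRANSITIONS = {
    "home_8000A": {"live_8000B"},
    "live_8000B": {"home_8000A", "missed_8000C", "distribution"},
    "missed_8000C": {"live_8000B", "distribution"},
}

def can_transition(src, dst):
    """The missed selection screen is reachable only via the live
    selection screen, never directly from the home screen."""
    return dst in TRANSITIONS.get(src, set())
```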
  • In the above example, the missed selection screen 8000C displays the delivered information for which the user 8 who is the viewer did not advance the live distribution part, etc. in real time; alternatively, the delivered information about all lives delivered in the past may be displayed in a list for each live. In that case, it is preferable that either the return delivery or the missed delivery be executed depending on whether or not the user 8 advanced the live distribution part, etc. in real time: when it is determined that the user has a track record of advancing it in real time, the above-mentioned return delivery is performed, and when it is determined that the user 8 has no such track record, the missed delivery is performed. As described above with respect to FIG. 46, the return delivery and the missed delivery can provide different user experiences.
  • The control blocks of the control unit 2100 (in particular, the operation reception unit 2101, display control unit 2102, UI control unit 2103, animation generation unit 2104, game progress unit 2105, virtual space control unit 2106, and reaction processing unit 2107), the control blocks of the control unit 6100 (in particular, the progress support unit 6101 and shared support unit 6102), and the control blocks of the control unit 8100 (in particular, the operation reception unit 8101, display control unit 8102, UI control unit 8103, animation generation unit 8104, game progress unit 8105, virtual space control unit 8106, and progress information generation unit 8107) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the information processing device including a part or all of the control unit 2100, the control unit 6100, and the control unit 8100 comprises a CPU that executes the instructions of the program, which is software realizing each function; a computer-readable (or CPU-readable) ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the above program and various data are recorded; a RAM (Random Access Memory) into which the above program is loaded; and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium", for example a tape, a disk, a card, a semiconductor memory, a programmable logic circuit, or the like, can be used.
  • the program may be supplied to the computer via any transmission medium (communication network, broadcast wave, etc.) capable of transmitting the program. It should be noted that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the above program is embodied by electronic transmission.
  • (Item 1) The method comprises a first step group executed by a viewing terminal for a viewer to view a real-time distribution performed by a distributor, and a second step group executed by a user terminal of a user other than the viewers who watched the real-time distribution in real time. The first step group includes: a step of arranging an avatar object and a virtual camera in a virtual space shared between the distribution terminal with which the distributor performs the real-time distribution and a plurality of viewing terminals; a step of displaying, on a display unit, a visual field image showing the visual field area of the virtual camera; and, for each state of a game part included in a game that progresses while transitioning between a plurality of states, a step of performing game processing that advances the game in one state based on the movement of the distributor's body and the operation of the user. The game processing in each state includes: a step of receiving motion information indicating the motion of the avatar object, generated by the distribution terminal detecting the movement of the distributor's body; a step of updating the visual field image, in response to receipt of the motion information, so that the avatar object performs the motion indicated by the motion information; a step of determining, in response to receipt of the motion information, whether or not the motion indicated by the motion information is a motion for transitioning the state; and a step of transitioning the progress of the game part from the current state to another state when the motion indicated by the motion information is a motion for transitioning. The second step group includes a step of executing the reproduction of the delivered content realized by the first step group by operating the avatar object based on the motion information.
  • (Item 2) In (Item 1), in the real-time distribution, the operation mode of the avatar object can be associated with a reward acquired during the real-time distribution, while in the reproduction of the delivered content, the operation mode of the avatar object need not be associated with the reward.
  • (Item 3) In (Item 1) or (Item 2), in the reproduction of the delivered content, the consumption of valuable data by the user's input operation via the operation unit of the user terminal may not be accepted.
  • (Item 4) In any one of (Item 1) to (Item 3), points may be associated with the result of participation in the real-time distribution, and points need not be associated with the reproduction of the delivered content.
  • (Item 5) The method is a method executed by a user terminal for the user to view a real-time distribution performed by a distributor. The method includes: a step of arranging an avatar object and a virtual camera in a virtual space shared between the distribution terminal with which the distributor performs the real-time distribution and a plurality of viewing terminals; for each state of a game part included in a game that progresses while transitioning between a plurality of states, a step of performing game processing that advances the game in one state based on the movement of the distributor's body and the operation of the user; as the game processing in each state, a step of transitioning the game part from one state to another state based on operation information indicating the motion of the avatar object, generated by the distribution terminal detecting the movement of the distributor's body; a step of receiving recorded operation information from an external device, the operation information including motion data and voice data input by the distributor; and a step of performing reproduction of the delivered content by operating the avatar object based on the operation information.
  • In a further item based on (Item 5), the reproduction of the delivered content may be executed based on the action instruction data and on a record of the actions performed by the user's input operations via the operation unit of the user terminal while participating in the real-time distribution.
  • In a further item, the action record includes time information, and the reproduction of the delivered content may follow a specification of that time information made by the user's input operation through the operation unit while participating in the real-time distribution.
  • In a further item, the actions include the selection of a specific progress portion by the user's input operation through the operation unit while participating in the real-time distribution, and in the reproduction of the delivered content, the progress of only the selected specific progress portion may be executed.
  • In a further item, the user terminal further includes a display unit configured to be able to display a first screen for displaying a menu related to the real-time distribution, a second screen, transitioned to from the first screen, for displaying the real-time distributions that can be viewed, and a third screen, transitioned to from the second screen, for displaying information about the delivered content.
  • (Item 11) A computer-readable medium storing computer-executable instructions that, when executed, cause a user terminal to execute the second step group of any one of (Item 1) to (Item 4), (Item 9), and (Item 10).
  • (Item 12) A computer-readable medium storing computer-executable instructions that, when executed, cause a user terminal to execute the steps included in the method of any one of (Item 5) to (Item 8), (Item 9), and (Item 10).
  • (Item 13) The computer system is a computer system including a first information processing device and a second information processing device. The first information processing device comprises a memory that stores a first program including the first step group of (Item 1), and a processor that controls the operation of the first information processing device by executing the first program. The second information processing device comprises a memory that stores a second program including the second step group of (Item 1), and a processor that controls the operation of the second information processing device by executing the second program.
  • (Item 14) The information processing device comprises a memory that stores a program including the steps of (Item 5), and a processor that controls the operation of the information processing device by executing the program.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The object of the present invention is to improve the sense of immersion in a game world and to improve the appeal of the game. The present method comprises: a step (S2002) in which a viewing terminal for viewing a real-time distribution arranges an avatar object and a virtual camera in a virtual space shared between a distribution terminal and a plurality of viewing terminals; during the real-time distribution, a visual field image showing the visual field area of the virtual camera is displayed on a display unit; game processing is performed in which, for each state of a game part that progresses while transitioning between a plurality of game states, the game in one state advances based on user operations and on the body movements of a distributor; the game processing in each state consists of updating the visual field image so that the avatar object performs the movement represented by operation information indicating the operations of the avatar object when that operation information is received, and, when the movement represented by the operation information is a movement for making a transition, causing the progress of the game part to transition from the current state to another state; then, recorded operation information is received from an external device by the user terminal of a user other than a viewer who watched the real-time distribution in real time; and a step (S2006) in which, by operating the avatar object based on the operation information, the content that has been distributed is reproduced.
PCT/JP2020/044471 2020-11-30 2020-11-30 Method, computer-readable medium, and information processing device WO2022113335A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022564992A JPWO2022113335A1 (fr) 2020-11-30 2020-11-30
PCT/JP2020/044471 WO2022113335A1 (fr) 2020-11-30 2020-11-30 Method, computer-readable medium, and information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/044471 WO2022113335A1 (fr) 2020-11-30 2020-11-30 Method, computer-readable medium, and information processing device

Publications (1)

Publication Number Publication Date
WO2022113335A1 true WO2022113335A1 (fr) 2022-06-02

Family

ID=81755495

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/044471 WO2022113335A1 (fr) Method, computer-readable medium, and information processing device

Country Status (2)

Country Link
JP (1) JPWO2022113335A1 (fr)
WO (1) WO2022113335A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020049286A (ja) * 2018-05-29 2020-04-02 株式会社コロプラ Game program, method, and information processing device
JP2020171005A (ja) * 2019-04-03 2020-10-15 株式会社Zig Distribution system
JP2020179184A (ja) * 2019-01-11 2020-11-05 株式会社コロプラ Game program, method, and information processing device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020049286A (ja) * 2018-05-29 2020-04-02 株式会社コロプラ Game program, method, and information processing device
JP2020179184A (ja) * 2019-01-11 2020-11-05 株式会社コロプラ Game program, method, and information processing device
JP2020171005A (ja) * 2019-04-03 2020-10-15 株式会社Zig Distribution system

Also Published As

Publication number Publication date
JPWO2022113335A1 (fr) 2022-06-02

Similar Documents

Publication Publication Date Title
JP6770598B2 (ja) Game program, method, and information processing device
JP2021020074A (ja) Game program, method, and information processing device
US20240013502A1 (en) Storage medium, method, and information processing apparatus
JP7344948B2 (ja) System
JP6785325B2 (ja) Game program, method, and information processing device
JP6726322B1 (ja) Game program, method, and information processing device
JP2020044139A (ja) Game program, game method, and information processing device
JP7305599B2 (ja) Program
JP6722320B1 (ja) Game program, game method, and information terminal device
WO2022113335A1 (fr) Method, computer-readable medium, and information processing device
WO2022113330A1 (fr) Method, computer-readable medium, and information processing device
JP6722316B1 (ja) Distribution program, distribution method, computer, and viewing terminal
JP6826626B2 (ja) Viewing program, viewing method, and viewing terminal
JP2022000218A (ja) Program, method, information processing device, and system
JP6754859B1 (ja) Program, method, and computer
JP6639561B2 (ja) Game program, method, and information processing device
JP2020185476A (ja) Program, method, and computer
WO2022137375A1 (fr) Method, computer-readable medium, and information processing device
CN114080260A (zh) Game program, game method, and information terminal device
JP2021037302A (ja) Game program, method, and information processing device
JP2021053182A (ja) Program, method, and distribution terminal
WO2022137377A1 (fr) Information processing method, computer-readable medium, computer system, and information processing device
WO2022137376A1 (fr) Method, computer-readable medium, and information processing device
WO2022113326A1 (fr) Game method, computer-readable medium, and information terminal device
WO2022137343A1 (fr) Information processing method, computer-readable medium, and information processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20963599

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022564992

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20963599

Country of ref document: EP

Kind code of ref document: A1