CN116370954A - Game method and game device - Google Patents

Game method and game device Download PDF

Info

Publication number
CN116370954A
CN116370954A (application number CN202310551775.7A)
Authority
CN
China
Prior art keywords
robot
game
initial setting
user
setting information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310551775.7A
Other languages
Chinese (zh)
Other versions
CN116370954B (en)
Inventor
程楠
杨健勃
拱伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Keyi Technology Co Ltd
Original Assignee
Beijing Keyi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Keyi Technology Co Ltd filed Critical Beijing Keyi Technology Co Ltd
Priority to CN202310551775.7A priority Critical patent/CN116370954B/en
Publication of CN116370954A publication Critical patent/CN116370954A/en
Application granted granted Critical
Publication of CN116370954B publication Critical patent/CN116370954B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A63F13/45 Controlling the progress of the video game
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/67 Generating or modifying game content before or while executing the game program, adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • G06N3/008 Artificial life, i.e. computing arrangements simulating life, based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • A63F2300/5546 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/6027 Methods for processing data by generating or executing the game program using adaptive systems learning from user actions, e.g. for skill level adjustment
    • A63F2300/63 Methods for processing data by generating or executing the game program for controlling the execution of the game in time
    • A63F2300/64 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car

Abstract

The application provides a game method and a game device in which a robot and a user participate in a game together, bringing a better game experience to the user. The game method comprises: acquiring initial setting information of a game; generating, according to the initial setting information, a game scenario for interaction between the user and the robot; and sending indication information to the robot according to the game scenario, the indication information instructing the robot to interact with the user based on the initial setting information.

Description

Game method and game device
Technical Field
Embodiments of the present application relate to the field of artificial intelligence technology, and more particularly, to a game method and a game apparatus.
Background
With the continuous development of artificial intelligence technology, robots of increasingly diverse types have emerged. Among them, the home robot is a relatively common type that can increase a user's happiness and relieve a user's stress through interaction with the user. How to use interaction between the robot and the user to bring a better experience to the user has therefore become a problem to be solved.
Disclosure of Invention
The embodiments of the application provide a game method and a game device in which a robot and a user can participate in a game together, bringing a better game experience to the user.
In a first aspect, there is provided a game method comprising: acquiring initial setting information of a game; generating, according to the initial setting information, a game scenario for interaction between the user and the robot; and sending indication information to the robot according to the game scenario, the indication information instructing the robot to interact with the user based on the initial setting information.
In the embodiments of the application, the robot is combined with the game: the initial setting information of the game is acquired, a game scenario enabling the user to interact with the robot is generated according to the initial setting information, and indication information is sent to the robot to instruct it to interact with the user based on the initial setting information. The robot and the user can thus explore the game world together, bringing an immersive game experience to the user.
The initial setting information includes, for example, at least one of: the emotion of the robot, the motion of the robot, the role of the robot, and the interactive content between the user and the robot.
The role of the robot includes, for example, a bystander (non-player role), a player in a cooperative relationship with the user, or a player in a competitive relationship with the user.
In one implementation, acquiring the initial setting information of the game includes: receiving the initial setting information input by the user; or acquiring initial setting information preset in the game device. That is, the initial setting information of the game may be input by the user at game time, or may be set in the game device in advance.
In one implementation, generating the game scenario for interaction between the user and the robot according to the initial setting information includes: generating the game scenario from the initial setting information using a generative pre-trained language model.
In this implementation, the powerful data-gathering and learning capabilities of a generative pre-trained language model are utilized: the initial setting information is fed into the model, which generates a game scenario for interaction between the user and the robot according to that information.
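As an illustration only, the serialization of initial setting information into a model prompt might look like the following sketch; the function name, field names, and prompt format are assumptions, since the patent does not specify a concrete interface:

```python
def build_prompt(settings: dict) -> str:
    """Serialize initial setting information into a prompt for a
    generative pre-trained language model (hypothetical format)."""
    lines = ["Write an interactive game scenario with the following settings:"]
    # Field names below are illustrative, not defined by the patent.
    for key in ("robot_emotion", "robot_action", "robot_role", "interaction"):
        if key in settings:
            lines.append(f"- {key}: {settings[key]}")
    lines.append("The scenario must include scenes where the user interacts with the robot.")
    return "\n".join(lines)

settings = {
    "robot_role": "player cooperating with the user",
    "robot_emotion": "happiness",
    "interaction": "the user strokes the robot's head",
}
prompt = build_prompt(settings)
```

The resulting string would then be passed to whatever model API the game device uses; that call is deliberately omitted here.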
In one implementation, the game method further comprises: acquiring historical interaction data between the user and the robot; and training the generative pre-trained language model according to the historical interaction data.
The content of each interaction between the user and the robot, together with changes in the robot's emotion, action, and the like, can be recorded, and the recorded historical interaction data can be used as training data for the generative pre-trained language model. A model trained on this historical interaction data can better match the interaction habits of the user and the robot, improving the user's game experience.
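A minimal sketch of how such historical interaction data could be recorded and turned into training samples; the record schema and the (prompt, completion) pair format are assumptions, as the patent does not fix a fine-tuning format:

```python
from dataclasses import dataclass


@dataclass
class InteractionRecord:
    """One user-robot interaction, recorded as a candidate training
    sample for the generative model (illustrative schema)."""
    user_input: str
    robot_response: str
    robot_emotion: str
    robot_action: str


def to_training_samples(history):
    # Each record becomes a (prompt, completion) pair; the exact
    # fine-tuning format depends on the model and is assumed here.
    return [
        (rec.user_input,
         f"{rec.robot_response} [emotion={rec.robot_emotion}, action={rec.robot_action}]")
        for rec in history
    ]


history = [InteractionRecord("hello", "hi there!", "happiness", "wag tail")]
samples = to_training_samples(history)
```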
In one implementation, sending indication information to the robot according to the game scenario, the indication information instructing the robot to interact with the user based on the initial setting information, includes: when information about an emotion and/or action of the robot in the initial setting information appears in the game scenario, sending first indication information to the robot, the first indication information instructing the robot to perform the emotion and/or action in the initial setting information.
In this implementation, cue words for the robot's emotions and/or actions in the initial setting information are recognized as the scenario advances; when a cue word for a corresponding emotion and/or action is recognized, the first indication information is sent to the robot to instruct it to perform that emotion and/or action.
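The cue-word recognition described above could be sketched as a simple substring scan; a real system would likely use more robust matching, and all names here are illustrative:

```python
def detect_cues(scenario_text: str, settings: dict):
    """Scan a scenario passage for emotion/action cue words from the
    initial setting information; the returned cues would be sent to
    the robot as first indication information."""
    cues = settings.get("robot_emotions", []) + settings.get("robot_actions", [])
    return [cue for cue in cues if cue in scenario_text]


settings = {"robot_emotions": ["anger", "fear"], "robot_actions": ["lift a leg"]}
passage = "The robot trembles with fear and decides to lift a leg to climb."
found = detect_cues(passage, settings)
```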
In one implementation, sending indication information to the robot according to the game scenario includes: generating, as the game scenario advances, information about an emotion and/or action of the robot matching a specific plot and/or scene in the game scenario; and sending second indication information to the robot, the second indication information instructing the robot to perform the emotion and/or action matching the specific plot and/or scene.
In this implementation, as the scenario advances, the emotion and/or action of the robot matching the current plot or scene can be generated automatically, and the second indication information is sent to the robot to instruct it to perform that emotion and/or action, increasing the robot's degree of participation in the game.
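As a hedged sketch, matching an emotion and action to the current plot or scene could be as simple as a keyword table, although the patent leaves the mechanism open (a real system might instead query the generative model); the table contents are invented for illustration:

```python
SCENE_EMOTION_TABLE = {
    # Illustrative mapping from scene keywords to a matching robot
    # emotion/action pair; not specified by the patent.
    "victory": ("happiness", "spin in place"),
    "dark forest": ("fear", "droop ears"),
    "farewell": ("sadness", "lower head"),
}


def match_scene(scene_description: str):
    """Return the second-indication payload for the first keyword
    found in the scene description, or None if nothing matches."""
    for keyword, (emotion, action) in SCENE_EMOTION_TABLE.items():
        if keyword in scene_description:
            return {"emotion": emotion, "action": action}
    return None
```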
In one implementation, the game method further comprises: when the interactive content between the robot and the user in the initial setting information appears in the game scenario, detecting whether the interactive content is completed; and continuing to advance the game scenario when the interactive content is completed.
In this implementation, if interactive content between the user and the robot appears in the scenario, the game device may determine whether the interactive content is completed and, when it is, continue to advance the game scenario.
In one implementation, the game method further comprises: when the interactive content between the user and the robot in the initial setting information appears in the game scenario, detecting recognition information sent by the robot, the recognition information indicating that the robot has recognized that the interactive content is completed; and continuing to advance the game scenario when the recognition information is detected.
In this implementation, if interactive content between the user and the robot appears in the scenario, the robot may recognize whether the interactive content is completed. When it is, the robot sends recognition information to the game device to indicate that the corresponding interactive content has been recognized as completed, and when the game device receives this recognition information, it continues to advance the game scenario.
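The two completion-gated advancement variants above share the same control flow, sketched here with an injected `is_completed` predicate standing in for either the game device's own detection or the robot's recognition information; the scene structure is an assumption:

```python
def advance_scenario(scenes, is_completed):
    """Advance through scenario scenes, pausing at any scene whose
    interactive content has not yet been completed (illustrative)."""
    played = []
    for scene in scenes:
        if scene.get("interaction") and not is_completed(scene["interaction"]):
            break  # wait here: interactive content not yet completed
        played.append(scene["text"])
    return played


scenes = [
    {"text": "Scene 1: meet the robot", "interaction": None},
    {"text": "Scene 2: stroke the robot", "interaction": "stroke"},
    {"text": "Scene 3: continue the quest", "interaction": None},
]
done = {"stroke"}  # interactions reported completed (e.g. via recognition info)
played = advance_scenario(scenes, lambda i: i in done)
```

Calling `advance_scenario` again after each completion report would resume the scenario, which mirrors the "continue to advance" language of the claims.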
In a second aspect, there is provided a game device comprising: a transceiver module configured to acquire initial setting information of a game; and a processing module configured to generate, according to the initial setting information, a game scenario for interaction between the user and the robot; the transceiver module being further configured to send indication information to the robot according to the game scenario, the indication information instructing the robot to interact with the user based on the initial setting information.
The initial setting information includes, for example, at least one of: the emotion of the robot, the motion of the robot, the role of the robot, and the interactive content between the user and the robot.
The roles of the robot include, for example, a side, a player in cooperative relationship with the user, or a player in competitive relationship with the user.
In one implementation, the transceiver module is specifically configured to: receive the initial setting information input by the user; or acquire initial setting information preset in the game device.
In one implementation, the processing module is specifically configured to generate the game scenario from the initial setting information using a generative pre-trained language model.
In one implementation, the processing module is further configured to acquire historical interaction data between the user and the robot, and to train the generative pre-trained language model according to the historical interaction data.
In one implementation, the transceiver module is specifically configured to send first indication information to the robot when information about an emotion and/or action of the robot in the initial setting information appears in the game scenario, the first indication information instructing the robot to perform the emotion and/or action in the initial setting information.
In one implementation, the processing module is further configured to generate, as the game scenario advances, information about an emotion and/or action of the robot matching a specific plot and/or scene in the game scenario; and the transceiver module is specifically configured to send second indication information to the robot, the second indication information instructing the robot to perform the emotion and/or action matching the specific plot and/or scene.
In one implementation, the processing module is further configured to detect, when the interactive content between the robot and the user in the initial setting information appears in the game scenario, whether the interactive content is completed, and to continue advancing the game scenario when the interactive content is completed.
In one implementation, the transceiver module is further configured to detect recognition information sent by the robot when the interactive content between the user and the robot in the initial setting information appears in the game scenario, the recognition information indicating that the robot has recognized that the interactive content is completed; the game scenario continues to be advanced when the recognition information is detected.
In a third aspect, there is provided an information sharing platform for providing a plurality of scenarios generated according to the game method described in the first aspect or any implementation manner of the first aspect.
In a fourth aspect, there is provided a gaming device comprising a processor for executing instructions stored in a memory, wherein the instructions, when executed, cause the gaming device to perform a gaming method according to the first aspect or any implementation of the first aspect.
Optionally, the game device further comprises the memory for storing a computer program comprising the instructions.
In a fifth aspect, there is provided a computer readable storage medium comprising computer instructions which, when run on a gaming device, cause the gaming device to perform a gaming method according to the first aspect or any implementation of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of a possible robot according to an embodiment of the present application.
Fig. 2 is a schematic flowchart of a game method provided in an embodiment of the present application.
FIG. 3 is a schematic illustration of interactions between a gaming device, a user, and a robot.
Fig. 4 is a schematic block diagram of a game device provided in an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a game device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the present application will be described below with reference to the accompanying drawings.
In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "plural" or "plurality" means two or more.
The terms "first" and "second" below are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features.
Fig. 1 is a schematic structural diagram of a human-machine interaction device, such as a robot 100, according to an embodiment of the present application.
As shown in fig. 1, the robot 100 includes a processor 110, an actuator 111, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna, a wireless communication module 150, a sensor module 160, an audio module 170, a speaker 170A, a microphone 170B, a camera 180, a display screen 190, and the like.
The processor 110 includes, for example, a graphics processing unit (GPU), a controller, memory, and the like. The different processing units may be separate devices or may be integrated in one or more processors. The controller may serve as the neural hub and command center of the robot 100, generating operation control signals according to instruction operation codes, timing signals, and the like to complete the control of instruction fetching and execution.
The memory is used for storing instructions and data. The memory in the processor 110 may be, for example, a cache, which may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs an instruction or datum again, it can be fetched directly from the cache, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include at least one interface. The interface may include one or more of an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a USB interface, and the like.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is merely an example, and does not limit the structure of the robot 100. In other embodiments, the robot 100 may also use different interfaces in the above embodiments, or a combination of interfaces.
The actuator 111 is used to control movement, rotation, jumping, etc. of the robot 100. Optionally, in some embodiments, if the robot 100 includes a head, a torso, and legs, the actuator 111 is further configured to control the rotation of the torso relative to the legs, the rotation of the legs relative to the torso, the rocking of the torso, the rotation of the head along the torso, or the like. In some embodiments, the actuator 111 may include at least one motor.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the robot 100.
The internal memory 121 is used to store computer-executable program code, which includes instructions. The processor 110 performs the various functional applications and data processing of the robot 100 by executing the instructions stored in the internal memory 121. The internal memory 121 includes a program storage area and a data storage area. The program storage area stores an operating system and application programs required for at least one function, such as a sound playback function or an image playback function. The data storage area stores data created during use of the robot 100, such as audio data. In addition, the internal memory 121 may include high-speed random access memory, and may further include nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or universal flash storage (UFS).
The USB interface 130 is an interface conforming to the USB standard specification, for example a Mini USB interface, a Micro USB interface, or a USB Type-C interface. The USB interface 130 may be used to connect a charger to charge the robot 100, and may also be used to transfer data between the robot 100 and peripheral devices.
The charging management module 140 is configured to receive a charging input from a charger, which may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the robot 100. The charging management module 140 may also supply power to the robot 100 through the power management module 141 while charging the battery 142. The power management module 141 connects the battery 142 and the charging management module 140 to the processor 110.
The wireless communication module 150 may provide solutions for wireless communication applied on the robot 100, such as a wireless local area network (wireless local area networks, WLAN), a wireless fidelity (wireless fidelity, wi-Fi) network, a Bluetooth (BT) network, etc.
In some embodiments, the antenna of the robot 100 and the wireless communication module 150 are coupled such that the robot 100 may communicate with a network and other devices via wireless communication techniques.
The sensor module 160 may include at least one sensor. For example, the sensor module 160 includes a touch sensor, a distance sensor, an attitude sensor, and the like. In some embodiments, the touch sensor is a capacitive sensor and may be disposed at the top of the head, the neck, the back, the abdomen, etc. of the robot 100 for sensing user interactions such as stroking and tapping. The distance sensor is used to measure the distance between the robot 100 and an external object or user. The attitude sensor is, for example, a gyroscope for sensing changes in the attitude of the robot 100.
The audio module 170 is used to convert digital audio information into an analog audio output signal and/or to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170, or a portion of its functional modules, may be disposed in the processor 110. Speaker 170A, also called a "loudspeaker," converts audio electrical signals into sound signals. Microphone 170B converts sound signals into electrical signals.
The robot 100 may implement audio functions such as voice playback, recording, etc. through the audio module 170, speaker 170A, microphone 170B, and processor 110, etc.
The camera 180 is used for capturing still images or video, so that the processor 110 can detect events from the images or video acquired by the camera 180 and respond to them. The shooting direction of the camera 180 may coincide with the direction the front of the robot 100 faces, allowing the robot 100 to simulate the environment "seen by human eyes." The camera 180 may store acquired images in the internal memory 121, or may transmit them directly to the processor 110.
The display screen 190 is used to display information input by the user or to present information and the various menu functions of the robot 100. The display screen 190 may be configured as a liquid crystal display, an organic light-emitting diode display, or the like. Further, the head region of the robot 100 includes a display screen 190 for displaying simulated eyes, each of which may include a simulated pupil and a base-color portion (iris and sclera). The simulated eyes in the display screen 190 may include a left eye and a right eye, or only one eye, and may move on the display screen 190 to look in different directions and at different positions. When two eyes are displayed, both move simultaneously.
In the embodiments of the application, the robot 100 may be a robot having a simulated humanoid form, a robot having a non-humanoid form such as one simulating an animal, or a robot of a non-biological form. That is, the robot 100 may be any device having motion capabilities such as movement and rotation.
In general, when a user plays a game, particularly an open-world game, the game basically runs on the web side, and a robot cannot participate in the game to accompany the user in exploring the game world, so the user's game experience is poor.
Therefore, the embodiments of the application provide a game scheme in which the robot is combined with the game: a game scenario enabling the user to interact with the robot is generated according to the initial setting information of the game, and the robot and the user explore the game world together, bringing a better game experience to the user.
Fig. 2 shows a schematic flowchart of a game method 200 provided in an embodiment of the present application. The game method described in Fig. 2 may be performed by a game device, which may be, for example, a game application (app). The game device may be provided in an electronic apparatus such as a mobile phone or a tablet computer, or in the robot itself. Information may be transmitted between the electronic apparatus and the robot by wireless communication. The robot is, for example, a companion robot, whose structure is shown, for example, in Fig. 1. As shown in Fig. 2, the game method 200 includes some or all of the following steps.
In step 210, initial setting information of a game is acquired.
In step 220, a game scenario for interaction between the user and the robot is generated according to the initial setting information.
In step 230, according to the game scenario, instruction information is sent to the robot, where the instruction information is used to instruct the robot to interact with the user based on the initial setting information.
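The three steps above can be sketched as a single function with injected stand-ins for the unspecified interfaces; every name here is a hypothetical placeholder, since the patent does not fix concrete APIs for steps 210-230:

```python
def game_method(get_settings, generate_scenario, send_indication):
    """Sketch of game method 200: acquire settings (step 210),
    generate a scenario (step 220), and send indication information
    to the robot for each scene (step 230)."""
    settings = get_settings()                # step 210
    scenario = generate_scenario(settings)   # step 220
    # step 230: one indication per scene, based on the initial settings
    return [send_indication(scene, settings) for scene in scenario]


sent = game_method(
    lambda: {"robot_role": "cooperating player"},
    lambda s: ["intro scene", "interaction scene"],
    lambda scene, s: f"indicate:{scene}",
)
```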
In the embodiment of the application, the robot and the game can be combined, initial setting information of the game is obtained, a game scenario capable of enabling the user to interact with the robot is generated according to the initial setting information, and indication information is sent to the robot so as to indicate the robot to interact with the user based on the initial setting information. Thus, the robot and the user can explore the game world together, and an immersive game experience is brought to the user.
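The three steps above can be sketched end-to-end in a few lines. Everything here (the function names, the message format, and the stand-in scenario generator) is an illustrative assumption rather than the patent's actual implementation:

```python
def acquire_initial_settings(user_input=None, preset=None):
    # Step 210: use settings entered by the user, else fall back to a preset.
    return user_input if user_input is not None else preset

def generate_scenario(settings):
    # Step 220: stand-in for the generative-model call described later;
    # here it simply turns each emotion cue into one scenario line.
    return [f"The robot suddenly feels {cue}." for cue in settings["emotions"]]

def play(settings, send_to_robot):
    # Step 230: walk the scenario and send indication information whenever
    # a cue word from the initial settings appears in the current line.
    for line in generate_scenario(settings):
        for cue in settings["emotions"]:
            if cue in line:
                send_to_robot({"type": "emotion", "cue": cue})

sent = []
preset = {"emotions": ["happy", "afraid"]}
play(acquire_initial_settings(preset=preset), sent.append)
print(sent)  # one indication message per emotion cue
```

In a real system `send_to_robot` would transmit over the wireless link described below; here it just records the messages.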
The initial setting information may include interaction information for the interaction between the robot and the user, such as information related to the robot's state, the content of the interaction between the user and the robot, and other game information.
For example, the initial setting information includes at least one of: the emotion of the robot, the motion of the robot, the role of the robot, and the interactive content between the user and the robot.
The emotions of the robot include, for example, anger, happiness, sadness, fear, and the like.
The actions of the robot include, for example, wiggling its ears, wagging its tail, tilting its head, lifting a leg, rotating, and the like.
The roles of the robot include, for example, a bystander, a player in a cooperative relationship with the user, or a player in a competitive relationship with the user. When the robot is a bystander and the user is the player, the game is a single-player scenario game. When both the robot and the user are players, the game is a two-player or multi-player scenario game in which the robot and the user participate together. In this case, the game device advances the scenario and the user cooperates with the robot to complete the scenario game; alternatively, the robot plays an adversarial role, and the user must defeat the robot to complete the game.
The interactive content between the user and the robot may include physical interaction, language interaction, and the like. For example, the user tugs the robot's ear, strokes or soothes the robot, takes a photo of the robot, strikes a predetermined gesture (pose), converses with the robot, and so on.
In step 210, the game device may receive initial setting information input by the user, or may acquire initial setting information preset in the game device. That is, the initial setting information of the game may be input by the user when playing, or may be set in the game device in advance.
The game device acquires the initial setting information, such as the robot's emotions, actions, role, and the interactive content, and generates a game scenario according to this initial setting information.
In one implementation, in step 220, the game device may generate the game scenario based on a generative pre-trained language model.
A generative pre-trained language model is a language-processing model based on deep-learning techniques, for example the Generative Pre-trained Transformer (GPT) models developed by OpenAI, such as GPT-4 or ChatGPT; LaMDA or PaLM-E, developed by Google; ERNIE Bot, developed by Baidu; INTERN-2.5, developed by SenseTime; and the like. A generative pre-trained language model has language-understanding and text-generation capabilities. Specifically, such a model can be trained on a very large corpus, including real-world dialogue, so that it acquires broad knowledge across many domains, can interact according to the context of a conversation, and can communicate in interaction scenarios almost indistinguishable from those with a real human.
Therefore, by exploiting the strong data-aggregation and learning capabilities of a generative pre-trained language model, the initial setting information is input into the model, and the model can generate a game scenario for interaction between the user and the robot according to the initial setting information.
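One way to feed the initial setting information into such a model is to fold it into a single prompt. The prompt wording below is purely a hypothetical sketch; the embodiment does not specify a prompt format:

```python
def build_scenario_prompt(settings):
    # Fold the initial setting information into one prompt for a generative
    # pre-trained language model; every phrasing choice here is hypothetical.
    return (
        "Write a short interactive game scenario for a user and a companion robot.\n"
        f"Robot role: {settings['role']}.\n"
        f"Emotion prompt words the robot can express: {', '.join(settings['emotions'])}.\n"
        f"Action prompt words the robot can perform: {', '.join(settings['actions'])}.\n"
        f"Required user-robot interactions: {', '.join(settings['interactions'])}.\n"
        "Wrap every prompt word in square brackets so the game device can detect it."
    )

prompt = build_scenario_prompt({
    "role": "player cooperating with the user",
    "emotions": ["happy", "afraid"],
    "actions": ["wag tail", "lift leg"],
    "interactions": ["stroke the robot", "take a photo"],
})
print(prompt)
```

The returned string would then be passed to whichever model the game device integrates; asking the model to bracket prompt words makes the later cue-word scanning step trivial.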
In one implementation, the gaming method 200 further includes: acquiring historical interaction data between the user and the robot; and training the generative pre-trained language model according to the historical interaction data.
Robots raised by different users develop different interaction habits. The content of each interaction between the user and the robot, together with changes in the robot's emotions, actions, and the like, can be recorded, and the recorded historical interaction data can be used as training data for the generative pre-trained language model. A model trained on this historical interaction data better matches the interaction habits of the user and the robot, enabling personalized interaction.
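Recorded history could be turned into training examples along these lines; the record layout (utterance, reply, emotion, action) and the bracketed cue-word format are assumptions of this sketch:

```python
def to_training_examples(history):
    # Convert recorded interactions into prompt/completion pairs suitable as
    # fine-tuning data; the record layout and the bracketed cue-word format
    # are illustrative assumptions, not taken from the patent.
    examples = []
    for user_utterance, robot_reply, emotion, action in history:
        examples.append({
            "prompt": f"User: {user_utterance}\nRobot:",
            "completion": f" [{emotion}][{action}] {robot_reply}",
        })
    return examples

history = [("Good morning!", "Good morning! Shall we play?", "happy", "wag tail")]
print(to_training_examples(history))
```

Embedding the emotion and action as bracketed tags in the completion lets one fine-tuning pass teach the model both the reply text and the matching robot behavior.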
After generating the game scenario for the interaction between the user and the robot, the game device may output the game scenario. For example, the game scenario is presented by means of voice, pictures or video.
When the game apparatus is provided in an electronic device such as a mobile phone, the game scenario can be presented through a display screen or a speaker of the electronic device. When the game device is provided in the robot, the game scenario may be presented through a display screen, a speaker, or the like of the robot.
In one implementation, in step 230, when information about an emotion and/or action of the robot from the initial setting information appears in the game scenario, first indication information is sent to the robot, where the first indication information is used to instruct the robot to perform the emotion and/or action in the initial setting information.
Accordingly, the robot receives the first indication information and executes emotion and/or action in the initial setting information according to the first indication information.
As the scenario advances, prompt words for the emotions and/or actions of the robot in the initial setting information need to be recognized; when a prompt word for a corresponding emotion and/or action is recognized, first indication information is sent to the robot to instruct the robot to perform that emotion and/or action. After receiving the first indication information, the robot presents the corresponding emotion and/or performs the corresponding action.
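The prompt-word recognition that triggers the first indication information can be sketched as a simple substring scan (the message fields are hypothetical):

```python
def first_indications(scenario_line, emotion_cues, action_cues):
    # Scan one scenario line for prompt words from the initial setting
    # information and build first indication information for each hit.
    hits = []
    for cue in emotion_cues:
        if cue in scenario_line:
            hits.append({"indication": "first", "kind": "emotion", "cue": cue})
    for cue in action_cues:
        if cue in scenario_line:
            hits.append({"indication": "first", "kind": "action", "cue": cue})
    return hits

line = "Startled by the noise, the robot is afraid and lifts a leg."
print(first_indications(line, ["happy", "afraid"], ["lifts a leg", "wags tail"]))
```

Each returned message would be sent to the robot over the wireless link; the robot then presents the emotion or performs the action named by the cue.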
In another implementation, in step 230, information about emotions and/or actions of the robot that match a specific plot and/or scene in the game scenario may be generated as the game scenario advances, and second indication information is sent to the robot to instruct the robot to perform the emotion and/or action matched to that specific plot and/or scene.
Accordingly, the robot receives the second indication information and performs emotion and/or action matching with the specific plot and/or scene according to the second indication information.
As the scenario progresses, the game device may also automatically generate emotions and/or actions of the robot that match the current plot and/or scene; for example, the generative pre-trained language model automatically parses the game scenario and generates the matching emotions and/or actions. The game device sends second indication information to the robot to instruct it to perform the emotion and/or action matched to the current plot and/or scene, which increases the robot's degree of participation in the game. For example, if a favorite toy or object of the robot appears in a certain scene of the game scenario, the robot may be instructed to show a happy emotion or to perform a tail-wagging action.
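In the described scheme this matching is produced by the language model itself; as a stand-in, a lookup table makes the idea concrete (the trigger phrases and reactions below are invented purely for illustration):

```python
# Hypothetical lookup table from scene triggers to matched reactions; in the
# described scheme the language model produces this matching automatically.
SCENE_REACTIONS = {
    "favorite toy": {"emotion": "happy", "action": "wag tail"},
    "dark cave": {"emotion": "afraid", "action": None},
}

def second_indication(scenario_line):
    # Build second indication information if the line matches a known scene.
    for trigger, reaction in SCENE_REACTIONS.items():
        if trigger in scenario_line:
            return {"indication": "second", **reaction}
    return None

print(second_indication("On the shelf sits the robot's favorite toy."))
```

Unlike the first indication, nothing here comes from the initial setting information; the reaction is derived from the scene itself.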
It can be understood that the game scenario may include only plots matching the initial setting information; alternatively, in addition to those plots, the game scenario may include emotions and/or actions of the robot, matched to specific plots and/or scenes, that are automatically parsed out and generated by the generative pre-trained language model.
In one implementation, the gaming method 200 further includes: when interactive content between the user and the robot appears in the scenario, detecting whether the interactive content is completed, and continuing to advance the game scenario when the interactive content is completed.
That is, if the interactive content between the user and the robot appears in the scenario, the game device may determine whether the interactive content is completed, and when the interactive content is completed, the game scenario is continuously advanced.
For example, if a certain level requires the robot to strike a predetermined gesture in order to pass, the game device needs to detect whether the robot has struck that gesture; for instance, whether the interactive content is completed can be determined from the robot's gesture in a photograph taken by the user. When the interactive content is determined to be completed, the game device continues to advance the game scenario.
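The checkpoint logic can be sketched as follows, with the output of the image-recognition step stubbed out as a list of detected labels (an assumption of this sketch; real gesture recognition is out of scope here):

```python
def checkpoint_passed(detected_labels, required_gesture):
    # detected_labels stands in for the output of an image-recognition model
    # run on the user's photograph; real recognition is out of scope here.
    return required_gesture in detected_labels

def advance(step, detected_labels, required_gesture):
    # Move to the next scenario step only once the gesture is confirmed;
    # otherwise wait at the current checkpoint.
    return step + 1 if checkpoint_passed(detected_labels, required_gesture) else step

print(advance(1, ["robot", "lifted leg"], "lifted leg"))  # passes the checkpoint
print(advance(1, ["robot"], "lifted leg"))                # still waiting
```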
In another implementation, the gaming method 200 further includes: when the interactive content between the user and the robot in the initial setting information appears in the game scenario, detecting the identification information sent by the robot, and continuing to advance the game scenario when the identification information is detected. The identification information is used for indicating that the robot identifies that the interactive content is completed.
Accordingly, the robot recognizes whether the interactive content is completed and, upon recognizing completion, sends identification information to the game device.
That is, if interactive content between the user and the robot occurs in the scenario, the robot can identify whether it has been completed; when it is completed, the robot sends identification information to the game device to indicate that the corresponding interactive content has been identified as completed, and the game device continues to advance the game scenario upon receiving this identification information.
For example, as the game scenario advances, a certain plot requires the user to stroke the robot; when the robot detects that it has been stroked, it can inform the game device that the interactive content is completed by sending identification information, so that the game device can continue to advance the scenario.
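The robot-to-game-device identification message and the device-side handler could look like this; the JSON layout is an illustrative assumption:

```python
import json

def identification_message(interaction):
    # Built on the robot side once it detects the interaction (e.g. being
    # stroked); the JSON layout here is an illustrative assumption.
    return json.dumps({"type": "identified", "interaction": interaction})

def on_message(raw, advance_scenario):
    # Game-device side: resume advancing the scenario when identification
    # information for the completed interaction arrives.
    msg = json.loads(raw)
    if msg.get("type") == "identified":
        advance_scenario(msg["interaction"])

advanced = []
on_message(identification_message("stroke"), advanced.append)
print(advanced)
```

Serializing to JSON keeps the message transport-agnostic, so the same payload could travel over Bluetooth or Wi-Fi as described below.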
Optionally, when the interactive content between the user and the robot appears in the game scenario, the game device sends third indication information to the robot, wherein the third indication information carries specific interactive content which needs to be identified by the robot.
As an example, as shown in fig. 3, assuming the game device is provided in an electronic apparatus such as a mobile phone, a game in the game device may be participated in jointly by three parties: the game device 300, the user 500, and the robot 100. The initial setting information of the game may be input by the user 500 or preset in the game device 300. The game device 300 reads the initial setting information, which includes, for example: emotion prompt words of the robot 100, such as happy and afraid; action prompt words of the robot, such as wiggling its ears, wagging its tail, and lifting a leg; and prompt words for the interactive content between the user 500 and the robot 100, for example, the user 500 tugging the robot's ears, stroking the robot 100, photographing the robot 100, and the like.
Based on these prompt words, the game device 300 uses the generative pre-trained language model to generate a suitable game scenario around the emotion prompt words such as anger, happiness, and fear, the action prompt words such as ear wiggling and tail wagging, and the interactive-content prompt words such as ear tugging, stroking, and photographing. As the game scenario advances, plots or scenes matching these prompt words appear; at that point, the game device 300 sends indication information to the robot 100 carrying the emotion the robot 100 needs to express and/or the action it needs to perform. Upon receiving the indication information, the robot 100 presents the corresponding emotion, such as anger, happiness, or fear, and performs the corresponding action, such as wiggling its ears, wagging its tail, or lifting a leg, when these plots and scenes occur.
Further, as the game scenario advances, a certain level may require the user 500 and the robot 100 to perform the interactive content in the initial setting information in order to pass. For example, the robot 100 may need to perform the leg-lifting action to pass the checkpoint. After the robot 100 performs the leg-lifting action, the user 500 may take a photo of the robot 100 with the mobile phone; after the game device 300 in the phone receives the photo, it identifies whether the robot 100 in the photo has lifted its leg, and if so, the game device 300 determines that the level is passed and continues to output the subsequent game scenario. Alternatively, the robot 100 may send identification information to the game device 300 after lifting its leg, to inform it that the leg-lifting task is completed; upon receiving the identification information, the game device 300 determines that the checkpoint is passed and continues to output the subsequent game scenario. In this way, since the game device 300 can generate a game scenario according to the initial setting information, the robot 100 and the user 500 participate in the game together, greatly improving the user's game experience.
The game device 300 may interact with the robot through a wireless communication module of the electronic apparatus, such as Bluetooth or Wi-Fi. The game device 300 connects via wired connections to the camera module, voice module, and the like of the electronic apparatus, in order to acquire photographs taken by the user, the voice of the user or the robot, and so on.
Fig. 3 is an example in which the game device is provided in an electronic apparatus such as a mobile phone; in practical applications, the game device may instead be provided in the robot. That is, the robot generates and outputs the game scenario and interacts with the user within that scenario.
The present application also provides a game device, as shown in fig. 4, the game device 300 includes a transceiver module 310 and a processing module 320. The transceiver module 310 is configured to obtain initial setting information of a game; the processing module 320 is configured to generate a game scenario for interaction between the user and the robot according to the initial setting information; the transceiver module 310 is further configured to send, according to the game scenario, indication information to the robot, where the indication information is used to instruct the robot to interact with the user based on the initial setting information.
The initial setting information includes, for example, at least one of: the emotion of the robot, the motion of the robot, the role of the robot, and the interactive content between the user and the robot.
The roles of the robot include, for example, a side, a player in cooperative relationship with the user, or a player in competitive relationship with the user.
In one implementation, the transceiver module 310 is specifically configured to: receiving the initial setting information input by a user; alternatively, the initial setting information set in advance in the game device is acquired.
In one implementation, the processing module 320 is specifically configured to generate the game scenario based on a generative pre-trained language model according to the initial setting information.
In one implementation, the processing module 320 is further configured to acquire historical interaction data between the user and the robot, and to train the generative pre-trained language model according to the historical interaction data.
In one implementation, the transceiver module 310 is specifically configured to send, when information of emotion and/or action of the robot in the initial setting information appears in the game scenario, first instruction information to the robot, where the first instruction information is used to instruct the robot to execute the emotion and/or action in the initial setting information.
In one implementation, the processing module 320 is further configured to generate information of emotion and/or action of the robot that matches a specific scenario and/or scene in the game scenario according to the advancement of the game scenario; the transceiver module 310 is specifically configured to send second instruction information to the robot, where the second instruction information is used to instruct the robot to perform emotion and/or action matched with the specific scenario and/or scene.
In one implementation, the processing module 320 is further configured to detect, when the interactive content between the robot and the user in the initial setting information appears in the game scenario, whether the interactive content is completed; and continuing to advance the game scenario when the interactive content is completed.
In one implementation, the transceiver module 310 is specifically configured to detect identification information sent by the robot, where the identification information is used to indicate that the robot has identified that the interactive content is completed.
The present application also provides an information sharing platform for providing a plurality of scenarios generated according to the game method 200 described in any of the above embodiments.
The information sharing platform provides multiple game scenarios generated based on the game method 200. A user can select a desired game scenario on the platform and interact with the robot in the game, with the robot and the user completing the game together. Each user can upload game scenarios generated from their own initial setting information to the platform and download game scenarios shared by other users.
The present application also provides a gaming device, as shown in fig. 5, where gaming device 400 includes a processor 410, where processor 410 is configured to execute instructions stored in a memory, where the instructions, when executed, cause gaming device 400 to perform gaming method 200 described in any of the embodiments above. Optionally, the gaming device 400 further comprises a memory 420, the memory 420 being for storing a computer program comprising instructions.
It should be appreciated that specific details of the gaming apparatus 300 and gaming device 400 may be referred to the foregoing description of the gaming method 200 and are not repeated here for brevity.
The gaming device 400 may, for example, comprise a chip comprising a processor and interface circuitry for providing program instructions or data to the processor for execution of the program instructions to implement the gaming method 200 described in any of the embodiments above. The principle and technical effects of the method are similar to those of the game method 200, and are not repeated here.
The present application also provides a computer readable storage medium comprising computer instructions which, when executed on a gaming device, cause the gaming device to perform the gaming method 200 described in any of the embodiments above. The principle and technical effects of the method are similar to those of the game method 200, and are not repeated here.
The present application also provides a computer program product comprising computer readable code, or a non-transitory computer readable storage medium carrying computer readable code, which, when executed in an electronic device, causes a processor in the electronic device to perform the gaming method described in any of the embodiments above. The principle and technical effects of the method are similar to those of the game method 200, and are not repeated here.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions of equivalents may be made and equivalents will be apparent to those skilled in the art without departing from the scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (23)

1. A game method, wherein the game method is executed by a game device, the game method comprising:
acquiring initial setting information of a game;
generating a game scenario for interaction between a user and the robot according to the initial setting information;
and sending indication information to the robot according to the game scenario, wherein the indication information is used for indicating the robot to interact with a user based on the initial setting information.
2. The game method according to claim 1, wherein the acquiring initial setting information of the game includes:
receiving the initial setting information input by a user; or
the initial setting information preset in the game device is acquired.
3. The game method according to claim 1, wherein the initial setting information includes at least one of the following information:
the emotion of the robot, the motion of the robot, the role of the robot, and the interactive content between the user and the robot.
4. The gaming method according to claim 3, wherein the role of the robot comprises: a bystander, a player in a cooperative relationship with the user, or a player in a competitive relationship with the user.
5. The game method according to any one of claims 1 to 4, wherein generating a game scenario for interaction between a user and a robot according to the initial setting information includes:
and generating the game scenario based on a generative pre-trained language model according to the initial setting information.
6. A gaming method according to claim 5, wherein the gaming method further comprises:
acquiring historical interaction data between a user and the robot;
and training the generative pre-trained language model according to the historical interaction data.
7. The game method according to any one of claims 1 to 4, wherein the sending, to the robot, instruction information for instructing the robot to interact with a user based on the initial setting information according to the game scenario, includes:
when information of emotion and/or action of the robot in the initial setting information appears in the game scenario, first indication information is sent to the robot, and the first indication information is used for indicating the robot to execute the emotion and/or action in the initial setting information.
8. The game method according to any one of claims 1 to 4, wherein the sending, to the robot, instruction information for instructing the robot to interact with a user based on the initial setting information according to the game scenario, includes:
generating emotion and/or action information of the robot matched with a specific scenario and/or scene in the game scenario according to the promotion of the game scenario;
and sending second indication information to the robot, wherein the second indication information is used for indicating the robot to execute emotion and/or action matched with the specific plot and/or scene.
9. A gaming method according to any one of claims 1 to 4, wherein the gaming method further comprises:
detecting whether the interactive content is completed or not when the interactive content between the robot and the user in the initial setting information appears in the game scenario;
and continuing to advance the game scenario when the interactive content is completed.
10. The method of claim 9, wherein the detecting whether the interactive content is completed comprises:
detecting identification information sent by the robot, wherein the identification information is used for indicating that the robot has identified that the interactive content is completed.
11. A game device, the game device comprising:
the receiving and transmitting module is used for acquiring initial setting information of the game;
the processing module is used for generating a game scenario for interaction between a user and the robot according to the initial setting information;
the receiving and transmitting module is further used for sending indication information to the robot according to the game scenario, wherein the indication information is used for indicating the robot to interact with a user based on the initial setting information.
12. The game apparatus of claim 11, wherein the transceiver module is specifically configured to:
receiving the initial setting information input by a user; or
the initial setting information preset in the game device is acquired.
13. The game apparatus according to claim 11, wherein the initial setting information includes at least one of the following information:
the emotion of the robot, the motion of the robot, the role of the robot, and the interactive content between the user and the robot.
14. The game apparatus of claim 13, wherein the role of the robot comprises: a bystander, a player in a cooperative relationship with the user, or a player in a competitive relationship with the user.
15. The gaming apparatus of any one of claims 11 to 14, wherein the processing module is specifically configured to generate the game scenario based on a generative pre-trained language model according to the initial setting information.
16. The gaming apparatus of claim 15, wherein the processing module is further configured to,
acquiring historical interaction data between a user and the robot;
and training the generative pre-trained language model according to the historical interaction data.
17. The gaming apparatus of any one of claims 11 to 14, wherein the transceiver module is specifically configured to send, when information of emotion and/or action of the robot in the initial setting information appears in the game scenario, first indication information to the robot, the first indication information being used for instructing the robot to execute the emotion and/or action in the initial setting information.
18. The gaming apparatus of any of claims 11-14, wherein said processing module is further configured to,
generating emotion and/or action information of the robot matched with a specific scenario and/or scene in the game scenario according to the promotion of the game scenario;
the transceiver module is specifically configured to send second instruction information to the robot, where the second instruction information is used to instruct the robot to execute emotion and/or action matched with the specific scenario and/or scene.
19. The gaming apparatus of any of claims 11-14, wherein said processing module is further configured to,
detecting whether the interactive content is completed or not when the interactive content between the robot and the user in the initial setting information appears in the game scenario;
and continuing to advance the game scenario when the interactive content is completed.
20. The gaming apparatus of claim 19, wherein the transceiver module is further configured to:
and detecting identification information sent by the robot, wherein the identification information is used for indicating the robot to identify that the interactive content is completed.
21. A platform for information sharing, for providing a plurality of game scenarios generated according to the game method of any one of claims 1 to 10.
22. A gaming device, comprising a processor configured to execute instructions stored in a memory to cause the gaming device to perform the game method of any one of claims 1 to 10.
23. A computer-readable storage medium comprising computer instructions which, when run on a gaming device, cause the gaming device to perform the game method of any one of claims 1 to 10.
CN202310551775.7A 2023-05-16 2023-05-16 Game method and game device Active CN116370954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310551775.7A CN116370954B (en) 2023-05-16 2023-05-16 Game method and game device

Publications (2)

Publication Number Publication Date
CN116370954A true CN116370954A (en) 2023-07-04
CN116370954B CN116370954B (en) 2023-09-05

Family

ID=86963593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310551775.7A Active CN116370954B (en) 2023-05-16 2023-05-16 Game method and game device

Country Status (1)

Country Link
CN (1) CN116370954B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190193273A1 (en) * 2016-08-31 2019-06-27 Taechyon Robotics Corporation Robots for interactive comedy and companionship
CN112060080A (en) * 2020-07-31 2020-12-11 深圳市优必选科技股份有限公司 Robot control method and device, terminal equipment and storage medium
CN112668687A (en) * 2020-12-01 2021-04-16 达闼机器人有限公司 Cloud robot system, cloud server, robot control module and robot
CN113332725A (en) * 2021-06-29 2021-09-03 北京中清龙图网络技术有限公司 Game scenario deduction method and device, electronic equipment and storage medium
CN115212580A (en) * 2022-09-21 2022-10-21 深圳市人马互动科技有限公司 Method and related device for updating game data based on telephone interaction



Similar Documents

Publication Publication Date Title
US20210295099A1 (en) Model training method and apparatus, storage medium, and device
US10445917B2 (en) Method for communication via virtual space, non-transitory computer readable medium for storing instructions for executing the method on a computer, and information processing system for executing the method
EP4109408A1 (en) Electronic device for generating image including 3d avatar reflecting face motion through 3d avatar corresponding to face and method of operating same
JP7254772B2 (en) Methods and devices for robot interaction
US10438394B2 (en) Information processing method, virtual space delivering system and apparatus therefor
CN111586318A (en) Electronic device for providing virtual character-based photographing mode and operating method thereof
CN110308792B (en) Virtual character control method, device, equipment and readable storage medium
EP3550812B1 (en) Electronic device and method for delivering message by same
WO2021244457A1 (en) Video generation method and related apparatus
US20180376069A1 (en) Erroneous operation-preventable robot, robot control method, and recording medium
CN106951881A (en) A kind of three-dimensional scenic rendering method, apparatus and system
KR20200092207A (en) Electronic device and method for providing graphic object corresponding to emotion information thereof
WO2019123744A1 (en) Information processing device, information processing method, and program
JP2024050757A (en) Photographing system, photographing method, photographing program, and stuffed toy
CN113205569A (en) Image drawing method and device, computer readable medium and electronic device
CN116370954B (en) Game method and game device
CN105797376A (en) Method and terminal for controlling role model behavior according to expression of user
WO2021036839A1 (en) Camera control method and apparatus, and terminal device
CN114712862A (en) Virtual pet interaction method, electronic device and computer-readable storage medium
CN112742024B (en) Virtual object control method, device, equipment and storage medium
CN113325948B (en) Air-isolated gesture adjusting method and terminal
CN113176827A (en) AR interaction method and system based on expressions, electronic device and storage medium
CN114025854A (en) Program, method, and terminal device
JP2018097879A (en) Method for communicating via virtual space, program for causing computer to execute method, and information processing apparatus for executing program
US20240105173A1 (en) Method and apparatus for providing virtual space in which interaction with another entity is applied to entity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant