CN112717384A - Information processing method and device in game, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112717384A
Authority
CN
China
Prior art keywords
virtual
information
game
prompt information
prompt
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110025526.5A
Other languages
Chinese (zh)
Inventor
胡志鹏
王翌希
卜佳俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Netease Hangzhou Network Co Ltd
Original Assignee
Zhejiang University ZJU
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU, Netease Hangzhou Network Co Ltd filed Critical Zhejiang University ZJU
Priority to CN202110025526.5A
Publication of CN112717384A
Legal status: Pending

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/42 — Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/52 — Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F 13/55 — Controlling game characters or game objects based on the game progress
    • A63F 13/837 — Special adaptations for executing a specific game genre or game mode: shooting of targets
    • A63F 2300/308 — Details of the user interface for receiving control signals generated by the game device
    • A63F 2300/8076 — Features specially adapted for executing a shooting game

Abstract

The application provides an information processing method and apparatus in a game, an electronic device, and a storage medium, and relates to the technical field of games. The method comprises: in response to a detection instruction, emitting a virtual probe into a game scene; and, when the virtual probe touches a target object in the game scene, outputting first prompt information, where the first prompt information is used to prompt that the virtual probe has touched the target object, and the first prompt information comprises: voice information and/or vibration information. By emitting a probe to detect target objects in the game scene in advance, the method prevents the player from being hindered by obstacles during intense combat, enables the attack target to be located accurately, and improves the game experience of visually impaired players.

Description

Information processing method and device in game, electronic equipment and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to an information processing method and apparatus in a game, an electronic device, and a storage medium.
Background
With the continuous advancement of game development technology, games have made great progress in sound-effect accessibility, so that visually impaired players can participate in intense game battles by wearing earphones and distinguishing positions by sound. However, because of their eyesight, visually impaired players find it difficult to recognize the positions of obstacles during the game and are therefore hindered by them.
In the prior art, prompt information is issued only after a visually impaired player touches an obstacle during the game, to prompt the player to avoid it.
However, in the obstacle-avoidance approach described above, the prompt information is delayed, which affects the player's combat efficiency and results in a poor game experience.
Disclosure of Invention
An object of the present application is to provide an in-game information processing method and apparatus, an electronic device, and a storage medium, so as to solve the prior-art problems that obstacle-avoidance prompt information is delayed, the player's combat efficiency is affected, and the player's game experience is poor.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides an in-game information processing method, where a game scene of a target game is at least partially displayed on a graphical user interface of a terminal, the method including:
in response to a detection instruction, emitting a virtual probe into the game scene;
when the virtual probe touches a target object in the game scene, outputting first prompt information, where the first prompt information is used to prompt that the virtual probe has touched the target object, and the first prompt information comprises: voice information and/or vibration information.
Optionally, the emitting a virtual probe into the game scene includes:
emitting the virtual probe into the game scene according to a current aiming direction.
Optionally, the virtual probe does not affect a game attribute of the target object.
Optionally, the method further comprises:
in response to an attack instruction, releasing an attack skill into the game scene.
Optionally, the method further comprises:
outputting second prompt information when the virtual probe bounces back to a player-controlled virtual object via the target object, where the second prompt information is used to prompt that the virtual probe has bounced back to the virtual object via the target object, and the second prompt information comprises: voice information and/or vibration information.
Optionally, the emitting a virtual probe into the game scene in response to the detection instruction includes:
determining a current aiming direction in response to the detection instruction;
and emitting the virtual probe along the current aiming direction at preset time intervals.
Optionally, the virtual probe comprises: a particle object; and the emitting the virtual probe along the current aiming direction at preset time intervals comprises:
emitting the particle object along the current aiming direction at preset time intervals.
Optionally, the virtual probe comprises: a virtual ray; and the emitting the virtual probe along the current aiming direction at preset time intervals comprises:
emitting the virtual ray along the current aiming direction at preset time intervals.
Optionally, the outputting first prompt information when the virtual probe touches a target object in the game scene includes:
determining, according to the current aiming direction, the output intensity of the first prompt information as a preset intensity corresponding to the current aiming direction, where the output intensity includes: volume or frequency;
and outputting the first prompt information at the determined output intensity.
Optionally, the method further comprises:
in response to a detection instruction, emitting a virtual probe toward a selectable object in the game scene, the selectable object being an object in a direction other than the current aiming direction;
when the virtual probe touches a selectable object in the game scene, outputting third prompt information, where the third prompt information is used to prompt that the virtual probe has touched the selectable object, and the third prompt information comprises: voice information and/or vibration information.
Optionally, the method further comprises:
outputting fourth prompt information when the virtual probe bounces back to the virtual object via the selectable object, where the fourth prompt information is used to prompt that the virtual probe has bounced back to the virtual object via the selectable object, and the fourth prompt information comprises: voice information and/or vibration information.
In a second aspect, an embodiment of the present application further provides an in-game information processing apparatus, where a game scene of a target game is at least partially displayed on a graphical user interface of a terminal, the apparatus including: a transmitting module and an output module;
the transmitting module is configured to emit the virtual probe into the game scene in response to a detection instruction;
the output module is configured to output first prompt information when the virtual probe touches a target object in the game scene, where the first prompt information is used to prompt that the virtual probe has touched the target object, and the first prompt information comprises: voice information and/or vibration information.
Optionally, the transmitting module is specifically configured to transmit the virtual probe to the game scene according to a current aiming direction.
Optionally, the virtual probe does not affect a game attribute of the target object.
Optionally, the apparatus further comprises: an attack module;
the attack module is configured to release an attack skill into the game scene in response to an attack instruction.
Optionally, the output module is further configured to output second prompt information when the virtual probe bounces back to a player-controlled virtual object via the target object, where the second prompt information is used to prompt that the virtual probe has bounced back to the virtual object via the target object, and the second prompt information comprises: voice information and/or vibration information.
Optionally, the transmitting module is specifically configured to determine a current aiming direction in response to the detection instruction, and to emit the virtual probe along the current aiming direction at preset time intervals.
Optionally, the virtual probe comprises: a particle object; the emission module is specifically configured to emit the particle object along the current aiming direction at preset time intervals.
Optionally, the virtual probe comprises: a virtual ray; the emitting module is specifically configured to emit the virtual ray along the current aiming direction at preset time intervals.
Optionally, the output module is further configured to determine, according to the current aiming direction, the output intensity of the first prompt information as a preset intensity corresponding to the current aiming direction, where the output intensity includes: volume or frequency, and to output the first prompt information at that intensity.
Optionally, the transmitting module is further configured to emit a virtual probe toward a selectable object in the game scene in response to a detection instruction, where the selectable object is an object in a direction other than the current aiming direction;
the output module is further configured to output third prompt information when the virtual probe touches a selectable object in the game scene, where the third prompt information is used to prompt that the virtual probe has touched the selectable object, and the third prompt information comprises: voice information and/or vibration information.
Optionally, the output module is further configured to output fourth prompt information when the virtual probe bounces back to the virtual object via the selectable object, where the fourth prompt information is used to prompt that the virtual probe has bounced back to the virtual object via the selectable object, and the fourth prompt information comprises: voice information and/or vibration information.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of a method of processing information in a game as provided in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the information processing method in a game as provided in the first aspect.
The beneficial effects of the present application are as follows:
the application provides an information processing method, an information processing device, electronic equipment and a storage medium in a game, wherein the method comprises the following steps: responding to a detection instruction, and transmitting a virtual detection object into a game scene; when the virtual object of detecting touches target object in the recreation scene, output first tip information, first tip information is used for the suggestion virtual object of detecting touches target object, and first tip information includes: voice information, and/or vibration information. In the method, in the process of playing a game by a player with visual impairment, by responding to a detection instruction, the player can be helped to judge the orientation information of a virtual object controlled by the player and a target object in a game scene in a mode of emitting a virtual detection object before the player does not attack, so that the player is helped to identify the game scene in the process of playing the game in advance, the obstacle is effectively avoided, and an attack target is accurately positioned. Compared with the prior art, the method has the advantages that the virtual object controlled by the player sends the prompt information when touching the obstacle, the target object in the game scene is detected in advance by emitting the detector, so that the player is not influenced by the obstacle in the process of fierce combat, the attack target is accurately positioned, and the game experience of the player with the visual disorder is improved.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be regarded as limiting the scope; other related drawings can be obtained from these drawings by those skilled in the art without inventive effort.
FIG. 1 is a schematic flow chart illustrating an information processing method in a game according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 3 is a schematic flow chart of another in-game information processing method according to an embodiment of the present disclosure;
FIG. 4 is a schematic flow chart of yet another in-game information processing method according to an embodiment of the present disclosure;
FIG. 5 is a schematic flow chart of yet another in-game information processing method according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of an information processing apparatus in a game according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the purpose, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below with reference to the accompanying drawings. It should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit its scope of protection; additionally, the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that flowchart operations may be performed out of order, and steps without a logical ordering dependency may be performed in reverse order or simultaneously. Under the guidance of this application, one skilled in the art may add one or more other operations to a flowchart, or remove one or more operations from it.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
FIG. 1 is a schematic flow chart of an in-game information processing method according to an embodiment of the present disclosure, and FIG. 2 is a schematic diagram of a graphical user interface according to an embodiment of the present application. The method may be executed by a device such as a terminal or a server. A game scene of the target game is at least partially displayed on a graphical user interface of the terminal. As shown in FIG. 1, the method may include:
S101: in response to a detection instruction, emit a virtual probe into the game scene.
One application scenario of the present solution is a visually impaired player playing a game: such a player cannot observe the actual situation of the game scene and is therefore more hindered during play.
Alternatively, the target game may include, but is not limited to, casual games and multiplayer online games, such as: shooting games, etc.
In one implementation, as shown in fig. 2, after the player starts the target game on the terminal, the graphical user interface displaying the game scene of the target game may include a trigger control. The trigger control may be placed at a preset position on the graphical user interface so that a visually impaired player can learn its position in advance and thus accurately input a detection instruction through it during the game. In response to the detection instruction input by the player through the trigger control, a virtual probe is emitted into the game scene; the virtual probe helps the player detect obstacles or objects to be attacked in the game scene. Optionally, the player may trigger the detection instruction by clicking or long-pressing the trigger control. The trigger control may be of any shape and is not limited to the shape shown in fig. 2.
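The emission step S101 can be sketched in a few lines of code. This is purely an illustrative Python sketch (the patent contains no code); the names `Probe`, `on_detect_instruction`, and `step`, the 2-D coordinates, and the default speed are all assumptions:

```python
import math
from dataclasses import dataclass

@dataclass
class Probe:
    """A virtual probe flying through the scene at uniform speed."""
    x: float
    y: float
    dx: float      # unit vector of the current aiming direction
    dy: float
    speed: float   # scene units per second

def on_detect_instruction(px: float, py: float,
                          aim_angle: float, speed: float = 50.0) -> Probe:
    """Emit a probe from the player's position along the current aiming direction."""
    return Probe(px, py, math.cos(aim_angle), math.sin(aim_angle), speed)

def step(probe: Probe, dt: float) -> None:
    """Advance the probe by dt seconds of uniform-speed flight."""
    probe.x += probe.dx * probe.speed * dt
    probe.y += probe.dy * probe.speed * dt
```

In an engine, `on_detect_instruction` would be bound to whichever trigger (control, gesture, or key press) the player uses, and `step` would run once per frame.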
In another implementation, the graphical user interface may not include a trigger control, and the virtual probe may be emitted into the game scene in response to a preset gesture input by the player on the graphical user interface, such as drawing a circle, drawing a V, sliding up or down with two fingers, or double-tapping the back of the terminal.
In other implementations, the virtual probe may also be emitted into the game scene in response to a key-press operation input by the player on the terminal, such as double-clicking the volume key, or pressing the volume key and the lock-screen key simultaneously.
The above lists several possible operations for triggering the detection instruction by way of example; in practical applications, the triggering operations are not limited to those listed, and other implementations are possible that are not enumerated here.
S102: when the virtual probe touches a target object in the game scene, output first prompt information, where the first prompt information is used to prompt that the virtual probe has touched the target object, and the first prompt information comprises: voice information and/or vibration information.
Optionally, the target object may include: a target obstacle or a target avatar. The target virtual character may refer to a virtual character other than the virtual object controlled by the player.
In some embodiments, when the virtual probe touches the target object, first prompt information is output to the player to indicate the moment of contact. The first prompt information may be voice information, vibration information, or both. For a visually impaired player, voice and vibration are strongly perceptible, so using them as prompt information allows such a player to acquire the information reliably.
Optionally, after the first prompt information is output, the player may determine the orientation information between the player-controlled virtual object and the target object according to the time at which the first prompt information is received and the flight speed of the virtual probe, where the orientation information includes: the direction of the target object relative to the virtual object, and the distance of the target object from the virtual object. The virtual object refers to the game object controlled by the player in the game, and may be a virtual game character or a virtual game item.
It should be noted that, in a game scene, an emitted virtual object is usually set to move at a uniform speed, for example a fired bullet or a shot arrow. In this application it is assumed that the emitted virtual probe moves at a uniform speed.
In one implementation, the player inputs the detection instruction while the controlled virtual object is stationary, for example in a resting, lying-down, aiming, or shooting state. Through simple physical calculation, the player can then estimate the distance from the target object to the virtual object from the time at which the detection instruction was input, the time at which the first prompt information was received, and the flight speed of the virtual probe, and can determine the bearing of the target object from the direction the virtual object faced when the detection instruction was input.
In another implementation, the player inputs the detection instruction while the controlled virtual object is in motion, for example walking or running; in this embodiment the virtual object is also assumed to move at a uniform speed. The player can then estimate the distance from the target object to the virtual object from the time at which the detection instruction was input, the time at which the first prompt information was received, the flight speed of the virtual probe, and the movement speed of the virtual object, and can determine the bearing of the target object from the direction the virtual object faced when the detection instruction was input.
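The two estimates above reduce to uniform-motion arithmetic. A minimal sketch, assuming the probe speed and (in the moving case) the player's speed are both uniform and the player moves along the aiming direction; the function names and units are illustrative, not from the patent:

```python
def distance_stationary(probe_speed: float,
                        t_emit: float, t_prompt: float) -> float:
    """Stationary player: the probe's one-way flight time times its speed
    gives the distance from the virtual object to the target at emission."""
    return probe_speed * (t_prompt - t_emit)

def distance_moving(probe_speed: float, player_speed: float,
                    t_emit: float, t_prompt: float) -> float:
    """Moving player: subtract the ground the player covered during the
    probe's flight to get the remaining distance at prompt time."""
    flight = t_prompt - t_emit
    return probe_speed * flight - player_speed * flight
```

For example, with a probe speed of 50 units/s, a prompt 0.4 s after emission puts a stationary player 20 units from the target; a player closing at 10 units/s is 16 units away when the prompt arrives.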
The above estimates of the orientation information between the target object and the virtual object assume that the flight speed of the virtual probe and the movement speed of the virtual object are both uniform.
In practical applications, the flight speed of the virtual probe and the movement speed of the virtual object may be non-uniform. The orientation information can still be estimated by the above method, but in that case the calculation is somewhat harder and the player needs more reaction time.
In summary, the in-game information processing method provided by this embodiment includes: in response to a detection instruction, emitting a virtual probe into a game scene; and, when the virtual probe touches a target object in the game scene, outputting first prompt information, where the first prompt information is used to prompt that the virtual probe has touched the target object and comprises voice information and/or vibration information. In this method, while a visually impaired player is playing, responding to a detection instruction by emitting a virtual probe before the player attacks helps the player judge the orientation information between the player-controlled virtual object and a target object in the game scene, so that the player can recognize the game scene in advance, effectively avoid obstacles, and accurately locate an attack target. Compared with the prior art, in which prompt information is issued only when the player-controlled virtual object touches an obstacle, detecting target objects in advance by emitting a probe keeps the player from being hindered by obstacles during intense combat, allows the attack target to be located accurately, and improves the game experience of visually impaired players.
Optionally, in step S101, the emitting a virtual probe into the game scene may include: emitting the virtual probe into the game scene according to the current aiming direction.
Optionally, the current aiming direction may be determined based on the current orientation of the player-controlled virtual object, or based on its current field-of-view orientation. The current orientation may be taken as the body orientation of the player-controlled virtual object at the moment the player inputs the detection instruction; the current field-of-view orientation may refer to the eye orientation of the player-controlled virtual object at that moment.
Optionally, the virtual probe described in this application does not affect the game attributes of the target object. That is, the virtual probe has only a detection function; unlike virtual items used for attacking in a game scene, such as virtual bullets or virtual arrows, it has no ability to change a game attribute of the target object (e.g., its life value or appearance) and causes it no damage or destruction.
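A read-only probe of this kind is naturally realized as a raycast. The sketch below is an assumption-laden illustration (hypothetical names, obstacles modeled as circles): it only reads obstacle positions and never mutates any game attribute such as a life value, which is exactly what distinguishes the probe from an attack projectile:

```python
import math

def raycast_probe(ox: float, oy: float, dx: float, dy: float,
                  obstacles: list, max_range: float = 100.0):
    """Nearest hit of a ray from (ox, oy) along unit direction (dx, dy)
    against circular obstacles given as (cx, cy, radius) tuples.
    Returns the distance to the nearest touched object, or None."""
    best = None
    for (cx, cy, r) in obstacles:
        tox, toy = cx - ox, cy - oy
        t = tox * dx + toy * dy              # projection of center onto ray
        if t < 0 or t > max_range:
            continue                         # behind the player or too far
        d2 = tox * tox + toy * toy - t * t   # squared perpendicular distance
        if d2 <= r * r:
            hit = t - math.sqrt(r * r - d2)  # entry point along the ray
            if best is None or hit < best:
                best = hit
    return best
```

A virtual-ray probe (as in the claim above) would call this once per detection instruction; a particle-object probe would instead advance a projectile frame by frame and test overlap, but either way the obstacle data is only read.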
Optionally, the method of the present application may further include: in response to an attack instruction, releasing an attack skill into the game scene.
It should be noted that the above embodiments describe detecting the target object in the game scene by emitting the virtual probe before the player-controlled virtual object attacks. In this embodiment, after the orientation information between the target object and the virtual object has been detected, an attack instruction input by the player may further be responded to, so as to release an attack skill into the game scene.
When the target object is an obstacle, the player can avoid it according to the determined orientation information between the target object and the virtual object and then input an attack instruction to release an attack skill at the attack target. When the target object is itself the target to be attacked, the player can input an attack instruction according to the determined orientation information so as to release an attack skill at it, thereby improving the player's combat experience.
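The detect-then-attack flow can be summarized as a small decision helper. This is entirely illustrative; the patent does not prescribe this logic, and the names and the weapon-range check are assumptions:

```python
def on_attack_instruction(bearing: float, distance: float,
                          weapon_range: float) -> dict:
    """Release the attack skill toward the bearing the probe revealed,
    or report that the player should first close the distance."""
    if distance <= weapon_range:
        return {"action": "release_skill", "bearing": bearing}
    return {"action": "move_closer", "bearing": bearing}
```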
Optionally, in one implementable manner, transmitting a virtual probe into the game scene in response to the detection instruction in step S101 may include: in response to detection instructions input by the player at a preset interval, emitting a virtual probe into the game scene, and outputting the first prompt information when the virtual probe touches the target object.
Alternatively, the player may trigger the emission of the virtual probe by inputting the detection instruction only once, and estimate the orientation information of the target object relative to the virtual object from the first prompt information received.
In addition, when the virtual object is in motion, the player can input detection instructions at the preset interval during the movement, so that the first prompt information is received repeatedly while approaching the target object and the orientation information of the target object relative to the virtual object can be estimated continuously. Repeated detection improves, to a certain extent, the accuracy with which the target object is located.
Optionally, the method of the present application may further include: outputting second prompt information when the virtual probe rebounds off the target object back to the virtual object controlled by the player, where the second prompt information is used to prompt that the virtual probe has rebounded off the target object to the virtual object, and the second prompt information includes: voice information and/or vibration information.
In an embodiment, after the virtual probe touches the target object, it may bounce off the target object back to the virtual object. Optionally, the bounce process of the virtual probe may be tracked, and when the rebounding probe touches the virtual object, second prompt information may be output to the player. The second prompt information may be identical to the first prompt information, or may be different prompt information.
Optionally, when the virtual object is in motion, the second prompt information can help the player determine more accurately the real-time orientation information between the virtual object and the target object during the movement.
In one implementation, the player inputs the detection instruction while the controlled virtual object is stationary. In this case, as follows from the physical calculation, outputting the second prompt information is not required: the player can estimate the distance between the target object and the virtual object from the time the detection instruction was input, the time the first prompt information was received, and the flight speed of the virtual probe.
In yet another implementation, the player inputs the detection instruction while the controlled virtual object is moving. In this case, according to the physical calculation, the player can estimate the distance between the target object and the virtual object from the time the detection instruction was input, the time the first prompt information was received, the flight speed of the virtual probe, the movement speed of the virtual object, and the time the second prompt information was received.
Similarly, by analogy with the case in which only the first prompt information is received, the calculation extends to the case where the virtual probe and the virtual object move at non-constant speeds.
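Under the simplifying assumptions of constant speeds and head-on motion, the timing relations described above can be sketched as follows; the function names and parameters are illustrative, not part of the described method. In the stationary case the distance is the probe's one-way flight time multiplied by its speed. In the moving case the probe rebounds off the target and meets the advancing virtual object, so the distance it covers on the return leg equals the distance still separating the virtual object from the target at the moment the second prompt is output.

```python
def distance_when_stationary(t_input, t_first, probe_speed):
    # The probe flies one way at constant speed; the first prompt
    # marks the moment it touches the target.
    return probe_speed * (t_first - t_input)

def distance_when_moving(t_first, t_second, probe_speed):
    # After the first prompt the probe rebounds and meets the advancing
    # virtual object at t_second; the return leg it covered equals the
    # remaining distance to the target at that moment.  The virtual
    # object's own movement speed can serve as a cross-check.
    return probe_speed * (t_second - t_first)
```

For example, with a probe speed of 10 units per second, a first prompt received 2 seconds after input gives a 20-unit distance for a stationary player.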
FIG. 3 is a flow chart illustrating another method for processing information in a game according to an embodiment of the present disclosure; optionally, in step S101, in response to the detection instruction, transmitting a virtual detector into the game scene, which may include:
S301, in response to the detection instruction, determining the current aiming direction.
It should be noted that the device executing the method, such as a terminal or a server, need not, when calculating and outputting the first prompt information or the second prompt information, render the emission special effect of the virtual probe in the game scene for display on the graphical user interface. The time at which the player triggered the detection instruction, the flight speed of the virtual probe, and the movement speed of the virtual object can be obtained in real time locally or from the server backend, so that the time at which the first prompt information and/or the second prompt information should be output can be calculated, and the prompt information can then be output at the determined time.
Optionally, in response to the detection instruction, the current aiming direction, i.e., the current orientation or visual field direction of the virtual object controlled by the player, may first be determined, so that the emission special effect of the virtual probe can be rendered and displayed along the current aiming direction.
S302, emitting the virtual probe along the current aiming direction at a preset time interval.
Optionally, based on the determined orientation of the virtual object, the emission of the virtual probe may be displayed along that orientation at the preset time interval.
The virtual probe is displayed frame by frame as it flies toward the target object. The preset time interval can be determined according to the preset flight speed of the virtual probe, and the display position of each frame, as the probe flies from its starting position toward the target object, can be determined from the starting position, the preset time interval, and the flight speed. The virtual probe is then displayed at the corresponding position on the graphical user interface in each frame, so that a sighted player can visually follow the flight of the probe and see its state at the moment the prompt information is received.
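The per-frame display positions described above follow directly from the starting position, the preset time interval, and the flight speed. A minimal 2D sketch, with illustrative names not taken from the application:

```python
import math

def probe_positions(start, direction, probe_speed, frame_dt, n_frames):
    """Display position of the probe for each frame as it flies from
    `start` along `direction` at constant `probe_speed`."""
    norm = math.hypot(direction[0], direction[1])
    unit = (direction[0] / norm, direction[1] / norm)
    return [
        (start[0] + unit[0] * probe_speed * frame_dt * i,
         start[1] + unit[1] * probe_speed * frame_dt * i)
        for i in range(n_frames)
    ]
```

Drawing the probe at the position indexed by the frame count yields the continuous flight a sighted player observes.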
The starting position of the virtual probe may be determined from the geometric center of the virtual object or of the virtual item used by the virtual object; alternatively, the end of the virtual item nearest the target object may be used as the starting position.
It should be noted that when the virtual object controlled by the player is a virtual character in a fighting game, the geometric center of the virtual character, or a position near its end closest to the target object, may be used as the starting position.
When the virtual object controlled by the player is a virtual character in a shooting game, and shooting is performed by controlling that character, the geometric center of the virtual gun, or a position near its end closest to the target object, may be used as the starting position.
The starting position can be adapted to different game scenes and to the virtual object controlled by the player. Whether it is determined from the geometric center of the virtual object or of the virtual item it uses, or from the end of the virtual item closest to the target object, the resulting positional error is small enough to be neglected, and the player can be helped, on the basis of the existing method, to avoid obstacles more accurately.
In one implementation, the virtual probe may include: a particle object; and emitting the virtual probe along the current aiming direction at preset time intervals in step S302 may include: emitting the particle object along the current aiming direction at preset time intervals.
Optionally, when the virtual probe is a particle object, a particle model may be rendered in advance to generate a particle special effect, so that the particle object is displayed at the display position corresponding to each frame as described above. Because particles travel fast, the player can receive the first prompt information or the second prompt information in a very short time.
In another implementation, optionally, the virtual probe may include: a virtual ray; in step S302, emitting the virtual probe at preset time intervals along the current aiming direction, which may include: and emitting the virtual rays at preset time intervals along the current aiming direction.
Optionally, the virtual ray may be a virtual laser. A laser model may be rendered in advance, in the same way as described above, to generate a laser special effect, so that the laser object is displayed at the display position corresponding to each frame.
It should be noted that when the virtual ray is a virtual laser, owing to characteristics of a laser itself, such as long propagation range and high propagation speed, voice broadcast information prompting the player with the orientation information of the target object relative to the virtual object may be output directly as soon as the virtual laser touches the target object.
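When the probe is a virtual ray, deciding whether it touches a target object reduces to a standard ray-versus-bounding-box intersection test. The following slab-method sketch is one common way to implement such a test; the helper name and 2D box representation are illustrative, not taken from the application:

```python
def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does a ray from `origin` along `direction` hit the
    axis-aligned box [box_min, box_max]?  Only forward hits count."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:               # ray parallel to this axis
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
        if t_near > t_far:              # slabs do not overlap
            return False
    return True
```

A hit would trigger the voice broadcast of the target's orientation described above.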
FIG. 4 is a flow chart illustrating an information processing method in another game according to an embodiment of the present disclosure; optionally, outputting the first prompt information when the virtual probe touches the target object in the game scene in step S102 may include:
S401, determining, according to the current aiming direction, the output intensity of the first prompt information as a preset intensity corresponding to the current aiming direction, where the output intensity includes: volume or frequency.
It should be noted that in the method of the present application, when the first prompt information or the second prompt information is output, the output intensity of the prompt information may be determined according to the emission direction of the virtual probe; as described above, the emission direction of the virtual probe may be determined by the current aiming direction.
Optionally, for each emission direction of the virtual probe, the intensity of the prompt information emitted when the probe touches a target object in that direction may be preset, so that the output intensity corresponding to the current aiming direction at the time of emission can be determined from the current aiming direction. This helps the player distinguish directions more clearly and estimate the orientation information of the target object relative to the virtual object in each direction.
For voice information, the detection results in different directions can be distinguished by matching a different volume to each direction.
For vibration information, the detection results in different directions can be distinguished by matching a different vibration intensity to each direction.
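The direction-to-intensity matching in S401 amounts to a preset lookup keyed by the aiming direction. A hedged sketch, with purely illustrative sector boundaries and intensity values (assumptions of this sketch, not taken from the application):

```python
# Illustrative preset intensities, as a fraction of full volume or
# full vibration strength; the sectors and values are assumptions.
INTENSITY_BY_SECTOR = {"front": 1.0, "side": 0.6, "rear": 0.3}

def prompt_intensity(aim_angle_deg):
    """Map an aiming direction (0 = straight ahead) to a preset
    output intensity for the first prompt information."""
    a = abs(aim_angle_deg) % 360
    if a > 180:
        a = 360 - a                     # fold onto [0, 180]
    if a <= 45:
        return INTENSITY_BY_SECTOR["front"]
    if a <= 135:
        return INTENSITY_BY_SECTOR["side"]
    return INTENSITY_BY_SECTOR["rear"]
```

The same table could hold vibration frequencies instead of volumes; the point is that each direction resolves to a distinct, preset intensity.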
S402, outputting the first prompt information according to the output intensity.
Alternatively, the first prompt information may be output at the determined output intensity; the second prompt information in the same direction may have the same output intensity as the first prompt information, or a different one.
FIG. 5 is a flow chart illustrating an information processing method in another game according to an embodiment of the present disclosure; optionally, the method of the present application may further include:
S501, in response to a detection instruction, emitting a virtual probe toward a selectable object in the game scene, where the selectable object is an object in a direction other than the player's current aiming direction.
In contrast to the implementation above, in this embodiment the virtual probe may, in response to the detection instruction input by the player, be emitted in a direction other than the current aiming direction, rather than along the current aiming direction as described above.
In this way, the player can detect a target object in another direction by emitting the virtual probe in that direction from the virtual object's current position, without controlling the virtual object to turn.
For a player with visual impairment, controlling the virtual object to change direction takes a certain amount of time; this method reduces the player's operational complexity, saves operation time, and still achieves the purpose of detecting the target object.
S502, when the virtual detection object touches the selectable object, outputting third prompt information, wherein the third prompt information is used for prompting the virtual detection object to touch the selectable object, and the third prompt information comprises: voice information, and/or vibration information.
Similar to the method above, when the virtual probe touches the selectable object, third prompt information is output; the third prompt information carries a meaning similar to that of the first prompt information. The player can thus estimate, from the third prompt information, the orientation of the selectable object relative to the virtual object in a direction other than the current aiming direction.
Optionally, on the basis of the steps shown in fig. 5, the method of the present application may further include: when the virtual probe rebounds off the selectable object back to the virtual object, outputting fourth prompt information, where the fourth prompt information is used to prompt that the virtual probe has rebounded off the selectable object to the virtual object, and the fourth prompt information includes: voice information and/or vibration information.
This step is implemented similarly to emitting the virtual probe along the orientation of the virtual object, and the fourth prompt information carries a meaning similar to that of the second prompt information. The player can thus estimate, from the third prompt information and/or the fourth prompt information, the orientation of the selectable object relative to the virtual object in a direction other than the current aiming direction.
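Emitting probes toward selectable objects in directions other than the current aim can be sketched as a sweep over candidate directions. Here `emit` stands in for whatever per-direction collision test the game uses, and the 45-degree step is an assumption of this sketch:

```python
def probe_other_directions(emit, current_aim_deg, step_deg=45):
    """Emit one probe per candidate direction, skipping the current
    aiming direction (already covered by the first prompt).  `emit`
    returns True when the probe touched something; the third prompt
    would then be output for that direction."""
    hits = []
    for angle in range(0, 360, step_deg):
        if angle == current_aim_deg % 360:
            continue
        if emit(angle):
            hits.append(angle)
    return hits
```

Each returned angle corresponds to a direction in which third (and possibly fourth) prompt information would be output, letting the player map surrounding objects without turning.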
To sum up, an information processing method in a game provided by an embodiment of the present application includes: in response to a detection instruction, emitting a virtual probe into a game scene; and, when the virtual probe touches a target object in the game scene, outputting first prompt information used to prompt that the virtual probe has touched the target object, the first prompt information including voice information and/or vibration information. With this method, a player with visual impairment can, by triggering the detection instruction before attacking, judge the orientation of a target object in the game scene relative to the virtual object under the player's control through the emitted virtual probe. This helps the player recognize the game scene in advance, avoid obstacles effectively, and locate attack targets accurately. Compared with the prior art, in which a prompt is issued only when the virtual object controlled by the player actually touches an obstacle, detecting target objects in advance by emitting a probe keeps the player from being hindered by obstacles during intense combat, enables accurate targeting, and improves the game experience of players with visual impairment.
The following describes an apparatus, a device, a storage medium, and the like for executing the information processing method in a game provided by the present application; for their specific implementation and technical effects, refer to the above, which will not be repeated below.
Fig. 6 is a schematic diagram of an information processing apparatus in a game according to an embodiment of the present application; the functions implemented by the apparatus correspond to the steps executed by the foregoing method. The apparatus may be understood as the above terminal or server, or a processor thereof, or as a component, independent of the above server or processor, that implements the functions of the present application under the control of the server; a game scene of the target game is displayed at least partially on a graphical user interface of the terminal. As shown in fig. 6, the apparatus may include: a transmitting module 610 and an output module 620.
the transmitting module 610 transmits a virtual probe to a game scene in response to a probe instruction;
the output module 620 outputs first prompt information when the virtual detector touches the target object in the game scene, where the first prompt information is used to prompt the virtual detector to touch the target object, and the first prompt information includes: voice information, and/or vibration information.
Optionally, the transmitting module 610 is specifically configured to transmit the virtual probe to the game scene according to the current aiming direction.
Optionally, the virtual probe does not affect the game attributes of the target object.
Optionally, the apparatus further comprises: an attack module;
and the attack module is used for responding to the attack instruction and releasing the attack skill to the game scene.
Optionally, the output module 620 is further configured to output second prompt information when the virtual probe bounces off the target object back to the virtual object controlled by the player, where the second prompt information is used to prompt that the virtual probe has rebounded off the target object to the virtual object, and the second prompt information includes: voice information and/or vibration information.
Optionally, the transmitting module 610 is specifically configured to determine a current aiming direction in response to the detection instruction; and emitting the virtual detection object along the current aiming direction at preset time intervals.
Optionally, the virtual probe comprises: a particle object; and the transmitting module 610 is specifically configured to emit the particle object along the current aiming direction at a preset time interval.
Optionally, the virtual probe comprises: a virtual ray; and the transmitting module 610 is specifically configured to emit the virtual ray along the current aiming direction at a preset time interval.
Optionally, the output module 620 is further configured to determine, according to the current aiming direction, that the output intensity of the first prompt information is a preset intensity corresponding to the current aiming direction, where the output intensity includes: volume or frequency; and outputting the first prompt message according to the output intensity.
Optionally, the transmitting module 610 is further configured to transmit a virtual probe to a selectable object in the game scene in response to the probe instruction, where the selectable object is an object in a direction other than the current aiming direction;
the output module 620 is further configured to output third prompt information when the virtual detector touches a selectable object in the game scene, where the third prompt information is used to prompt the virtual detector to touch the selectable object, and the third prompt information includes: voice information, and/or vibration information.
Optionally, the output module 620 is further configured to output fourth prompt information when the virtual probe bounces off the selectable object back to the virtual object, where the fourth prompt information is used to prompt that the virtual probe has rebounded off the selectable object to the virtual object, and the fourth prompt information includes: voice information and/or vibration information.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU), or another processor capable of calling program code. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
The modules may be connected or in communication with each other via a wired or wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may comprise a connection over a LAN, WAN, bluetooth, ZigBee, NFC, or the like, or any combination thereof. Two or more modules may be combined into a single module, and any one module may be divided into two or more units. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to corresponding processes in the method embodiments, and are not described in detail in this application.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application, where the electronic device may be a computing device with a data processing function.
The apparatus may include: a processor 801 and a memory 802.
The memory 802 is used for storing programs, and the processor 801 calls the programs stored in the memory 802 to execute the above-mentioned method embodiments. The specific implementation and technical effects are similar, and are not described herein again.
In which the memory 802 stores program code that, when executed by the processor 801, causes the processor 801 to perform the various steps in the information processing method in a game according to various exemplary embodiments of the present application described in the "exemplary methods" section above in this specification.
The Processor 801 may be a general-purpose Processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware components, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present Application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
The memory 802, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The memory may include at least one type of storage medium, for example a flash memory, a hard disk, a multimedia card, a card-type memory, a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a magnetic memory, a magnetic disk, an optical disk, and so on. The memory may also be, without being limited to, any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 802 in the embodiments of the present application may likewise be a circuit or any other device capable of performing a storage function, for storing program instructions and/or data.
Optionally, the present application also provides a program product, such as a computer readable storage medium, comprising a program which, when being executed by a processor, is adapted to carry out the above-mentioned method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to perform some steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.

Claims (14)

1. An in-game information processing method for displaying, at least in part, a game scene of a target game on a graphical user interface of a terminal, the method comprising:
responding to a detection instruction, and transmitting a virtual detection object into a game scene;
when the virtual detection object touches a target object in the game scene, outputting first prompt information, wherein the first prompt information is used for prompting the virtual detection object to touch the target object, and the first prompt information comprises: voice information, and/or vibration information.
2. The method of claim 1, wherein said launching a virtual probe into a game scene comprises:
and transmitting the virtual probe into the game scene according to the current aiming direction.
3. The method of claim 1, wherein the virtual probe does not affect a game attribute of the target object.
4. The method according to any one of claims 1-3, further comprising:
and responding to an attack instruction, and releasing the attack skill into the game scene.
5. The method according to any one of claims 1-3, further comprising:
outputting second prompt information when the virtual probe bounces to a virtual object controlled by a player through the target object, wherein the second prompt information is used for prompting the virtual probe to bounce to the virtual object through the target object, and the second prompt information comprises: voice information, and/or vibration information.
6. The method of claim 2, wherein said launching the virtual probe into the game scene according to the current aiming direction comprises:
determining the current aiming direction in response to a detection instruction;
and emitting the virtual probe object along the current aiming direction according to a preset time interval.
7. The method of claim 6, wherein the virtual probe comprises: a particle object; the emitting the virtual probe along the current aiming direction at preset time intervals comprises:
emitting the particle object along the current aiming direction at preset time intervals.
8. The method of claim 6, wherein the virtual probe comprises: a virtual ray; the emitting the virtual probe along the current aiming direction at preset time intervals comprises:
and emitting the virtual ray along the current aiming direction according to a preset time interval.
9. The method of claim 2, wherein outputting the first prompt information when the virtual detection object touches a target object in the game scene comprises:
according to the current aiming direction, determining the output intensity of the first prompt message as a preset intensity corresponding to the current aiming direction, wherein the output intensity comprises: volume or frequency;
and outputting the first prompt message according to the output intensity.
10. The method of claim 2, further comprising:
in response to a detection instruction, emitting a virtual probe to a selectable object in the game scene, the selectable object being an object in a direction other than the current aiming direction;
when the virtual detection object touches a selectable object in the game scene, outputting third prompt information, wherein the third prompt information is used for prompting the virtual detection object to touch the selectable object, and the third prompt information comprises: voice information, and/or vibration information.
11. The method of claim 10, further comprising:
outputting fourth prompt information when the virtual probe bounces to the virtual object via the selectable object, where the fourth prompt information is used to prompt the virtual probe to bounce to the virtual object via the selectable object, and the fourth prompt information includes: voice information, and/or vibration information.
12. An in-game information processing apparatus, wherein a game scene of a target game is at least partially displayed on a graphical user interface of a terminal, the apparatus comprising: an emitting module and an output module;
the emitting module is configured to emit a virtual probe into the game scene in response to a detection instruction;
and the output module is configured to output first prompt information when the virtual probe touches a target object in the game scene, wherein the first prompt information is used to prompt that the virtual probe has touched the target object, and the first prompt information comprises: voice information and/or vibration information.
13. An electronic device, comprising: a processor, a storage medium, and a bus, wherein the storage medium stores program instructions executable by the processor, the processor communicates with the storage medium via the bus when the electronic device is running, and the processor executes the program instructions to perform the steps of the in-game information processing method according to any one of claims 1 to 11.
14. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, performs the steps of the in-game information processing method according to any one of claims 1 to 11.
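The core claimed loop can be sketched as follows: periodically cast a virtual ray from the player along the current aiming direction, and when it touches a target object, emit a first prompt whose output intensity (here, volume) is derived from the aiming direction. This is a minimal illustrative model; the function names, the ray-marching step, and the direction-to-volume mapping are assumptions, not the patented implementation.

```python
import math

def cast_ray(origin, direction, objects, max_dist=100.0, step=0.5):
    """March a virtual ray from origin along direction; return the first object hit."""
    t = step
    while t <= max_dist:
        point = (origin[0] + direction[0] * t, origin[1] + direction[1] * t)
        for obj in objects:
            ox, oy, radius = obj["pos"][0], obj["pos"][1], obj["radius"]
            if math.hypot(point[0] - ox, point[1] - oy) <= radius:
                return obj
        t += step
    return None

def intensity_for_direction(direction):
    """Map the aiming direction to a prompt volume: louder when aiming straight ahead."""
    angle = math.atan2(direction[0], direction[1])  # deviation from the forward axis
    return max(0.2, 1.0 - abs(angle) / math.pi)

def probe_tick(player_pos, aim_dir, scene_objects):
    """One probe emission: cast the ray and build a first-prompt payload on a hit."""
    hit = cast_ray(player_pos, aim_dir, scene_objects)
    if hit is None:
        return None
    return {
        "type": "first_prompt",
        "target": hit["name"],
        "volume": intensity_for_direction(aim_dir),
        "vibrate": True,  # voice and/or vibration, per the claims
    }
```

In use, `probe_tick` would be called on a timer at the claimed preset interval; aiming directly at an object yields a full-volume prompt, while aiming away from everything yields no prompt.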
CN202110025526.5A 2021-01-08 2021-01-08 Information processing method and device in game, electronic equipment and storage medium Pending CN112717384A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110025526.5A CN112717384A (en) 2021-01-08 2021-01-08 Information processing method and device in game, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112717384A true CN112717384A (en) 2021-04-30

Family

ID=75589828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110025526.5A Pending CN112717384A (en) 2021-01-08 2021-01-08 Information processing method and device in game, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112717384A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113521738A (en) * 2021-08-11 2021-10-22 网易(杭州)网络有限公司 Special effect generation method and device, computer readable storage medium and electronic equipment
WO2023029626A1 (en) * 2021-08-30 2023-03-09 网易(杭州)网络有限公司 Avatar interaction method and apparatus, and storage medium and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060274911A1 (en) * 2002-07-27 2006-12-07 Xiadong Mao Tracking device with sound emitter for use in obtaining information for controlling game program execution
CN104546389A (en) * 2015-01-13 2015-04-29 江苏怡龙医疗科技有限公司 Intelligent walking stick and intelligent walking stick system
CN105787442A (en) * 2016-02-19 2016-07-20 电子科技大学 Visual interaction based wearable auxiliary system for people with visual impairment, and application method thereof
CN108339272A (en) * 2018-02-12 2018-07-31 网易(杭州)网络有限公司 Virtual shooting main body control method and device, electronic equipment, storage medium
CN111870947A (en) * 2020-08-10 2020-11-03 网易(杭州)网络有限公司 Game interaction method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Anonymous: "The Most Powerful Accessibility Features Ever?! Helping You Beat The Last of Us Part II!", 《HTTPS://WWW.BILIBILI.COM/VIDEO/BV18V411K7NJ?FROM=SEARCH&SEID=5440220946363883060&SPM_ID_FROM=333.337.0.0》 *
Ma Bohua: "Making Life Better: Medicine Where Disciplines Converge", 30 April 2000, Fujian Education Press *

Similar Documents

Publication Publication Date Title
CN110548288B (en) Virtual object hit prompting method and device, terminal and storage medium
JP6104515B2 (en) Program, information storage medium, electronic device and server system
EP4000704A1 (en) Virtual object control method, device, terminal, and storage medium
CN111124226A (en) Game screen display control method and device, electronic equipment and storage medium
CN111084986B (en) Display control method, display control device, storage medium, and electronic device
US9833695B2 (en) System and method for presenting a virtual counterpart of an action figure based on action figure state information
CN110465087B (en) Virtual article control method, device, terminal and storage medium
US9259651B1 (en) System and method for providing relevant notifications via an action figure
CN109821238B (en) Method and device for aiming in game, storage medium and electronic device
WO2017190680A1 (en) Device control system, method, apparatus and control device
JP6450875B1 (en) GAME PROGRAM, GAME METHOD, AND INFORMATION PROCESSING DEVICE
CN112717384A (en) Information processing method and device in game, electronic equipment and storage medium
JP5642352B2 (en) Game device, game program
CN113082709A (en) Information prompting method and device in game, storage medium and computer equipment
CN111773656A (en) Virtual object selection control method and device, handheld terminal and storage medium
CN112221135B (en) Picture display method, device, equipment and storage medium
JP2015019885A (en) Game program and game system
KR101834986B1 (en) Game system and method supporting disappearance processing
JP6383822B2 (en) Game device, game program
US20160236080A1 (en) System and method for providing state information of an action figure
JP6093884B2 (en) Game device, game program
JP6167191B2 (en) Game device, game program
WO2016130378A1 (en) System and method for presenting a virtual counterpart of an action figure based on state information
JP2020110453A (en) Game program, method, and information processing device
US11400377B2 (en) Techniques for video game input compensation and related systems and methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination