CN107992252B - Information prompting method and device, electronic equipment and storage medium

Information prompting method and device, electronic equipment and storage medium

Info

Publication number
CN107992252B
CN107992252B, CN201711210947.5A, CN201711210947A
Authority
CN
China
Prior art keywords
target object
identifier
distance
information prompting
attribute
Prior art date
Legal status
Active
Application number
CN201711210947.5A
Other languages
Chinese (zh)
Other versions
CN107992252A (en)
Inventor
陈振华
金明
罗芮
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201711210947.5A
Publication of CN107992252A
Application granted
Publication of CN107992252B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/5372: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, using indicators, for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides an information prompting method, an information prompting device, an electronic device and a computer-readable storage medium, and relates to the technical field of human-computer interaction. The method comprises the following steps: acquiring an attribute parameter of at least one target object within a preset monitoring range, and judging whether the attribute parameter meets a trigger condition; and when the attribute parameter meets the trigger condition, providing a visual prompt control corresponding to the target object. The present disclosure can improve screen utilization and information transmission efficiency.

Description

Information prompting method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of human-computer interaction, and in particular, to an information prompting method and an information prompting apparatus, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of mobile communication technology, a large number of game applications have emerged. In various team collaborative gaming applications, a game scene typically includes a plurality of virtual objects and enemy objects.
In the related art, for MOBA (Multiplayer Online Battle Arena) mobile game applications, specific information about other team members, such as a virtual object, is mainly conveyed through a small map located at the upper-left or upper-right corner of the screen: a user obtains information such as a teammate's blood volume and position by checking the small map and the hero information around it, as shown in fig. 1. For 3D FPS (first-person shooter) games on PC clients, referring to fig. 2, information about other team members, including blood volume, position, skill cooldown (CD) and the like, can be obtained in real time from the game scene itself.
In these approaches, the small map is not located in the visual core area of the game scene, so users tend to overlook the state and information of their teammates. Moreover, because the small map carries a large amount of information, a user needs some time to determine a teammate's position and state when viewing it; when the battle is intense, teammates often have no time to assist one another, resulting in a poor user experience. For non-3D mobile game applications, teammate information cannot be obtained from the scene at all because of the viewing angle.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an information presentation method and an information presentation apparatus, an electronic device, and a computer-readable storage medium, which overcome one or more of the problems due to the limitations and disadvantages of the related art, at least to some extent.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, an information prompting method is provided, which is applied to a terminal capable of presenting an interactive interface of at least a part of a game scene, and the game scene at least includes a virtual object, including:
acquiring an attribute parameter of at least one target object in a preset monitoring range, and judging whether the attribute parameter meets a trigger condition;
and when the attribute parameters meet the trigger conditions, providing a visual prompt control corresponding to the target object.
In an exemplary embodiment of the present disclosure, the attribute parameter satisfying the trigger condition includes:
one or more of the following: the blood volume value of the target object is less than a first threshold, the damage value acting on the target object is greater than a second threshold, and the number of enemy objects within a preset range is greater than a third threshold.
In an exemplary embodiment of the present disclosure, providing a visual cue control corresponding to the target object includes:
and acquiring the position of the target object and providing a position identifier corresponding to the position of the target object.
In an exemplary embodiment of the present disclosure, providing the visual cue control corresponding to the target object further comprises:
acquiring the current positions of the target object and the virtual object, and calculating the distance between the target object and the virtual object;
and providing a distance identifier according to the distance between the target object and the virtual object, wherein a preset mapping relation exists between the display parameter of the distance identifier and the distance.
In an exemplary embodiment of the present disclosure, the display parameter includes one or more of transparency, color, size, and brightness of the distance indicator.
In an exemplary embodiment of the present disclosure, providing the visual cue control corresponding to the target object further comprises:
and providing an attribute identifier according to the attribute parameters of the target object.
In an exemplary embodiment of the present disclosure, the attribute identifier includes a blood volume bar and/or an enemy object number identifier, wherein the blood volume bar is used for displaying the current blood volume of the target object, and the enemy object number identifier is used for displaying the number of enemy objects within a preset range of the target object.
In an exemplary embodiment of the present disclosure, providing a visual cue control corresponding to the target object includes:
and acquiring the orientation of the target object, and adjusting the display position of the visual prompt control on the interactive interface according to the orientation of the target object.
In an exemplary embodiment of the present disclosure, the method further comprises:
when the target object is detected to be located in the interactive interface presented by the virtual lens, hiding the visual cue control corresponding to the target object.
According to an aspect of the present disclosure, an information prompting device is provided, which is applied to a terminal capable of presenting an interactive interface of at least a part of a game scene, and the game scene at least includes a virtual object, including:
the parameter judgment module is used for acquiring the attribute parameters of at least one target object in a preset monitoring range and judging whether the attribute parameters meet the triggering conditions;
and the control providing module is used for providing a visual prompt control corresponding to the target object when the attribute parameters meet the trigger conditions.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information prompting method described in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any of the information prompting methods described above via execution of the executable instructions.
In the information prompting method, the information prompting apparatus, the electronic device and the computer-readable storage medium provided by exemplary embodiments of the present disclosure, an attribute parameter of at least one target object within a preset monitoring range is obtained, and whether the attribute parameter satisfies a trigger condition is determined; when the attribute parameter satisfies the trigger condition, a visual prompt control corresponding to the target object is provided. On the one hand, by judging whether the attribute parameter of the target object satisfies the trigger condition, the current state of the target object can be determined accurately, so that effective cooperation among a plurality of objects can be achieved. On another hand, by providing the visual prompt control, the position information of the target object can be conveyed in time, avoiding the transmission delay caused by requiring the user to watch the visual prompt controls on the virtual map in real time during the game, improving the interaction effect and further improving the experience of team cooperation in the game. On yet another hand, the visual prompt control is displayed only when the attribute parameter of the target object satisfies the trigger condition, which reduces the amount of information presented on the interactive interface and improves screen utilization and information transmission efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 is a diagram illustrating an interaction interface of a mobile game application A in the related art;
FIG. 2 is a diagram illustrating an interaction interface of a game application B in the related art;
FIG. 3 is a schematic diagram illustrating an information prompting method in an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating an interactive interface for providing location information in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a blood volume bar in an exemplary embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of an information prompting apparatus in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a block diagram of an electronic device in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a program product in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The exemplary embodiment first discloses an information prompting method, which can be applied to a touch terminal capable of presenting an interactive interface of at least part of a game scene, wherein the game scene at least comprises a virtual object, and the interactive interface can be obtained by executing a software application on a processor of the terminal and rendering the application on a display of the terminal. The terminal can be various electronic devices with touch screens, such as a mobile phone, a tablet computer, a notebook computer, a game machine, a PDA and the like, and can also be electronic devices such as a desktop computer and the like. It should be noted that, in some non-touch devices, it is also possible to simulate a keyboard operation and a mouse operation as a touch operation by means of a simulator or the like, and this manner can also be regarded as a touch terminal described in this disclosure. The information prompting method in this example can be applied to various mobile game applications or client game applications, and the mobile game applications are taken as an example for description here. Referring to fig. 3, the information prompting method may include the following steps:
s310, acquiring an attribute parameter of at least one target object in a preset monitoring range, and judging whether the attribute parameter meets a trigger condition;
and S320, providing a visual prompt control corresponding to the target object when the attribute parameters meet the trigger conditions.
According to the information prompting method in this embodiment, on the one hand, by judging whether the attribute parameter of the target object satisfies the trigger condition, the current state of the target object can be determined accurately, so that effective cooperation among a plurality of objects can be achieved. On another hand, by providing the visual prompt control, the position information of the target object can be conveyed in time, avoiding the transmission delay caused by requiring the user to watch the visual prompt controls on the virtual map in real time during the game, improving the interaction effect and further improving the experience of team cooperation in the game. On yet another hand, the visual prompt control is displayed only when the attribute parameter of the target object satisfies the trigger condition, which reduces the amount of information presented on the interactive interface and improves screen utilization and information transmission efficiency.
Next, each step in the information presentation method is further explained with reference to fig. 3 to 7.
In step S310, an attribute parameter of at least one target object within a preset monitoring range is obtained, and it is determined whether the attribute parameter meets a trigger condition.
In the present exemplary embodiment, an identifier corresponding to one or more target objects may be presented on the interactive interface. The target objects may be friendly objects that are not in the same interactive interface as the virtual object, which can be understood as the distance between the virtual object and the target object being greater than a certain value. The identifiers of the target objects may be the same or different and may be located at any position of the interactive interface. In addition, a virtual joystick control and various function controls may be presented on the interactive interface, and a virtual map may also be displayed.
In the present exemplary embodiment, the preset monitoring range may be determined centering on the virtual object corresponding to the user. Specifically, the whole game process may be divided into a plurality of stages, and the size of the preset monitoring range may be adjusted in real time according to preset parameters of each stage. For example, the whole game process may be segmented according to the game level, according to the position of the target object, or according to the task.
The preset monitoring range corresponding to each stage is a part of the whole game scene; the preset monitoring ranges for different stages may be of the same or different sizes, and the specific size may be set according to actual requirements. In this example, dynamically adjusting the size of the preset monitoring range in real time ensures that, when the virtual object or the target object is in a dangerous state, other virtual objects within the preset monitoring range have enough time to come to the rescue, avoids an excessive number of prompt messages caused by an overly large monitoring range, and thus balances the size of the monitoring range against the speed of rescue.
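As a minimal illustration of how the monitoring range could be adjusted per stage (a sketch only, not part of the patent; the stage names, radius values and function name are hypothetical):

```python
# Illustrative sketch: choosing the preset monitoring range per game stage.
# Stage names and radius values are assumptions, not values from the patent.

STAGE_MONITOR_RADIUS = {
    "early_game": 30.0,    # smaller range while players are spread out
    "mid_game": 50.0,
    "final_stage": 80.0,   # larger range when timely rescue matters most
}

def preset_monitoring_radius(stage: str, default: float = 40.0) -> float:
    """Return the monitoring radius (in scene units) for the current game stage."""
    return STAGE_MONITOR_RADIUS.get(stage, default)

print(preset_monitoring_radius("mid_game"))  # 50.0
```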
The system can monitor the attribute parameters of the target object in a certain preset monitoring range in real time through the sensor. The attribute parameters of the target object may include, for example, a current blood volume value of the target object, a current injury value, a number of surrounding enemy objects, and the like. Further, whether the target object is in an abnormal state or not can be judged by judging whether the acquired attribute parameters of the target object meet the triggering condition or not. The abnormal state may be understood as a dangerous state, that is, whether the target object is in a dangerous state may be determined by the acquired current blood volume value of the target object and the number of surrounding enemy objects or other parameters.
In step S320, when the attribute parameter satisfies the trigger condition, a visual cue control corresponding to the target object is provided.
In the present exemplary embodiment, when the target object is determined, through the attribute parameter, to be in a dangerous state, it may be determined whether a virtual object is located within the corresponding preset monitoring range. If a virtual object or another friendly object exists within the preset monitoring range, a visual cue control corresponding to the target object in the dangerous state may be provided at a preset position around the virtual object or the friendly object. The visual cue control can be used to convey information about the target object, such as its blood volume and position, so that the user can decide, based on the visual cue control, whether to perform an assistance operation for the target object. The visual cue control may be, for example, numeric, textual, iconic, or another form of prompt information.
On this basis, in order to accurately determine whether the target object is in a dangerous state, the attribute parameter satisfying the trigger condition in this example embodiment may include:
one or more of the following: the blood volume value of the target object is less than a first threshold, the damage value acting on the target object is greater than a second threshold, and the number of enemy objects within a preset range is greater than a third threshold.
In the present exemplary embodiment, the first threshold, the second threshold, and the third threshold may be set according to actual requirements. The blood volume value may be represented directly as a number: for example, the total blood volume value may be set to 500 and the first threshold to 200. The blood volume value may also be expressed as a percentage: for example, the total blood volume value may be set to 100% and the first threshold to 30%. When the blood volume value of the target object is less than the set first threshold, the target object may be considered to be in the abnormal state.
In addition to this, when the damage value acting on the target object is greater than the second threshold value, the target object may also be considered to be in a dangerous state. The damage value of the target object can be a total damage value received from the beginning of the game or an instant damage value received from a certain attack. The instant damage value will be described as an example. It should be added that different weapons, different attack patterns, and different attack distances may correspond to different damage values. When a certain target object receives an attack, a prompt identifier of an instantaneous injury value generated by the attack can be correspondingly displayed around the target object on an interactive interface. For example, the harm value of an enemy object to a target object may be represented by a percentage, and the second threshold may be set to 60%. When the instantaneous damage value applied to the target object is greater than 60%, the target object may be considered to be in a dangerous state.
Further, the target object may be considered to be in a dangerous state when the number of enemy objects within the preset range of the target object is greater than a third threshold. The preset range between the enemy object and the target object can be set to be a relatively short distance, such as 5 meters, according to actual requirements; the third threshold may be 3 or 5. The positions of the target object and all the enemy objects can be obtained according to the position sensor, and the number of the enemy objects in the preset range of the target object is determined according to the size relation between the distance from the target object to all the enemy objects and the preset range so as to judge whether the target object is in a dangerous state.
It should be noted that the target object may be considered to be in a dangerous state when any one of the following holds: the blood volume value of the target object is smaller than the first threshold, the damage value acting on the target object is larger than the second threshold, or the number of enemy objects within the preset range of the target object is larger than the third threshold. Of course, it is also possible to determine whether the target object is in a dangerous state by combining two or all three of the blood volume value, damage value and enemy object number conditions. With the method in this example, whether the target object is in a dangerous state can be determined quickly and accurately, thereby reducing unnecessary casualties.
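To make the three trigger conditions concrete, the following Python sketch (not part of the patent; the data structure, default threshold values and function names are illustrative assumptions) treats the target as being in a dangerous state as soon as any one condition holds:

```python
import math
from dataclasses import dataclass
from typing import Iterable

@dataclass
class ObjectState:
    x: float
    y: float
    blood_volume: float = 0.0         # current blood volume value
    max_blood_volume: float = 1.0
    last_instant_damage: float = 0.0  # instantaneous damage fraction from the latest attack

def distance(a: ObjectState, b: ObjectState) -> float:
    return math.hypot(a.x - b.x, a.y - b.y)

def enemies_in_range(target: ObjectState, enemies: Iterable[ObjectState],
                     preset_range: float = 5.0) -> int:
    """Count enemy objects whose distance to the target is within the preset range."""
    return sum(1 for e in enemies if distance(target, e) <= preset_range)

def satisfies_trigger(target: ObjectState, enemies: Iterable[ObjectState],
                      first_threshold: float = 0.30,   # 30% of the maximum blood volume
                      second_threshold: float = 0.60,  # 60% instantaneous damage
                      third_threshold: int = 3) -> bool:
    """The target is treated as being in a dangerous state if any one condition holds."""
    low_blood = target.blood_volume < first_threshold * target.max_blood_volume
    heavy_damage = target.last_instant_damage > second_threshold
    surrounded = enemies_in_range(target, enemies) > third_threshold
    return low_blood or heavy_damage or surrounded

teammate = ObjectState(x=0, y=0, blood_volume=120, max_blood_volume=500)
foes = [ObjectState(x=1, y=1), ObjectState(x=2, y=0), ObjectState(x=10, y=10)]
print(satisfies_trigger(teammate, foes))  # True: blood volume 120 < 30% of 500
```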
Specifically, in order to display the state of the target object more clearly, providing the visual cue control corresponding to the target object may further include:
and providing an attribute identifier according to the attribute parameters of the target object.
In this example, the attribute identifier may include, for example, a blood volume identifier used to represent the current blood volume value of the target object; the blood volume identifier may be, for example, a blood volume bar. The attribute identifier may further include an identifier representing the number of enemy objects within a preset range around the target object, such as an enemy number identifier. For example, a blood volume bar may be provided at a preset position around the target object identifier, with the current value of the blood volume bar representing the blood volume value of the target object. The preset position may be, for example, any position above, below or around the target object identifier, and the shape of the blood volume bar may be rectangular or annular. Referring to fig. 5, an annular blood volume bar 406 may be provided around the avatar of the target object, and the blood volume bar may be divided into a completed part and an uncompleted part: the whole blood volume bar represents the total blood volume value, the completed part represents the consumed blood volume, and the uncompleted part represents the current remaining blood volume. When the proportion of the uncompleted part to the entire blood volume bar is less than the first threshold, the blood volume value of the target object may be considered to be less than the first threshold. The first threshold may be set according to actual needs, for example 30% of the maximum blood volume value, or 100.
On this basis, in order to display the blood volume status of the target object more intuitively, the completed part and the uncompleted part of the blood volume bar may be displayed distinctively: for example, the current remaining blood volume may be shown in black and the consumed blood volume in white, or the completed or uncompleted part may be emphasized by thickening. In addition, the total blood volume value and the remaining blood volume value may be displayed above the blood volume bar, such as 200/500; alternatively, only the specific value of the remaining blood volume may be displayed, for example 50%. When the current value of the blood volume bar is less than the set first threshold of 30%, the target object may be considered to be in a dangerous state.
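A small sketch of how the completed and uncompleted portions of the blood volume bar and the first-threshold check could be computed (illustrative only; the function names and the 30% default are assumptions consistent with the example above):

```python
def blood_bar_fractions(current: float, total: float):
    """Split the blood volume bar into its uncompleted (remaining) and completed (consumed) parts."""
    remaining = max(0.0, min(1.0, current / total))
    return remaining, 1.0 - remaining

def below_first_threshold(current: float, total: float, first_threshold: float = 0.30) -> bool:
    remaining, _ = blood_bar_fractions(current, total)
    return remaining < first_threshold

# Example: 200 remaining out of 500 -> 40% remaining, not yet below a 30% threshold.
print(blood_bar_fractions(200, 500))    # (0.4, 0.6)
print(below_first_threshold(200, 500))  # False
```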
In addition, an enemy object number identifier may be provided at a preset position of the target object identifier, for displaying the number of enemy objects within the preset range of the target object. For example, as shown in fig. 4, an enemy object number identifier 407 may be displayed around the target object identifier 403, for example at its upper-right or lower-right corner, and the number of enemy objects may be included in the identifier, so that the number of enemy objects around the target object is displayed more intuitively and clearly.
It should be added that, the method in this example may not only determine whether the target object is in a dangerous state, but also determine whether the virtual object corresponding to the user is in a dangerous state, and the specific determination steps are the same as those described above, and are not described here again.
Further, providing a visual cue control corresponding to the target object may further comprise:
acquiring the current positions of the target object and the virtual object, and calculating the distance between the target object and the virtual object;
and providing a distance identifier according to the distance between the target object and the virtual object, wherein a preset mapping relation exists between the display parameter of the distance identifier and the distance.
In the present exemplary embodiment, the positions of the target object and all virtual objects may be acquired from the position sensor. Then, whether the virtual object or another friendly object exists within the preset monitoring range can be determined by comparing the position coordinates of the virtual object with the coordinates of the preset monitoring range. If a virtual object or a friendly object is detected within the preset monitoring range, the current positions of the target object and the virtual object can be obtained, and the distance between them can be calculated.
In this example, the exact distance between the virtual object and the target object in the dangerous state may not be displayed; instead, only the approximate distance between the two is shown. In particular, one or more distance identifiers may be provided at a preset position of each target object identifier on the interactive interface to represent the distance between the target object and the virtual object. The distance identifiers may be, for example, triangular icons, arrows or icons of other shapes, and each distance identifier may represent the same distance, so that the approximate distance between the virtual object and the target object in the dangerous state can be quickly obtained from the number of distance identifiers. For example, referring to fig. 4, each triangular distance identifier represents a distance of 5 m, and there are 3 distance identifiers 405 between the target object 403 in the dangerous state and the virtual object 401, so the distance between the virtual object and the target object in the dangerous state can be roughly determined to be 15 m.
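A sketch of how the number of distance identifiers could be derived from the actual distance, assuming each triangular identifier stands for 5 m as in the FIG. 4 example (the function name and the rounding choice are illustrative):

```python
import math

def distance_identifier_count(distance_m: float, metres_per_identifier: float = 5.0) -> int:
    """Number of triangular distance identifiers to display; each stands for a fixed distance."""
    return max(1, math.ceil(distance_m / metres_per_identifier))

print(distance_identifier_count(15.0))  # 3, matching the FIG. 4 example (3 x 5 m = 15 m)
```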
Further, the display parameters of the distance identifier may be adjusted according to the distance between the target object and the virtual object. The display parameter may be at least one of transparency, size, color, brightness, or other parameters of the distance indicator, and each display parameter may have a preset mapping relationship with the distance, where the preset mapping relationship is, for example, a positive correlation or a negative correlation. In detail:
for example, a display parameter that identifies transparency may be controlled based on the distance between the target object and the virtual object. Wherein the transparency of the distance identifier may be inversely related to the distance of the target object relative to the virtual object, i.e. the further the target object is from the virtual object, the higher the transparency of the distance identifier; the closer the target object is to the virtual object, the lower the transparency of the distance indicator. It should be noted that the range of the transparency of the distance indicator is 0% to 100%, where the distance indicator is completely transparent when the transparency is 100%, and completely opaque when the transparency is 0%.
For another example, the display parameter of the size of the distance indicator may also be controlled based on the distance of the target object relative to the virtual object. Wherein the size of the distance indicator may be inversely related to the distance of the target object relative to the virtual object. That is, the farther the target object is from the virtual object, the smaller the distance flag, and the closer the target object is to the virtual object, the larger the distance flag.
For another example, the brightness display parameter of the distance identifier may also be controlled based on the distance of the target object relative to the virtual object. The brightness of the distance identifier may be inversely related to the distance of the target object from the virtual object, i.e. the farther the target object is from the virtual object, the lower the brightness of the distance identifier, and the closer the target object is to the virtual object, the higher the brightness of the distance identifier.
It should be noted that the color display parameter of the distance identifier may also be controlled based on the distance of the target object relative to the virtual object. For example, the farther the target object is from the virtual object, the lighter the color of the distance identifier, and the closer the target object is to the virtual object, the darker its color. In addition, any two or more of the transparency, size, color and brightness display parameters of the distance identifier may be controlled simultaneously based on the distance of the target object relative to the virtual object.
Referring to fig. 4, within the preset monitoring range, the target objects 404, 402 and 403 are ordered from nearest to farthest from the virtual object 401, and their distance identifiers are correspondingly ordered from largest to smallest: the identifier corresponding to the target object 404, then the identifier corresponding to the target object 402, then the identifier corresponding to the target object 403.
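The mapping from distance to display parameters could be implemented, for example, as a simple normalised interpolation; the sketch below is illustrative only, and the maximum range, scale and brightness factors are assumed values, not part of the patent:

```python
def _clamp01(v: float) -> float:
    return max(0.0, min(1.0, v))

def distance_identifier_style(distance_m: float, max_distance_m: float = 50.0) -> dict:
    """
    Map the target-to-virtual-object distance onto display parameters of the distance
    identifier: farther targets are more transparent, smaller and dimmer; nearer targets
    are more opaque, larger and brighter.
    """
    t = _clamp01(distance_m / max_distance_m)  # 0 = right next to the virtual object, 1 = at max range
    return {
        "transparency": t,            # 0% fully opaque ... 100% fully transparent
        "scale": 1.0 - 0.5 * t,       # shrink to half size at maximum range
        "brightness": 1.0 - 0.7 * t,  # dim with distance
    }

print(distance_identifier_style(10.0))
print(distance_identifier_style(45.0))
```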
By providing the visual cue control, the virtual object can be reminded in time to help a teammate who is in a dangerous state within the preset monitoring range. This avoids the transmission lag that results from requiring all users to watch the visual cue controls on the virtual map in real time during the game to follow their teammates' states, and allows the information of a target object in a dangerous state to be learned in time even when the voice interaction function or the face-to-face interaction function is not enabled, thereby reducing the user's gaming pressure and further enhancing team cooperation.
Specifically, in this example embodiment, providing the visual cue control corresponding to the target object may include:
and acquiring the position of the target object and providing a position identifier corresponding to the position of the target object.
When the virtual object and the target object are in different interactive interfaces, i.e. on different screen displays, an orientation identifier corresponding to (or consistent with) the orientation of the target object relative to the virtual object can be provided according to that orientation, so that the orientation of the target object relative to the virtual object can be determined more accurately and comprehensively. The orientation identifier may be, for example, an arrow, a triangle, a pointer, or another identifier with a pointing function. For example, when the target object 403 in FIG. 4 is located at the 2 o'clock direction of the virtual object 401, the triangular orientation identifier 405 is provided at the 2 o'clock direction of the virtual object 401; when the target object 403 is located at the 6 o'clock direction of the virtual object 401, the system likewise provides an orientation identifier corresponding to the 6 o'clock direction of the virtual object 401. In some games, the game scene is presented from a first-person perspective and the virtual lens follows the user's viewpoint, so the virtual object operated by the user may not be shown in the game scene; in that case the orientation identifier corresponding to the target object is displayed at the corresponding bearing on the interactive interface. The specific behavior can be set by the developer or the user according to the type and requirements of the game.
In addition, the provided orientation identifier may also indicate the moving direction of the target object, so that the virtual object or other friendly objects can quickly identify the moving direction of the target object in the dangerous state. In order to save screen space and improve the reuse rate of controls or identifiers, the orientation identifier may be placed at the same position as the distance identifier, or may even be the same identifier as the distance identifier. Referring to fig. 4, if the orientation identifier is the same as the distance identifier 405 and the orientation identifier 405 corresponding to the target object 404 points to the 3 o'clock direction, the target object 404 can be considered to be at the 3 o'clock direction of the virtual object 401 and moving in the 3 o'clock direction.
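A sketch of how the clock-face orientation (e.g. "2 o'clock", "6 o'clock") of the target object relative to the virtual object could be computed from their scene coordinates; the coordinate convention (+y treated as 12 o'clock) and the function name are assumptions for illustration:

```python
import math

def clock_bearing(virtual_xy, target_xy) -> int:
    """
    Convert the target object's position relative to the virtual object into a 1-12
    clock direction (12 o'clock = +y, i.e. straight ahead; 3 o'clock = +x, to the right).
    """
    dx = target_xy[0] - virtual_xy[0]
    dy = target_xy[1] - virtual_xy[1]
    angle = math.degrees(math.atan2(dx, dy)) % 360.0  # measured clockwise from +y
    hour = round(angle / 30.0) % 12
    return 12 if hour == 0 else hour

print(clock_bearing((0, 0), (10, 0)))   # 3  -> target to the right of the virtual object
print(clock_bearing((0, 0), (0, -10)))  # 6  -> target directly behind
```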
It is to be added that providing the visual cue control corresponding to the target object may further include:
and adjusting the display position of the visual prompting control on the interactive interface according to the orientation of the target object.
Because the target object is moving, its orientation relative to the virtual object can be obtained in real time, and the display positions of the visual cue controls on the interactive interface, such as the orientation identifier, the distance identifier, the blood volume identifier of the target object (e.g. the blood volume bar) and the identifier of the number of enemy objects within the preset range of the target object, are adjusted according to the orientation of the target object. For example, when the target object 404 is at the 3 o'clock direction of the virtual object 401, the distance between the two is 15 meters and the number of surrounding enemy objects is 5, the orientation identifier 405 points to the 3 o'clock direction, the distance identifier 405 includes 3 triangular marks, and the displayed number of enemy objects is 5; when the target object 404 moves to the 5 o'clock direction of the virtual object 401, the distance between the two becomes 10 meters and the number of surrounding enemy objects is 3, the orientation identifier 405 points to the 5 o'clock direction, the distance identifier 405 includes 2 triangular marks, and the displayed number of enemy objects is 3.
In order to save screen space, in this example embodiment, the method may further include:
when the target object is detected to be located in the interactive interface presented by the virtual lens, hiding the visual cue control corresponding to the target object.
In this exemplary embodiment, if it is detected that the target object is located in the interactive interface presented by the virtual lens, that is, the target object and the virtual object within the preset monitoring range appear on the current interactive interface at the same time, the virtual object and the target object can be considered to be in the same game scene and the user can obtain the relevant information of the target object directly; at this time, the visual cue control corresponding to the target object may be hidden.
In addition, if the attribute parameter corresponding to the target object does not satisfy the trigger condition, that is, the blood volume value of the target object is not less than the first threshold, or the damage value acting on the target object is not greater than the second threshold, or the number of enemy objects within the preset range of the target object is not greater than the third threshold, the target object can no longer be considered to be in a dangerous state, for example because it has escaped the dangerous state or has been eliminated; at this time, the visual cue control corresponding to the target object may also be hidden.
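The two hiding conditions described above can be summarised in a single check; the sketch below is illustrative and the function name is hypothetical:

```python
def should_hide_cue(target_on_screen: bool, trigger_satisfied: bool) -> bool:
    """
    Hide the visual cue control when the target object is already visible in the interface
    presented by the virtual lens, or when its attribute parameters no longer satisfy the
    trigger condition (e.g. it has left the dangerous state or has been eliminated).
    """
    return target_on_screen or not trigger_satisfied

print(should_hide_cue(target_on_screen=True,  trigger_satisfied=True))   # True: same screen
print(should_hide_cue(target_on_screen=False, trigger_satisfied=False))  # True: no longer in danger
print(should_hide_cue(target_on_screen=False, trigger_satisfied=True))   # False: keep showing the cue
```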
Compared with related game applications that display target object and virtual object information in real time, the method provided in this example displays the visual cue control only when the target object is in a dangerous state, which reduces the amount of information presented on the interactive interface, optimizes the display logic, reduces redundant visual controls and improves screen utilization; it also reduces information transmission and improves game fluency.
It should be added that, if a plurality of target objects and virtual objects are in a dangerous state at the same time, all the target objects in the dangerous state can share information such as the positions and blood volume values of surrounding enemy objects; and if dangerous conditions occur for a plurality of target objects and virtual objects in the same direction, all objects in the dangerous state and all visual cue controls are displayed in an overlapping manner according to the rendering relationship.
In summary, with the information prompting method provided in this example, whether the target object is in a dangerous state can be determined accurately by judging whether its attribute parameters satisfy the trigger condition, so that assistance can be provided in time; through the visual cue control, virtual objects within the monitoring range can be reminded to help teammates in a dangerous state, and the information of a target object in a dangerous state can be learned in time even when the voice interaction function or the face-to-face interaction function is not enabled, which reduces the user's gaming pressure and enhances the game experience; and because the visual cue control is displayed only when the target object is in a dangerous state, the amount of information displayed on the interactive interface is reduced, improving screen utilization and information transmission efficiency.
In an exemplary embodiment of the present disclosure, an information prompting apparatus 600 is further provided, applied to a terminal capable of presenting an interactive interface of at least part of a game scene, the game scene at least including a virtual object. Referring to fig. 6, the apparatus 600 may include:
the parameter determining module 601 may be configured to obtain an attribute parameter of at least one target object within a preset monitoring range, and determine whether the attribute parameter meets a trigger condition;
a control providing module 602, configured to provide a visual cue control corresponding to the target object when the attribute parameter satisfies the trigger condition.
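A minimal sketch of the two-module structure of the apparatus 600 (the class names, the simplified blood-volume-only judgment and the returned control description are illustrative assumptions, not the patent's implementation):

```python
class ParameterJudgmentModule:
    """Acquires attribute parameters of target objects within the monitoring range
    and judges whether they satisfy the trigger condition."""
    def __init__(self, first_threshold: float = 0.30):
        self.first_threshold = first_threshold

    def judge(self, blood_volume: float, max_blood_volume: float) -> bool:
        # Only the blood-volume condition is shown here; the other conditions would be OR-ed in.
        return blood_volume < self.first_threshold * max_blood_volume

class ControlProvidingModule:
    """Provides the visual cue control corresponding to a target object."""
    def provide(self, target_id: str) -> dict:
        return {"target": target_id, "controls": ["orientation", "distance", "attribute"]}

class InformationPromptingDevice:
    def __init__(self):
        self.parameter_judgment = ParameterJudgmentModule()
        self.control_providing = ControlProvidingModule()

    def update(self, target_id: str, blood_volume: float, max_blood_volume: float):
        if self.parameter_judgment.judge(blood_volume, max_blood_volume):
            return self.control_providing.provide(target_id)
        return None

device = InformationPromptingDevice()
print(device.update("teammate_1", blood_volume=120, max_blood_volume=500))
```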
The specific details of each functional module of the information prompting device 600 have been described in detail in the corresponding information prompting method, and therefore are not described herein again.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module" or "system."
An electronic device 800 according to this embodiment of the invention is described below with reference to fig. 7. The electronic device 800 shown in fig. 7 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in fig. 7, the electronic device 800 is in the form of a general purpose computing device. The components of the electronic device 800 may include, but are not limited to: the at least one processing unit 810, the at least one memory unit 820, a bus 830 connecting different system components (including the memory unit 820 and the processing unit 810), and a display unit 840.
Wherein the storage unit stores program code that is executable by the processing unit 810 to cause the processing unit 810 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification. For example, the processing unit 810 may perform the steps as shown in fig. 3: s310, acquiring an attribute parameter of at least one target object in a preset monitoring range, and judging whether the attribute parameter meets a trigger condition; and S320, providing a visual prompt control corresponding to the target object when the attribute parameters meet the trigger conditions.
The storage unit 820 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 8201 and/or a cache memory unit 8202, and may further include a read-only memory unit (ROM) 8203.
The storage unit 820 may also include a program/utility 8204 having a set (at least one) of program modules 8205, such program modules 8205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 830 may be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 900 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 800, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 800 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 850. Also, the electronic device 800 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 860. As shown, the network adapter 860 communicates with the other modules of the electronic device 800 via the bus 830. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 8, a program product 1000 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (12)

1. An information prompting method, applied to a terminal capable of presenting an interactive interface of at least part of a game scene, the game scene comprising at least a virtual object, the method comprising the following steps:
acquiring attribute parameters of at least one target object within a preset monitoring range based on the position of the virtual object, and judging whether the attribute parameters meet a trigger condition; wherein the attribute parameters meeting the trigger condition comprise one or more of the following: a blood volume value of the target object being smaller than a first threshold value, a damage value acting on the target object being larger than a second threshold value, and a number of enemy objects within a preset range being larger than a third threshold value;
when the attribute parameters meet the trigger condition, providing a visual prompt control corresponding to the target object; wherein the visual prompt control comprises one or more of an orientation identifier, a distance identifier, and an attribute identifier, and is used for prompting the virtual object to provide an assistance operation for the target object within the preset monitoring range.
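By way of illustration only, and not as part of the claims, the following Python sketch shows one possible way to evaluate the trigger condition of claim 1 and to collect the target objects for which a visual prompt control should be provided; the class TargetObject, the helper functions, and the threshold values are hypothetical assumptions introduced solely for this example.

from dataclasses import dataclass
from typing import List

@dataclass
class TargetObject:             # hypothetical stand-in for a teammate character
    position: tuple             # (x, y) world coordinates
    blood_volume: float         # current health value
    incoming_damage: float      # damage recently applied to this object
    nearby_enemy_count: int     # enemies within the object's own preset range

# Hypothetical values standing in for the first, second, and third thresholds.
BLOOD_THRESHOLD = 30.0
DAMAGE_THRESHOLD = 50.0
ENEMY_COUNT_THRESHOLD = 3

def meets_trigger_condition(target: TargetObject) -> bool:
    """Return True when any one of the claimed trigger conditions is satisfied."""
    return (target.blood_volume < BLOOD_THRESHOLD
            or target.incoming_damage > DAMAGE_THRESHOLD
            or target.nearby_enemy_count > ENEMY_COUNT_THRESHOLD)

def targets_needing_prompt(virtual_object_pos: tuple,
                           targets: List[TargetObject],
                           monitor_radius: float) -> List[TargetObject]:
    """Collect targets inside the preset monitoring range whose parameters meet the trigger condition."""
    def within_range(t: TargetObject) -> bool:
        dx = t.position[0] - virtual_object_pos[0]
        dy = t.position[1] - virtual_object_pos[1]
        return (dx * dx + dy * dy) ** 0.5 <= monitor_radius

    return [t for t in targets if within_range(t) and meets_trigger_condition(t)]

In such a sketch, the returned list would simply be handed to whatever rendering layer draws the orientation, distance, and attribute identifiers.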
2. The information prompting method of claim 1, wherein providing a visual prompt control corresponding to the target object comprises:
acquiring the position of the target object, and providing a position identifier corresponding to the position of the target object.
3. The information prompting method of claim 2, wherein providing a visual prompt control corresponding to the target object further comprises:
acquiring the current positions of the target object and the virtual object, and calculating the distance between the target object and the virtual object;
and providing a distance identifier according to the distance between the target object and the virtual object, wherein a preset mapping relation exists between the display parameters of the distance identifier and the distance.
4. The information prompting method of claim 3, wherein the display parameters comprise one or more of the transparency, color, size, and brightness of the distance identifier.
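As a non-authoritative illustration of the preset mapping relation recited in claims 3 and 4, the sketch below linearly interpolates the transparency and size of a distance identifier from the distance between the target object and the virtual object; the function names, the choice of linear interpolation, and the numeric bounds are assumptions rather than part of the claimed method.

def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def distance_identifier_params(distance: float, max_distance: float) -> dict:
    """Map the distance to display parameters of the distance identifier.

    Assumed mapping: nearer targets get a larger, more opaque identifier;
    farther targets fade out and shrink.
    """
    t = min(max(distance / max_distance, 0.0), 1.0)   # normalize distance to [0, 1]
    return {
        "transparency": lerp(0.0, 0.8, t),   # 0 = fully opaque, 0.8 = mostly transparent
        "size": lerp(48.0, 16.0, t),         # icon size in pixels
    }

# Example: a teammate 120 units away with a 400-unit monitoring range.
params = distance_identifier_params(120.0, 400.0)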
5. The information prompting method of claim 1, wherein providing a visual prompt control corresponding to the target object further comprises:
providing an attribute identifier according to the attribute parameters of the target object.
6. The information prompting method according to claim 5, wherein the attribute identifier comprises a blood volume bar and/or an enemy object number identifier, wherein the blood volume bar is used for displaying the current blood volume of the target object, and the enemy object number identifier is used for displaying the number of enemy objects within a preset range of the target object.
7. The information prompting method of claim 1, wherein providing a visual prompt control corresponding to the target object comprises:
acquiring the orientation of the target object, and adjusting the display position of the visual prompt control on the interactive interface according to the orientation of the target object.
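Purely as an illustrative sketch of the display-position adjustment in claim 7, the function below pushes the visual prompt control towards the edge of the interactive interface in the direction of the target object relative to the screen centre; the clamping scheme, the margin value, and the name edge_indicator_position are assumptions introduced for this example.

import math

def edge_indicator_position(target_screen_pos, viewport_size, margin=24.0):
    """Place the prompt control along the direction of the target, clamped to an inner margin."""
    width, height = viewport_size
    cx, cy = width / 2.0, height / 2.0
    dx, dy = target_screen_pos[0] - cx, target_screen_pos[1] - cy
    if dx == 0 and dy == 0:
        return cx, cy                       # target is exactly at the centre; nothing to point at
    # Scale the direction vector so the control just touches the inner margin rectangle.
    scale = min((cx - margin) / abs(dx) if dx else math.inf,
                (cy - margin) / abs(dy) if dy else math.inf)
    return cx + dx * scale, cy + dy * scale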
8. The information prompting method according to claim 1, characterized in that the method further comprises:
when the target object is detected to be located within the interactive interface presented by the virtual lens, hiding the visual prompt control corresponding to the target object.
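A minimal sketch of the hiding behaviour in claim 8, assuming the target object's screen-space position and the viewport size are already known; the prompt object and the simple containment test merely stand in for whatever visibility check an actual implementation of the virtual lens would use.

from types import SimpleNamespace

def update_prompt_visibility(control, target_screen_pos, viewport_size) -> None:
    """Hide the visual prompt control when its target is already shown on screen."""
    width, height = viewport_size
    x, y = target_screen_pos
    on_screen = 0 <= x <= width and 0 <= y <= height   # simple viewport containment test
    control.visible = not on_screen                    # hide when the target is in view

# Example: the target projects to (500, 300) inside a 1280x720 viewport, so the prompt is hidden.
prompt = SimpleNamespace(visible=True)
update_prompt_visibility(prompt, (500, 300), (1280, 720))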
9. The information prompting method according to claim 1, wherein the preset monitoring range is adjusted in real time according to preset parameters for a plurality of stages of the overall game progress.
10. An information prompting device, applied to a terminal capable of presenting an interactive interface of at least part of a game scene, the game scene comprising at least a virtual object, the information prompting device comprising:
a parameter judgment module, configured to acquire attribute parameters of at least one target object within a preset monitoring range based on the position of the virtual object, and to judge whether the attribute parameters meet a trigger condition; wherein the attribute parameters meeting the trigger condition comprise one or more of the following: a blood volume value of the target object being smaller than a first threshold value, a damage value acting on the target object being larger than a second threshold value, and a number of enemy objects within a preset range being larger than a third threshold value;
a control providing module, configured to provide a visual prompt control corresponding to the target object when the attribute parameters meet the trigger condition; wherein the visual prompt control comprises one or more of an orientation identifier, a distance identifier, and an attribute identifier, and is used for prompting the virtual object to provide an assistance operation for the target object within the preset monitoring range.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the information prompting method according to any one of claims 1-9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the information prompting method of any one of claims 1-9 via execution of the executable instructions.
CN201711210947.5A 2017-11-28 2017-11-28 Information prompting method and device, electronic equipment and storage medium Active CN107992252B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711210947.5A CN107992252B (en) 2017-11-28 2017-11-28 Information prompting method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711210947.5A CN107992252B (en) 2017-11-28 2017-11-28 Information prompting method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107992252A CN107992252A (en) 2018-05-04
CN107992252B true CN107992252B (en) 2020-12-22

Family

ID=62033373

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711210947.5A Active CN107992252B (en) 2017-11-28 2017-11-28 Information prompting method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN107992252B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108654086B (en) 2018-05-09 2023-06-16 腾讯科技(深圳)有限公司 Method, device and equipment for obtaining attack damage in virtual environment
CN108854069B (en) 2018-05-29 2020-02-07 腾讯科技(深圳)有限公司 Sound source determination method and device, storage medium and electronic device
CN110061892B (en) * 2018-06-13 2023-07-21 上海达龙信息科技有限公司 Method, system, storage medium and electronic device for displaying network status in real time
CN108804013B (en) * 2018-06-15 2021-01-15 网易(杭州)网络有限公司 Information prompting method and device, electronic equipment and storage medium
CN108806153A (en) * 2018-06-21 2018-11-13 北京旷视科技有限公司 Alert processing method, apparatus and system
CN109316746A (en) * 2018-09-25 2019-02-12 北京金山安全软件有限公司 Information prompting method and device in game fighting process and electronic equipment
CN109375856B (en) * 2018-10-16 2021-06-25 维沃移动通信有限公司 Information storage method and device and mobile terminal
CN109966738B (en) 2019-02-22 2020-07-31 网易(杭州)网络有限公司 Information processing method, processing device, electronic device, and storage medium
CN109806596B (en) * 2019-03-20 2023-04-07 网易(杭州)网络有限公司 Game picture display method and device, storage medium and electronic equipment
CN110465090B (en) * 2019-08-16 2023-05-30 腾讯科技(深圳)有限公司 Virtual object control method, device, terminal and storage medium
CN110433493B (en) * 2019-08-16 2023-05-30 腾讯科技(深圳)有限公司 Virtual object position marking method, device, terminal and storage medium
CN110339568B (en) * 2019-08-19 2024-06-21 网易(杭州)网络有限公司 Virtual control display method and device, storage medium and electronic device
CN110801629B (en) * 2019-11-07 2022-08-09 腾讯科技(深圳)有限公司 Method, device, terminal and medium for displaying virtual object life value prompt graph
CN111481929B (en) * 2020-05-18 2023-04-07 网易(杭州)网络有限公司 Virtual character information display method and device, electronic equipment and computer medium
CN111672111A (en) * 2020-05-28 2020-09-18 腾讯科技(深圳)有限公司 Interface display method, device, equipment and storage medium
CN112274923B (en) * 2020-10-22 2022-07-22 广州博冠信息科技有限公司 Game skill information prompting method and device, storage medium and computer equipment
CN113198178B (en) * 2021-06-03 2023-05-26 腾讯科技(深圳)有限公司 Virtual object position prompting method, device, terminal and storage medium
CN114546237A (en) * 2022-02-28 2022-05-27 维沃移动通信有限公司 Interaction method, interaction device, electronic equipment and readable storage medium
CN115501586A (en) * 2022-08-29 2022-12-23 网易(杭州)网络有限公司 Information display method and device for virtual game role and electronic device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9022864B2 (en) * 2011-06-03 2015-05-05 Nintendo Co., Ltd. Apparatus and method for controlling objects on a stereoscopic display
CN102542174B (en) * 2012-01-19 2015-12-16 马卡里 Live RPG system
CN104765520B (en) * 2014-01-06 2019-06-14 腾讯科技(深圳)有限公司 A kind of information processing method and device
JP6031471B2 (en) * 2014-05-30 2016-11-24 グリー株式会社 GAME PROGRAM, COMPUTER CONTROL METHOD, AND COMPUTER
CN106776619B (en) * 2015-11-20 2020-09-04 百度在线网络技术(北京)有限公司 Method and device for determining attribute information of target object
CN109918004B (en) * 2015-12-17 2021-04-23 网易(杭州)网络有限公司 Virtual role control method and device
CN106621321B (en) * 2016-10-14 2017-12-15 福州市马尾区朱雀网络信息技术有限公司 A kind of game object identification method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
THE PLAYER AND THE GAME: Compulsion, Relation, and Potential Space in Video Games; Kriss, A.; Psychoanalytic Psychology; 2016-10-31; pp. 571-584 *
A Brief Analysis of Mobile Control Interaction Design in Mobile Games (手机游戏中移动操控交互方式设计简析); Zhu Yingzhe (朱英哲); Digital Design (数码设计); 2017-06-30; pp. 8-10 *

Also Published As

Publication number Publication date
CN107992252A (en) 2018-05-04

Similar Documents

Publication Publication Date Title
CN107992252B (en) Information prompting method and device, electronic equipment and storage medium
CN107741819B (en) Information processing method, device, electronic equipment and storage medium
CN107648848B (en) Information processing method and device, storage medium, electronic equipment
EP3939681A1 (en) Virtual object control method and apparatus, device, and storage medium
CN108287657B (en) Skill applying method and device, storage medium and electronic equipment
CN108144293A (en) Information processing method, information processing device, electronic equipment and storage medium
CN107890672B (en) Visible sensation method and device, storage medium, the electronic equipment of compensating sound information
CN107694089B (en) Information processing method, device, electronic equipment and storage medium
CN107930119B (en) Information processing method, information processing device, electronic equipment and storage medium
CN105335064B (en) A kind of information processing method and terminal
US20190060745A1 (en) Information Processing Method and Apparatus, Storage Medium, and Electronic Device
CN110115837B (en) Signal generation method, display method, device, equipment and medium in game
CN108211349B (en) Information processing method in game, electronic device and storage medium
CN109260713B (en) Virtual object remote assistance operation method and device, storage medium and electronic equipment
CN108037888B (en) Skill control method, skill control device, electronic equipment and storage medium
CN107823884A (en) Destination object determines method, apparatus, electronic equipment and storage medium
CN112933591A (en) Method and device for controlling game virtual character, storage medium and electronic equipment
CN107967096A (en) Destination object determines method, apparatus, electronic equipment and storage medium
CN107832001B (en) Information processing method, information processing device, electronic equipment and storage medium
CN107992251A (en) Technical ability control method, device, electronic equipment and storage medium
CN111760267A (en) Information sending method and device in game, storage medium and electronic equipment
CN108159697A (en) Virtual objects transfer approach and device, storage medium, electronic equipment
US20230356087A1 (en) Method for controlling virtual object to pick virtual props, terminal, and storage medium
US20230321539A1 (en) Position prompt method and apparatus for virtual object, terminal, and storage medium
CN111530065A (en) Game control method and device, computer storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant