CN113713386A - Information prompting method and device in game and touch terminal

Info

Publication number: CN113713386A
Application number: CN202111021651.5A
Authority: CN (China)
Prior art keywords: game, force feedback, touch, relative direction, user interface
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 张存君
Current Assignee: Netease Hangzhou Network Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd
Priority to: CN202111021651.5A
Publication of: CN113713386A

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 - Controlling the output signals based on the game progress
    • A63F 13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 - Input arrangements for video game devices
    • A63F 13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 - Details of the user interface

Abstract

The invention provides an information prompting method and device in a game and a touch terminal, relates to the technical field of games, and solves the technical problem that information prompts in existing games are not effective. The method comprises the following steps: in response to a specified game event occurring in the game, determining a first relative direction of the corresponding occurrence position of the specified game event in the game scene relative to the position of the virtual character; acquiring a touch point of a touch operation on the graphical user interface, and determining, in the graphical user interface, a second relative direction relative to the touch point according to the first relative direction, wherein the first relative direction and the second relative direction are the same; and outputting a tactile force feedback prompt within a preset range of the touch point based on the second relative direction.

Description

Information prompting method and device in game and touch terminal
Technical Field
The present application relates to the field of game technologies, and in particular, to an information prompting method and apparatus in a game, and a touch terminal.
Background
In conventional games, it is difficult for a player to identify the specific position of a nearby event (an approaching person, an approaching vehicle, a battle, or the like) in the game scene from the direction of its sound, because of factors such as a noisy environment and the limitations of the game terminal. The terminal may instead prompt the player by displaying icons; for example, one existing solution indicates the sound bearing on a minimap with a series of icons (footprints, wheels, bullets, etc.).
However, with these existing information prompting methods, the prompts are easy to miss, and they draw on the player's visual attention and interfere with other operations, resulting in low effectiveness of information prompting in the game.
Disclosure of Invention
The present application aims to provide an information prompting method and device in a game and a touch terminal, so as to solve the technical problem that information prompts in existing games are not effective.
In a first aspect, an embodiment of the present application provides an information prompting method in a game, where a terminal device provides a graphical user interface, where the graphical user interface at least includes a virtual character that is in a game scene of the game and is controlled by a first terminal device, and the method includes:
in response to a specified game event occurring in the game, determining a first relative direction of a corresponding occurrence position of the specified game event in the game scene relative to the position of the virtual character;
acquiring a touch point of a touch operation on the graphical user interface, and determining a second relative direction relative to the touch point in the graphical user interface according to the first relative direction; wherein the first relative direction and the second relative direction are the same;
and outputting a prompt of tactile force feedback within a preset range of the touch point based on the second relative direction.
In one possible implementation, the graphical user interface includes a touch control for controlling the virtual character; the step of obtaining a touch point for touch operation of the graphical user interface and determining a second relative direction in the graphical user interface relative to the touch point according to the first relative direction includes:
and acquiring a touch point of touch operation aiming at the touch control, and determining a second relative direction relative to the touch point in the graphical user interface according to the first relative direction.
In one possible implementation, the touch control includes any one or more of:
a movement control, an attack control, a state control, an adjustment control, and a skill control.
In one possible implementation, the method further comprises:
in response to the occurrence of the specified game event in the game, determining a target event type for the specified game event; each event type corresponds to a force feedback mode;
determining a target force feedback mode corresponding to the target event type;
and controlling the prompted tactile force feedback mode according to the target force feedback mode.
In one possible implementation, the force feedback mode includes any one or more of:
force feedback cadence, force feedback duration, force feedback interval duration, and force feedback frequency.
In one possible implementation, the method further comprises:
determining a level of importance of the specified game event in response to the specified game event occurring in the game;
and controlling the strength of the prompted tactile force feedback according to the force feedback strength corresponding to the importance degree.
In one possible implementation, the importance level includes any one or more of:
the distance between the occurrence position and the virtual character position, the preset grade of the specified game event in the game and the occurrence severity of the specified game event.
In one possible implementation, the specified game event includes any one or more of:
the game event to be prompted by sound, the game event to be prompted by images and the game event to be prompted by vibration.
In one possible implementation, the specified game event comprises a game event to be audibly prompted; the method further comprises the following steps:
in response to a game event to be audibly prompted occurring in the game, determining a target prompt sound type of the audible prompt; wherein each prompt sound type corresponds to a force feedback mode;
determining a target force feedback mode corresponding to the target prompt sound type;
and controlling the prompted tactile force feedback mode according to the target force feedback mode.
In one possible implementation, the specified game event comprises a game event to be audibly prompted; the method further comprises the following steps:
in response to a game event to be audibly prompted occurring in the game, determining the prompt sound volume of the audible prompt;
and controlling the strength of the prompted tactile force feedback according to the force feedback strength corresponding to the prompt sound volume.
In one possible implementation, the information of the first relative direction includes: a first included angle between a first straight line and a first specific direction of the virtual character; wherein the first straight line is a connection line between the occurrence position and the virtual character position;
the information of the second relative direction includes: a second included angle between a second straight line and a second specific direction of the touch point; the second straight line is a connecting line between the position of the tactile force feedback and the touch point;
the first specific direction is the same as the second specific direction;
the first included angle is equal to the second included angle.
In one possible implementation, the step of outputting a prompt of haptic force feedback within a preset range of the touch point based on the second relative direction includes:
determining an intersection point between a ray from the touch point along the second relative direction and a preset range boundary of the touch point as a force application point;
outputting a haptic force feedback cue at the force application point.
In a second aspect, an information prompting apparatus in a game is provided, where a terminal device provides a graphical user interface, where the graphical user interface at least includes a virtual character that is in a game scene of the game and is controlled by a first terminal device, and the apparatus includes:
the first determination module is used for responding to the occurrence of a specified game event in the game, and determining a first relative direction of the corresponding occurrence position of the specified game event in the game scene relative to the position of the virtual character;
a second determining module, configured to obtain a touch point for a touch operation of the graphical user interface, and determine, according to the first relative direction, a second relative direction in the graphical user interface relative to the touch point; wherein the first relative direction and the second relative direction are the same;
and the output module is used for outputting a prompt of tactile force feedback within the preset range of the touch point based on the second relative direction.
In a third aspect, an embodiment of the present application further provides a touch terminal, including a memory and a processor, where the memory stores a computer program that is executable on the processor, and the processor implements the method of the first aspect when executing the computer program.
In a fourth aspect, this embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions, which, when invoked and executed by a processor, cause the processor to perform the method of the first aspect.
The embodiment of the application brings the following beneficial effects:
according to the information prompting method, the information prompting device and the touch terminal in the game, the first relative direction of the occurrence position of the designated game event in the game scene relative to the position of the virtual character can be determined in response to the occurrence of the designated game event in the game, then the touch point of the touch operation aiming at the graphical user interface can be obtained, the second relative direction relative to the touch point is determined in the graphical user interface according to the first relative direction, the first relative direction and the second relative direction are the same, and therefore the prompt of the tactile force feedback is output in the preset range of the touch point based on the second relative direction. In the scheme, the relative direction of the position of the appointed game event relative to the position of the virtual character is the same as the relative direction of the tactile feedback relative to the touch control point of the touch control operation, so that the prompting of the tactile feedback can more pertinently and more effectively prompt the player the direction of the position of the appointed game event relative to the virtual character per se, the effectiveness of information prompting in the game is improved, and the technical problem of low effectiveness of information prompting in the existing game is solved. And the player can intuitively feel the prompt of the tactile force feedback through the tactile feedback, and the visual attention of the player can be still concentrated in the scene when the player feels the prompt.
Drawings
To illustrate the detailed description of the present application or the technical solutions in the prior art more clearly, the drawings needed for the detailed description or the prior art description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 shows a schematic structural diagram of a touch terminal provided in an embodiment of the present application;
fig. 3 is a schematic view of a usage scenario of a touch terminal according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of an information prompting method in a game according to an embodiment of the present application;
fig. 5 is a schematic diagram of a touch terminal for displaying a graphical user interface according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of another touch terminal for displaying a graphical user interface according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of another touch terminal for displaying a graphical user interface according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of different force feedback modes provided by an embodiment of the present application;
fig. 9 is a schematic view of another touch terminal for displaying a graphical user interface according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of an information prompt device in a game according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprising" and "having," and any variations thereof, as referred to in the embodiments of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
At present, in mobile shooting games, it is hard for a player to be in an environment as quiet as that of a PC or console player and to identify the specific position of a nearby event (an approaching person, an approaching vehicle, a battle, and the like) from the direction of its sound. The more common current solution in mobile games is to represent the sound bearing on a minimap with a series of icons (footprints, wheels, bullets, etc.). However, it is difficult for the player to both hear the sound and identify its position, and the orientation of the minimap does not match the direction the player is facing, so when trying to locate the sound source in the field of view the player has to observe and reason on their own, and the visual focus has to switch back and forth between the minimap and the 3D scene, which is inconvenient.
As these defects show, prompts produced by existing information prompting methods are easy to miss, which results in low effectiveness of information prompting in games.
Based on this, the embodiments of the present application provide an in-game information prompting method and device and a touch terminal. The method can replace prompts such as sounds and icons with directional tactile force feedback, and can alleviate the technical problem that existing in-game information prompting is not effective.
In one embodiment of the present application, the information prompting method in the game may be executed on a local terminal device or a server. When the information prompting method in the game is operated on the server, the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an optional embodiment, various cloud applications may run under the cloud interaction system, for example cloud games. Taking cloud games as an example, a cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the in-game information prompting method are completed on a cloud game server, while the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the cloud game server that performs the information processing is in the cloud. When playing, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the client device over the network; finally, the client device decodes the data and outputs the game picture.
In an optional implementation, taking a game as an example, the local terminal device stores the game program and is used for presenting the game picture. The local terminal device interacts with the player through a graphical user interface, that is, the game program is downloaded, installed and run in the conventional way on a touch device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, the interface may be rendered and displayed on a display screen of the terminal, or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In a possible implementation manner, an embodiment of the present application provides an information prompting method in a game, where a graphical user interface is provided through a terminal device, where the terminal device may be the aforementioned local terminal device, and may also be the aforementioned client device in a cloud interaction system.
For example, as shown in fig. 1, fig. 1 is a schematic view of an application scenario provided in the embodiment of the present application. The application scenario may include a touch terminal (e.g., a cell phone 102) and a server 101, and the touch terminal may communicate with the server 101 through a wired network or a wireless network. The touch terminal is used for operating a virtual desktop, and can interact with the server 101 through the virtual desktop to control a virtual object in the server 101.
The touch terminal of the present embodiment is described by taking the mobile phone 102 as an example. The cell phone 102 may include user input sensors, such as force sensors, touch sensors, motion sensors, and other input components. To provide output to a user, the device may have visual output components such as a display, audio output components, and tactile output components. A tactile output component may apply a force in a given direction against a surface of the device housing, either a sidewall surface or another device surface. Control circuitry in the device may direct the tactile output components to generate a force in a direction perpendicular to or tangential to the surface of the housing, and the force may be provided as feedback when the control circuitry directs the display to provide visual content to the user based on the user input. The touch screen is based on the electrovibration principle: periodic electrostatic forces generated during operation can push against the finger while it is in contact, which, put simply, gives an electronic device with directional tactile output the ability to create the illusion of touching a real object.
The handset 102 also includes Radio Frequency (RF) circuitry 210, memory 220, a touch screen 230, a processor 240, and the like. Those skilled in the art will appreciate that the handset configuration shown in fig. 2 is not intended to be limiting: it may include more or fewer components than those shown, combine certain components, split certain components, or arrange the components differently. Those skilled in the art will also appreciate that the touch screen 230 is part of a user interface (UI) and that the cell phone 102 may include fewer user interface components than illustrated, or the same ones.
The RF circuitry 210 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global system for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 220 may be used for storing software programs and modules, and the processor 240 executes various functional applications and data processing of the cellular phone 102 by running the software programs and modules stored in the memory 220. The memory 220 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function, and the like, and the data storage area may store data created from use of the handset 102, and the like. Further, the memory 220 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The touch screen 230 may be used to display a graphical user interface and receive user operations with respect to the graphical user interface. A particular touch screen 230 may include a display panel and a touch panel. The display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. The touch panel may collect contact or non-contact operations of a user on or near the touch panel (for example, as shown in fig. 3, operations of the user on or near the touch panel using any suitable object or accessory such as a finger 301, a stylus pen, etc.), and generate preset operation instructions. In addition, the touch panel may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction and gesture of a user, detects signals brought by touch operation and transmits the signals to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into information that can be processed by the processor, sends the information to the processor 240, and receives and executes commands sent from the processor 240. In addition, the touch panel may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, a surface acoustic wave, and the like, and may also be implemented by any technology developed in the future. Further, the touch panel may cover the display panel, a user may operate on or near the touch panel covered on the display panel according to a graphical user interface displayed by the display panel, the touch panel detects an operation thereon or nearby and transmits the operation to the processor 240 to determine a user input, and the processor 240 provides a corresponding visual output on the display panel in response to the user input. In addition, the touch panel and the display panel can be realized as two independent components or can be integrated.
The processor 240 is the control center of the handset 102, connects various parts of the entire handset using various interfaces and lines, and performs various functions and processes of the handset 102 by running or executing software programs and/or modules stored in the memory 220 and calling data stored in the memory 220, thereby performing overall monitoring of the handset.
Embodiments of the present application are further described below with reference to the accompanying drawings.
Fig. 4 is a schematic flowchart of an information prompting method in a game according to an embodiment of the present application. The method can be applied to a terminal device capable of presenting a graphical user interface (such as the mobile phone 102 shown in fig. 2); the graphical user interface is provided through the terminal device and at least includes a virtual character that is in a game scene of the game and is controlled by a first terminal device. As shown in fig. 4, the method includes:
In step S410, in response to a specified game event occurring in the game, a first relative direction of the corresponding occurrence position of the specified game event in the game scene relative to the position of the virtual character is determined.
Illustratively, as shown in fig. 5, the virtual character 501 is a player-controlled virtual character and an enemy 502 is approaching from a distance; the approach of the enemy 502 can be one of the specified game events. The line connecting the position of the event and the position of the virtual character is shown by dashed line 504, the direction the virtual character faces is shown by dashed line 505, and the angle between the two directions is shown as angle 503. From this, the first relative direction of the corresponding occurrence position of the specified game event in the game scene relative to the position of the virtual character can be determined.
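Purely as an illustration (not part of the disclosed claims), the following is a minimal sketch of how such a first relative direction could be computed from the scene positions and the facing direction; the function name, the (x, y) coordinate convention and the use of degrees are assumptions made only for this example.

```python
import math

def first_relative_direction(event_pos, character_pos, facing_deg):
    """Signed angle (degrees) of the event position relative to the virtual
    character's facing direction, i.e. angle 503 between line 504 and
    direction 505 in fig. 5. Positions are (x, y) scene coordinates."""
    dx = event_pos[0] - character_pos[0]
    dy = event_pos[1] - character_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))  # direction of line 504
    angle = bearing - facing_deg                # offset from facing direction 505
    return (angle + 180.0) % 360.0 - 180.0      # normalized to [-180, 180)
```

Whichever sign convention is chosen, the same convention has to be applied on the interface side so that angle 603 ends up equal to angle 503.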
In step S420, a touch point for a touch operation of the graphical user interface is obtained, and a second relative direction with respect to the touch point is determined in the graphical user interface according to the first relative direction.
Wherein the first relative direction and the second relative direction are the same.
For example, as shown in fig. 6, when the system detects that a finger touches the graphical user interface, the position of the touch point 601 is shown as a shaded small circle. A second relative direction relative to the touch point of the touch operation, i.e. the direction indicated by dashed line 602, can then be determined according to the first relative direction, and the angle 603 is equal to the angle 503 in fig. 5.
And step S430, outputting a prompt of tactile force feedback within a preset range of the touch point based on the second relative direction.
For example, as shown in fig. 6, a circular area 604 is set up in the graphical user interface with the touch point 601 as the center, and the circular area 604 may be a preset range, so as to output a tactile force feedback prompt within the range.
In practical application, when a sound occurs around the virtual character and the player's finger is within the preset range, the player can feel a force coming from the direction of the sound source. The force is kept consistent with the sound, different types of sound reminders use different force modes, and the small icon prompts on the minimap are still retained.
It should be noted that the preset range may be a relatively large or a relatively small area of the graphical user interface, and may be a square, a rectangle, a ring, or another shape (e.g., a circle), which is not limited in this exemplary embodiment.
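As a further illustration, and only under the assumptions that the preset range is the circular area 604 and that the shared reference direction (the "second specific direction") is straight up on the screen, the second relative direction at the touch point could be derived from the first angle as a unit vector; the function name and the 90-degree reference are hypothetical.

```python
import math

def second_relative_direction(first_angle_deg, reference_deg=90.0):
    """Unit vector of the second relative direction at the touch point 601.
    Because the first and second relative directions are the same, the vector
    is the shared reference direction (assumed here to be screen-up, i.e. 90
    degrees) rotated by the first angle, so angle 603 equals angle 503."""
    theta = math.radians(reference_deg + first_angle_deg)
    return (math.cos(theta), math.sin(theta))
```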
In the embodiment of the application, because the relative direction of the position of the specified game event with respect to the position of the virtual character is the same as the relative direction of the tactile force feedback with respect to the touch point of the touch operation, the tactile force feedback prompt can indicate to the player, in a more targeted and effective way, the direction of the specified game event relative to the virtual character itself. This improves the effectiveness of in-game information prompting and solves the technical problem that information prompts in existing games are not effective.
The above steps are described in detail below.
In some embodiments, the preset range may be a display range of a touch control in the graphical user interface, so that the player receives a prompt for tactile force feedback while controlling the virtual character through operation. As an example, a touch control for controlling the virtual character is included in the graphical user interface; the step S420 may specifically include the following steps:
step a), acquiring a touch point of touch operation aiming at the touch control, and determining a second relative direction relative to the touch point in the graphical user interface according to the first relative direction.
For example, as shown in fig. 6, there is a touch control in the graphical user interface, and the touch control may coincide with the circular area 604. After the player touches the touch control, the system detects that the finger is touching the graphical user interface, and the position of the touch point 601 is shown as a shaded small circle. A second relative direction relative to the touch point of the touch operation, i.e. the direction indicated by dashed line 602, can be determined according to the first relative direction, and the angle 603 is equal to the angle 503 in fig. 5.
If the player instead had to press an area of the graphical user interface that does not contain a touch control in order to receive the tactile force feedback prompt, for example pressing the preset area with a thumb, that thumb could not be used for other operations while receiving the prompt, and the player's game operation would be affected. By setting the display range of a touch control in the graphical user interface as the preset range, the player can receive the tactile force feedback prompt while operating, the two do not interfere with each other, and the player's game experience is improved.
Based on the step a), the touch control can comprise a plurality of types, so that the output of the tactile force feedback prompt is more flexible. As an example, a touch control includes any one or more of:
a movement control, an attack control, a state control, an adjustment control, and a skill control.
Illustratively, as shown in fig. 7, there are a variety of touch controls in the graphical user interface, including but not limited to the illustrated skill control 701, movement control 702, status control 703, and attack control 704.
In practical application, because game modes are varied, there are many types of touch controls in the graphical user interface and together they occupy a considerable area of it. Setting the display range of the touch controls as the preset range therefore allows the tactile force feedback prompt to be output over a larger area, so the player can operate while receiving the prompt, the two do not interfere with each other, and the player's game experience is improved.
In some embodiments, different game event types may correspond to different target force feedback modes to enable providing a player with a variety of different cues that are differentiated. As an example, the method may further comprise the steps of:
step b), responding to the occurrence of a specified game event in the game, and determining the target event type of the specified game event;
step c), determining a target force feedback mode corresponding to the target event type;
and d), controlling the touch type force feedback mode of the prompt according to the target force feedback mode.
For step b) above, each event type therein corresponds to a force feedback pattern.
For example, there may be multiple target event types in the game, such as another virtual character moving on foot nearby, or teammates engaging enemies nearby. A different corresponding target force feedback mode can be set for each target event type of the specified game event: for another virtual character moving on foot nearby, the corresponding force feedback mode may be a slow rhythm of force application; for teammates engaging enemies nearby, the corresponding force feedback mode may be a rapid, urgent rhythm of force application.
Setting different target force feedback modes for different target event types gives the tactile force feedback prompts distinguishability, so the player can tell from the prompt which event has occurred, and more game information can be provided to the player in a timely and accurate way.
Based on the steps b), c) and d), the force feedback mode can comprise multiple modes, and more flexible prompt with discrimination can be realized by controlling the rhythm, duration and the like of the force. As one example, the force feedback mode includes any one or more of:
force feedback cadence, force feedback duration, force feedback interval duration, and force feedback frequency.
For example, fig. 8 shows force feedback modes corresponding to different event types, where a solid segment represents a duration in which force is applied and a gap represents a duration in which no force is applied; the patterns formed by different on/off ratios can be understood as force feedback modes of different frequencies. For example, other virtual characters engaging in battle nearby corresponds to a high-frequency force feedback mode; other virtual characters moving on foot nearby corresponds to a medium-frequency force feedback mode; and virtual vehicles moving nearby corresponds to a low-frequency force feedback mode.
By controlling the force feedback rhythm, force feedback duration, force feedback interval duration, force feedback frequency, and so on, each target event type has its own target force feedback mode, the tactile force feedback prompts are distinguishable, the player can tell from the prompt which event has occurred, and more game information can be provided to the player accurately.
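The mapping from event type to force feedback mode could be sketched as follows; the event names, the millisecond values and the `set_force` callback are illustrative assumptions rather than parameters taken from the application.

```python
import time
from dataclasses import dataclass

@dataclass
class ForceFeedbackMode:
    on_ms: int    # duration of each force application
    off_ms: int   # interval with no force between applications
    pulses: int   # number of applications making up one prompt

# Hypothetical mapping from target event type to target force feedback mode;
# shorter on/off times give a higher force feedback frequency (cf. fig. 8).
EVENT_MODES = {
    "nearby_battle":        ForceFeedbackMode(on_ms=40,  off_ms=40,  pulses=8),   # high frequency
    "nearby_foot_movement": ForceFeedbackMode(on_ms=80,  off_ms=120, pulses=4),   # medium frequency
    "nearby_vehicle":       ForceFeedbackMode(on_ms=160, off_ms=240, pulses=2),   # low frequency
}

def play_mode(mode: ForceFeedbackMode, set_force) -> None:
    """Drive the haptic output according to one force feedback mode.
    `set_force(on: bool)` stands in for whatever haptic API the touch
    terminal actually exposes."""
    for _ in range(mode.pulses):
        set_force(True)
        time.sleep(mode.on_ms / 1000)
        set_force(False)
        time.sleep(mode.off_ms / 1000)
```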
In some embodiments, the importance of the game event may correspond to the strength of the prompted haptic force feedback, e.g., the strength of the prompted haptic force feedback may be adjusted according to the importance of the game event. As an example, the method may further comprise the steps of:
step e), responding to the occurrence of the specified game event in the game, and determining the importance degree of the specified game event;
and f), controlling the magnitude of the prompted tactile force feedback according to the force feedback force corresponding to the importance degree.
Illustratively, the two events "enemies belonging to different camps are fighting nearby" and "teammates are fighting enemies nearby" belong to the same event type, i.e. "other nearby virtual characters are engaging in battle". However, teammates fighting enemies is clearly more important than enemies fighting each other. The strength of the prompted tactile force feedback can therefore be controlled so that it is distinguishable: the tactile force feedback corresponding to teammates fighting enemies can be stronger, and the tactile force feedback corresponding to enemies fighting each other can be weaker.
By adjusting the strength of the prompted tactile force feedback according to the importance of the specified game event and the force feedback strength corresponding to that importance, the player can distinguish how important a game event is. This provides the player with more comprehensive in-match information, lets the player obtain it accurately and plan game strategy based on it, and improves the player's game experience.
Based on the steps e) and f), the importance degree of the game event can comprise various degrees, so that the setting of the game prompt is more flexible. As an example, the importance level includes any one or more of:
the distance between the occurrence position and the position of the virtual character, the preset grade of the appointed game event in the game and the occurrence severity of the appointed game event.
Illustratively, the two events "enemies belonging to different camps are fighting nearby" and "teammates are fighting enemies nearby" belong to the same event type, i.e. "other nearby virtual characters are engaging in battle". However, "teammates are fighting enemies nearby" is more important than "enemies belonging to different camps are fighting nearby", so the tactile force feedback corresponding to teammates fighting enemies can be controlled to be stronger, and the tactile force feedback corresponding to enemies fighting each other to be weaker.
As another example, the two events "other virtual characters are moving within 10 meters" and "other virtual characters are moving within 20 meters" belong to the same event type, i.e. "other virtual characters are moving nearby". However, the preset level of "other virtual characters are moving within 10 meters" is higher than that of "other virtual characters are moving within 20 meters", because closer virtual characters are more threatening. The tactile force feedback corresponding to movement within 10 meters can therefore be controlled to be stronger, and that corresponding to movement within 20 meters to be weaker.
As another example, for the event type "a teammate is injured", losing 50% of the health value is more severe than losing 10% of it. The tactile force feedback corresponding to a teammate losing 50% of the health value can therefore be controlled to be stronger, and that corresponding to losing 10% to be weaker.
By grading the importance of the specified game event according to factors such as the distance between its occurrence position and the position of the virtual character, its preset level in the game, and its occurrence severity, tactile force feedback prompts of different strengths are produced, the prompts are distinguishable, and the player can learn the details of the relevant information.
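One possible way to grade importance and map it to a force strength is sketched below; the distance cutoff, the weights and the 0-to-1 strength scale are assumptions made only to make the example concrete.

```python
def importance_score(distance_m, preset_level, severity):
    """Combine the three importance factors named above into one score.
    distance_m: distance between the occurrence position and the virtual
    character; preset_level and severity are assumed normalized to 0..1."""
    proximity = max(0.0, 1.0 - distance_m / 100.0)  # closer events matter more
    return 0.4 * proximity + 0.3 * preset_level + 0.3 * severity

def force_strength(importance, min_force=0.2, max_force=1.0):
    """Map an importance score in 0..1 to a haptic force strength in 0..1."""
    importance = min(max(importance, 0.0), 1.0)
    return min_force + (max_force - min_force) * importance
```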
In some embodiments, the designated game events may include a variety such that the haptic force feedback cues can be adapted to a variety of game events. As one example, the specified game event includes any one or more of:
the game event to be prompted by sound, the game event to be prompted by images and the game event to be prompted by vibration.
For example, "other virtual characters moving on foot nearby" may be used as a game event to be audibly prompted, and when other virtual characters moving on foot nearby, a footstep may be used as a prompt to prompt the player that other virtual characters moving on foot nearby are present.
As another example, "other virtual characters are nearby to engage" may be used as a game event to be graphically prompted, and when other virtual characters are nearby to engage, a bullet image may be provided on a small map in the graphical user interface as a prompt to indicate to the player which direction other virtual characters are in engagement.
As another example, "there is a virtual vehicle moving nearby" may be used as a game event to be prompted by vibration, and when there is a virtual vehicle moving nearby, the game device may vibrate as a prompt to prompt the player that there is a virtual vehicle moving nearby.
The information prompting method is adaptive to various game events, so that a player can acquire various information in a game office only by touching the screen with fingers, the player can still acquire the information in the game office under the condition that the player cannot hear game sound, the player can be concentrated in the content in the game main picture, the phenomenon that the attention is dispersed due to the observation of a small map is avoided, and the game experience of the player is improved.
In some embodiments, for game events to be audibly prompted, the target prompt sound type for the audible prompt corresponds to the haptic force feedback mode of the prompt. As one example, the specified game event includes a game event to be audibly prompted; the method may further comprise the steps of:
step g), in response to a game event to be audibly prompted occurring in the game, determining a target prompt sound type of the audible prompt;
step h), determining a target force feedback mode corresponding to the target prompt sound type;
and step i), controlling the touch type force feedback mode of the prompt according to the target force feedback mode.
For step g) above, each prompt sound type corresponds to a force feedback mode.
For example, fig. 8 can also be read as showing force feedback modes corresponding to different target prompt sound types to be audibly prompted, where a solid segment represents a duration in which force is applied and a gap represents a duration in which no force is applied; the patterns formed by different on/off ratios can be understood as force application modes of different frequencies. For example, gunshots correspond to a high-frequency force application mode; footsteps correspond to a medium-frequency force application mode; and vehicle sounds correspond to a low-frequency force application mode.
By controlling the force feedback rhythm, force feedback duration, force feedback interval duration, force feedback frequency, and so on, each target prompt sound type to be audibly prompted has its own target force feedback mode, and the tactile force feedback prompts are distinguishable, replacing the sound prompts more completely. Even when the player cannot hear the game sound, the player can still tell from the tactile force feedback prompt which event has occurred, and more game information can be provided to the player in a timely and accurate way.
In some embodiments, for a game event to be audibly prompted, the prompt sound volume of the audible prompt corresponds to the strength of the prompted tactile force feedback. As one example, the specified game event includes a game event to be audibly prompted; the method may further comprise the following steps:
step j), in response to a game event to be audibly prompted occurring in the game, determining the prompt sound volume of the audible prompt;
and k), controlling the strength of the prompted tactile force feedback according to the force feedback strength corresponding to the prompt sound volume.
Illustratively, for the two events "other virtual characters are moving within 10 meters" and "other virtual characters are moving within 20 meters", the footsteps of virtual characters moving within 10 meters are louder than those of virtual characters moving within 20 meters. The tactile force feedback corresponding to movement within 10 meters can therefore be controlled to be stronger, and that corresponding to movement within 20 meters to be weaker.
By controlling the strength of the prompted tactile force feedback according to the force feedback strength corresponding to the prompt sound volume, tactile force feedback prompts of different strengths are produced and are distinguishable: the player can sense how loud the sound is from the difference in force, so the sound prompt is replaced more completely. Even when the player cannot hear the game sound, the loudness can still be distinguished from the strength of the tactile force feedback, and more comprehensive in-match information can be provided to the player in a timely and accurate way.
In some embodiments, the direction of the sound source relative to the virtual character in the game is consistent with the direction of the tactile force relative to the touch point, and the relative direction can be represented by an included angle, so that the accuracy of the relative direction and the convenience of data processing are improved. As an example, the information of the first relative direction includes: a first included angle between the first straight line and a first specific direction of the virtual character; wherein, the first straight line is a connecting line between the occurrence position and the virtual character position;
the information of the second relative direction includes: a second included angle between the second straight line and a second specific direction of the touch point; the second straight line is a connecting line between the position of the tactile force feedback and the touch point;
the first specific direction is the same as the second specific direction;
the first included angle is equal to the second included angle.
Illustratively, as shown in fig. 5 and fig. 6, the virtual character 501 is a player-controlled virtual character, and an enemy 502 approaching from a distance produces footsteps as it moves. The line connecting the sound source position and the virtual character position is shown by dashed line 504, the direction the virtual character faces is shown by dashed line 505, and the angle between the two directions is shown as angle 503, so a first relative direction of the corresponding position of the sound source in the game scene relative to the position of the virtual character can be determined. As shown in fig. 6, when the system detects that a finger touches the graphical user interface, the position of the touch point 601 is shown as a shaded small circle; a second relative direction relative to the touch point of the touch operation, i.e. the direction indicated by dashed line 602, can be determined according to the first relative direction, and the angle 603 is equal to the angle 503 in fig. 5.
In actual operation, as shown in fig. 9, the virtual character 901 is a character controlled by the player, and the virtual character 902 is a sound source producing footsteps. First, the system detects the sounds produced around the game scene and, using existing techniques, records for each sound its source type, its source position, the included angle Ri between the line connecting the source to the player's position and the player's current facing direction, and the received volume Vi. Then, when the system detects that a finger presses the touch control, it applies force with the contact point 903 as the center. The included angle between the direction of the force and the vertical direction of the touch control is Ri, and the magnitude of the applied force is Fi = a * Vi (where a is a preset constant).
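A minimal sketch of this bookkeeping, assuming per-sound records and the relation Fi = a * Vi as stated above; the data layout and the concrete value of the constant `a` are illustrative.

```python
from dataclasses import dataclass

@dataclass
class SoundCue:
    source_type: str   # e.g. gunshot, footstep, vehicle
    angle_deg: float   # Ri: angle to the player's current facing direction
    volume: float      # Vi: received volume of the sound

A = 0.8  # preset constant `a` (illustrative value only)

def forces_while_pressed(cues):
    """For each detected sound, the angle Ri and force magnitude Fi = a * Vi
    to be applied around the contact point while the touch control is held."""
    return [(cue.angle_deg, A * cue.volume) for cue in cues]
```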
By keeping the direction of the feedback force consistent with and corresponding to the direction of the sound source in the game scene, the tactile force feedback prompt can indicate the direction of the sound source accurately, so the player can accurately obtain the relevant information about the game event to be prompted by sound during the match.
In some embodiments, the tactile force may be delivered at the preset range boundary, along the second relative direction from the touch point, making it easier for the player to accurately determine the direction from the application point. As an example, the step S430 may specifically include the following steps:
step l), determining an intersection point between a ray along a second relative direction from the touch point and a preset range boundary of the touch point as a force application point;
and m), outputting a prompt of tactile force feedback at the force application point.
Illustratively, as shown in fig. 9, the intersection point 904 between a ray from the touch point along the second relative direction and the preset range boundary of the touch point is the force application point, and the force takes the form of vibration at that point. Different sound source types have different force patterns, characterized mainly by the duration and the interval of force application. If the player's finger receives forces from several directions at the same time, the different application points do not interfere with one another, so the player can clearly distinguish the prompts provided by the several forces.
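A sketch of steps l) and m) under the assumption that the preset range boundary is a circle of radius r around the touch point (another boundary shape would need a different intersection computation); `emit_vibration` is a placeholder for the terminal's actual directional haptic output call.

```python
import math

def force_application_point(touch_point, angle_deg, radius, reference_deg=90.0):
    """Intersection point 904 of the ray from the touch point along the second
    relative direction with a circular preset-range boundary of radius r."""
    theta = math.radians(reference_deg + angle_deg)
    return (touch_point[0] + radius * math.cos(theta),
            touch_point[1] + radius * math.sin(theta))

def output_prompts(touch_point, radius, cues, emit_vibration):
    """Output one haptic cue per detected sound. Distinct application points
    keep simultaneous prompts from different directions distinguishable.
    `emit_vibration(point, strength, pattern)` is a hypothetical callback."""
    for angle_deg, strength, pattern in cues:
        point = force_application_point(touch_point, angle_deg, radius)
        emit_vibration(point, strength, pattern)
```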
Fig. 10 is a schematic structural diagram of an in-game information prompting apparatus 1000. The apparatus can be applied to a touch terminal capable of running a game program and providing a graphical user interface, where the graphical user interface at least includes a virtual character that is in a game scene of the game and is controlled by a first terminal device. As shown in fig. 10, the in-game information prompting apparatus 1000 includes:
a first determining module 1001, configured to determine, in response to a specific game event occurring in a game, a first relative direction of a corresponding occurrence position of the specific game event in a game scene with respect to a position of a virtual character;
a second determining module 1002, configured to obtain a touch point for a touch operation of the graphical user interface, and determine a second relative direction with respect to the touch point in the graphical user interface according to the first relative direction; wherein the first relative direction and the second relative direction are the same;
and an output module 1003, configured to output a prompt of the haptic force feedback within a preset range of the touch point based on the second relative direction.
In some embodiments, the graphical user interface includes a touch control for controlling the virtual character; the second determining module 1002 is specifically configured to:
and acquiring a touch point of touch operation aiming at the touch control, and determining a second relative direction relative to the touch point in the graphical user interface according to the first relative direction.
In some embodiments, the touch control comprises any one or more of:
a movement control, an attack control, a state control, an adjustment control, and a skill control.
In some embodiments, the apparatus further comprises:
a third determining module, configured to determine a target event type of a specified game event in response to the occurrence of the specified game event in the game; each event type corresponds to a force feedback mode;
the fourth determining module is used for determining a target force feedback mode corresponding to the target event type;
the first control module is used for controlling the prompted tactile force feedback mode according to the target force feedback mode.
In some embodiments, the force feedback mode includes any one or more of:
force feedback cadence, force feedback duration, force feedback interval duration, and force feedback frequency.
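Illustratively, the correspondence between event types and force feedback modes can be represented as a lookup table (the event types and parameter values below are invented for the example and are not taken from the patent):

```python
# Each event type corresponds to one force feedback mode; a mode bundles the
# force feedback rhythm, per-pulse duration, interval duration and frequency.
FORCE_FEEDBACK_MODES = {
    "footstep":  {"rhythm": "double", "duration_ms": 40,  "interval_ms": 120, "frequency_hz": 170},
    "gunfire":   {"rhythm": "burst",  "duration_ms": 25,  "interval_ms": 60,  "frequency_hz": 230},
    "explosion": {"rhythm": "single", "duration_ms": 180, "interval_ms": 0,   "frequency_hz": 130},
}

def target_force_feedback_mode(target_event_type):
    # fourth determining module: look up the mode for the target event type
    return FORCE_FEEDBACK_MODES[target_event_type]
```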
In some embodiments, the apparatus further comprises:
a fifth determining module, configured to determine an importance degree of a specified game event in response to the occurrence of the specified game event in the game;
and the second control module is used for controlling the strength of the tactile force feedback of the prompt according to the force feedback strength corresponding to the importance degree.
In some embodiments, the degree of importance includes any one or more of:
the distance between the occurrence position and the position of the virtual character, the preset grade of the appointed game event in the game and the occurrence severity of the appointed game event.
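Illustratively, the three factors can be folded into a single force feedback strength as sketched below (the weights and the normalisation are assumptions made for illustration only):

```python
def force_feedback_strength(distance, max_distance, preset_level, max_level,
                            severity, max_severity):
    """Combine the importance factors into a strength in [0, 1]: closer
    events, higher preset levels and more severe events all raise it."""
    proximity = 1.0 - min(distance / max_distance, 1.0)
    level = preset_level / max_level
    sev = severity / max_severity
    return max(0.0, min(1.0, 0.5 * proximity + 0.25 * level + 0.25 * sev))

# Example: a nearby (30 of 100), level-2-of-3, severity-4-of-5 event.
print(round(force_feedback_strength(30, 100, 2, 3, 4, 5), 3))  # 0.717
```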
In some embodiments, the specified game event comprises any one or more of:
the game event to be prompted by sound, the game event to be prompted by images and the game event to be prompted by vibration.
In some embodiments, the specified game event comprises a game event to be audibly prompted; the device also includes:
a sixth determining module, configured to determine, in response to a game event to be prompted by sound occurring in the game, a target prompting sound type of the sound prompt to be made; wherein each prompting sound type corresponds to a force feedback mode;
a seventh determining module, configured to determine a target force feedback mode corresponding to the target prompting sound type;
and a third control module, configured to control the tactile force feedback mode of the prompt according to the target force feedback mode.
In some embodiments, the specified game event comprises a game event to be audibly prompted; the device also includes:
an eighth determining module, configured to determine, in response to a game event to be prompted by sound occurring in the game, the prompting sound volume of the sound prompt to be made;
and a fourth control module, configured to control the strength of the tactile force feedback of the prompt according to the force feedback strength corresponding to the prompting sound volume.
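Illustratively, the prompting sound volume can be mapped onto a force feedback strength as follows (the linear mapping and the value ranges are assumptions for the example):

```python
def strength_from_prompt_volume(volume, max_volume=1.0,
                                min_strength=0.2, max_strength=1.0):
    """Map the prompting sound volume linearly onto a force feedback
    strength, so louder in-game cues produce stronger tactile cues."""
    ratio = max(0.0, min(volume / max_volume, 1.0))
    return min_strength + (max_strength - min_strength) * ratio

print(round(strength_from_prompt_volume(0.35), 2))  # 0.48
```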
In some embodiments, the information of the first relative direction comprises: a first included angle between the first straight line and a first specific direction of the virtual character; wherein, the first straight line is a connecting line between the occurrence position and the virtual character position;
the information of the second relative direction includes: a second included angle between the second straight line and a second specific direction of the touch point; the second straight line is a connecting line between the position of the tactile force feedback and the touch point;
the first specific direction is the same as the second specific direction;
the first included angle is equal to the second included angle.
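Illustratively, the included angles can be computed as follows (choosing screen-up as the shared specific direction and a y-up coordinate convention are assumptions for the example):

```python
import math

def angle_to_reference(from_point, to_point, reference_deg=90.0):
    """Angle, in degrees, between the line from_point -> to_point and a
    reference direction (default 90 degrees, i.e. straight up)."""
    line_deg = math.degrees(math.atan2(to_point[1] - from_point[1],
                                       to_point[0] - from_point[0]))
    return (line_deg - reference_deg) % 360.0

# First included angle: virtual character position -> occurrence position.
first_angle = angle_to_reference((0, 0), (3, 3))
# The feedback position is chosen so that the second included angle
# (touch point -> tactile force feedback position) equals the first.
second_angle = first_angle
print(first_angle, second_angle)  # 315.0 315.0
```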
In some embodiments, the output module 1003 is specifically configured to:
determining the intersection point between the ray along the second relative direction from the touch point and the preset range boundary of the touch point as a force application point;
a cue of tactile force feedback is output at the force application point.
The in-game information prompting apparatus provided by the embodiments of the present application has the same technical features as the in-game information prompting method provided by the foregoing embodiments, and therefore can solve the same technical problems and achieve the same technical effects.
Corresponding to the information prompting method in the game, an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores computer-executable instructions, and when the computer-executable instructions are called and executed by a processor, the computer-executable instructions cause the processor to execute the steps of the information prompting method in the game.
The information prompting device provided by the embodiments of the present application may be specific hardware on the device, or software or firmware installed on the device, and the like. The device provided by the embodiments of the present application has the same implementation principle and technical effects as the foregoing method embodiments; for the sake of brevity, any aspect not mentioned in the device embodiments may refer to the corresponding contents in the foregoing method embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the foregoing method embodiments, and are not described here again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
For another example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments provided in the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or a part of the technical solution may be essentially implemented in the form of a software product, which is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the information prompting method in the game according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus once an item is defined in one figure, it need not be further defined and explained in subsequent figures, and moreover, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present application, used to illustrate the technical solutions of the present application rather than to limit them, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art can still modify the technical solutions described in the foregoing embodiments, or readily conceive of changes, or make equivalent substitutions for some technical features, within the technical scope disclosed in the present application; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the embodiments of the present application, and are all intended to be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. An information prompting method in a game is characterized in that a terminal device provides a graphical user interface, and the graphical user interface at least comprises a virtual character which is in a game scene of the game and is controlled by a first terminal device, and the method comprises the following steps:
in response to a specified game event occurring in the game, determining a first relative direction of a corresponding occurrence position of the specified game event in a game scene relative to the position of the virtual character;
acquiring a touch point of touch operation aiming at the graphical user interface, and determining a second relative direction relative to the touch point in the graphical user interface according to the first relative direction; wherein the first relative direction and the second relative direction are the same;
and outputting a prompt of tactile force feedback within a preset range of the touch point based on the second relative direction.
2. The method of claim 1, wherein the graphical user interface includes a touch control for controlling the virtual character; the step of obtaining a touch point for touch operation of the graphical user interface and determining a second relative direction in the graphical user interface relative to the touch point according to the first relative direction includes:
and acquiring a touch point of touch operation aiming at the touch control, and determining a second relative direction relative to the touch point in the graphical user interface according to the first relative direction.
3. The method of claim 2, wherein the touch control comprises any one or more of:
a movement control, an attack control, a state control, an adjustment control, and a skill control.
4. The method of claim 1, further comprising:
in response to the occurrence of the specified game event in the game, determining a target event type for the specified game event; each event type corresponds to a force feedback mode;
determining a target force feedback mode corresponding to the target event type;
and controlling the tactile force feedback mode of the prompt according to the target force feedback mode.
5. The method of claim 4, wherein the force feedback mode comprises any one or more of:
force feedback cadence, force feedback duration, force feedback interval duration, and force feedback frequency.
6. The method of claim 1, further comprising:
determining a level of importance of the specified game event in response to the specified game event occurring in the game;
and controlling the strength of the touch force feedback of the prompt according to the force feedback strength corresponding to the importance degree.
7. The method of claim 6, wherein the importance level comprises any one or more of:
the distance between the occurrence position and the virtual character position, the preset grade of the specified game event in the game and the occurrence severity of the specified game event.
8. The method of claim 1, wherein the specified game event comprises any one or more of:
the game event to be prompted by sound, the game event to be prompted by images and the game event to be prompted by vibration.
9. The method of claim 1, wherein the specified game event comprises a game event to be audibly prompted; the method further comprises the following steps:
in response to a game event to be prompted by sound occurring in the game, determining a target prompting sound type of the sound prompt to be made; each prompting sound type corresponds to a force feedback mode;
determining a target force feedback mode corresponding to the target prompt sound type;
and controlling the tactile force feedback mode of the prompt according to the target force feedback mode.
10. The method of claim 1, wherein the specified game event comprises a game event to be audibly prompted; the method further comprises the following steps:
in response to a game event to be prompted by sound occurring in the game, determining the prompting sound volume of the sound prompt to be made;
and controlling the strength of the tactile force feedback of the prompt according to the force feedback strength corresponding to the prompting sound volume.
11. The method of claim 1, wherein the information of the first relative direction comprises: a first included angle between a first straight line and a first specific direction of the virtual character; wherein the first straight line is a connection line between the occurrence position and the virtual character position;
the information of the second relative direction includes: a second included angle between a second straight line and a second specific direction of the touch point; the second straight line is a connecting line between the position of the tactile force feedback and the touch point;
the first specific direction is the same as the second specific direction;
the first included angle is equal to the second included angle.
12. The method of claim 1, wherein the step of outputting the indication of the haptic force feedback within the preset range of touch points based on the second relative direction comprises:
determining an intersection point between a ray from the touch point along the second relative direction and a preset range boundary of the touch point as a force application point;
outputting a haptic force feedback cue at the force application point.
13. An information prompting device in a game, characterized in that a terminal device provides a graphical user interface, the graphical user interface at least comprises a virtual character which is in a game scene of the game and is controlled by a first terminal device, the device comprises:
the first determination module is used for responding to the occurrence of a specified game event in the game, and determining a first relative direction of the corresponding occurrence position of the specified game event in the game scene relative to the position of the virtual character;
a second determining module, configured to obtain a touch point for a touch operation of the graphical user interface, and determine, according to the first relative direction, a second relative direction in the graphical user interface relative to the touch point; wherein the first relative direction and the second relative direction are the same;
and the output module is used for outputting a prompt of tactile force feedback within the preset range of the touch point based on the second relative direction.
14. A touch terminal, comprising a memory and a processor, wherein the memory stores a computer program operable on the processor, and wherein the processor implements the steps of the method according to any one of claims 1 to 12 when executing the computer program.
15. A computer readable storage medium having stored thereon computer executable instructions which, when invoked and executed by a processor, cause the processor to execute the method of any of claims 1 to 12.
CN202111021651.5A 2021-09-01 2021-09-01 Information prompting method and device in game and touch terminal Pending CN113713386A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111021651.5A CN113713386A (en) 2021-09-01 2021-09-01 Information prompting method and device in game and touch terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111021651.5A CN113713386A (en) 2021-09-01 2021-09-01 Information prompting method and device in game and touch terminal

Publications (1)

Publication Number Publication Date
CN113713386A true CN113713386A (en) 2021-11-30

Family

ID=78680686

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111021651.5A Pending CN113713386A (en) 2021-09-01 2021-09-01 Information prompting method and device in game and touch terminal

Country Status (1)

Country Link
CN (1) CN113713386A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107899241A (en) * 2017-11-22 2018-04-13 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
US20200353361A1 (en) * 2018-05-29 2020-11-12 Tencent Technology (Shenzhen) Company Limited Positioning information prompting method and apparatus, storage medium, and electronic device
CN109745702A (en) * 2018-12-28 2019-05-14 北京金山安全软件有限公司 Information prompting method and device
CN110740214A (en) * 2019-10-22 2020-01-31 维沃移动通信(杭州)有限公司 prompting method, terminal and computer readable storage medium
CN112245912A (en) * 2020-11-11 2021-01-22 腾讯科技(深圳)有限公司 Sound prompting method, device, equipment and storage medium in virtual scene
CN113181632A (en) * 2021-04-22 2021-07-30 网易(杭州)网络有限公司 Information prompting method and device, storage medium and computer equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114860148A (en) * 2022-04-19 2022-08-05 北京字跳网络技术有限公司 Interaction method, interaction device, computer equipment and storage medium
CN114860148B (en) * 2022-04-19 2024-01-16 北京字跳网络技术有限公司 Interaction method, device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109529319B (en) Display method and device of interface control and storage medium
US10821360B2 (en) Data processing method and mobile terminal
CN111035918B (en) Reconnaissance interface display method and device based on virtual environment and readable storage medium
CN111399639B (en) Method, device and equipment for controlling motion state in virtual environment and readable medium
US11270087B2 (en) Object scanning method based on mobile terminal and mobile terminal
CN113398590B (en) Sound processing method, device, computer equipment and storage medium
CN112044067A (en) Interface display method, device, equipment and storage medium
CN108153475B (en) Object position switching method and mobile terminal
JP5687380B1 (en) Terminal device
CN113713386A (en) Information prompting method and device in game and touch terminal
US20210316210A1 (en) Game system, processing method, and information storage medium
CN112704876B (en) Method, device and equipment for selecting virtual object interaction mode and storage medium
CN110841288B (en) Prompt identifier eliminating method, device, terminal and storage medium
CN116531754A (en) Method and device for controlling virtual characters in game and electronic terminal
CN111558226A (en) Method, device, equipment and storage medium for detecting abnormal operation behaviors
CN115105831A (en) Virtual object switching method and device, storage medium and electronic device
CN113318429B (en) Control method and device for exiting game, processor and electronic device
CN111338487B (en) Feature switching method and device in virtual environment, terminal and readable storage medium
CN115089959A (en) Direction prompting method and device in game and electronic terminal
CN113680062A (en) Information viewing method and device in game
CN112316423B (en) Method, device, equipment and medium for displaying state change of virtual object
JP2016024803A (en) Terminal device
WO2020179666A1 (en) Information processing program, information processing method, information processing device, and information processing system
JP7408685B2 (en) Methods, devices, equipment and storage media for adjusting the position of controls in application programs
CN113209609B (en) Interaction method, device, equipment and medium based on card objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination