CN113559509A - Information prompting method and device in game, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113559509A
CN113559509A (application CN202110870953.3A)
Authority
CN
China
Prior art keywords
relative position
game
target icon
relative
tamed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110870953.3A
Other languages
Chinese (zh)
Other versions
CN113559509B (en)
Inventor
马怡梦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110870953.3A priority Critical patent/CN113559509B/en
Publication of CN113559509A publication Critical patent/CN113559509A/en
Application granted granted Critical
Publication of CN113559509B publication Critical patent/CN113559509B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras

Abstract

The application provides an information prompting method and apparatus in a game, an electronic device and a computer-readable storage medium. The method comprises: in response to a taming trigger instruction, controlling a game character to ride an object to be tamed in a game scene, controlling the object to be tamed to move according to preset motion logic, and determining a first relative position between the object to be tamed and the game character during the motion of the object to be tamed; determining a deformation mode of a preset positional relationship component according to the first relative position; determining a second relative position between a target icon corresponding to the object to be tamed and a safe-area identifier according to the first relative position; and displaying the positional relationship component on the graphical user interface in the deformation mode, and displaying the target icon according to the second relative position. By deforming the positional relationship component, the scheme creates a sense of depth and achieves a pseudo-3D visual effect, so that the taming process is presented intuitively and its sense of immersion is enhanced.

Description

Information prompting method and device in game, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an information prompting method and apparatus in a game, an electronic device, and a computer-readable storage medium.
Background
In large games, a player may control a game character to find and tame a wild animal (e.g., a horse, cow, tiger, or pig) as a mount in certain game scenes (e.g., the wilderness). In existing schemes, the player controls the game character to gradually approach the object to be tamed in the game scene and repeatedly taps an icon on the graphical user interface to pacify it, thereby accomplishing the taming. Fig. 1 is a schematic diagram of a graphical user interface for a taming task in the related art. As shown in fig. 1, while the player controls the game character at the lower left corner to approach the horse to be tamed, the player repeatedly taps the icon at the lower right corner to keep the horse from being startled, thereby completing the taming. However, the realism of this taming process is too low to intuitively convey the intensity of the struggle against the animal that occurs in real animal taming.
Disclosure of Invention
An object of the embodiments of the present application is to provide an information prompting method and apparatus in a game, an electronic device, and a computer-readable storage medium, which are used to intuitively display the taming process of an object to be tamed.
In one aspect, the present application provides an information prompting method in a game, where the game includes a game scene, a game character and an object to be tamed, and the method includes:
in response to a taming trigger instruction, controlling the game character to ride the object to be tamed, and controlling the object to be tamed to move according to preset motion logic;
determining a first relative position between the object to be tamed and the game character during the motion of the object to be tamed;
determining a deformation mode of a preset positional relationship component according to the first relative position, wherein the positional relationship component comprises a safe-area identifier;
determining a second relative position between a target icon corresponding to the object to be tamed and the safe-area identifier according to the first relative position;
and displaying the positional relationship component on a graphical user interface in the deformation mode, and displaying the target icon according to the second relative position.
In an embodiment, the positional relationship component includes a closed figure and a safe-area identifier, and the safe-area identifier is located inside the closed figure.
In an embodiment, the closed figure is a circular color block, and the safe-area identifier is located at the center of the circular color block.
In an embodiment, after displaying the positional relationship component in the deformation mode and displaying the target icon according to the second relative position, the method further includes:
in response to a somatosensory control instruction, adjusting the motion of the object to be tamed based on the tilt direction and tilt angle in the somatosensory control instruction;
determining a first relative position between the adjusted object to be tamed and the game character;
and returning to the step of determining the deformation mode of the preset positional relationship component according to the first relative position.
In an embodiment, adjusting the motion of the object to be tamed based on the tilt direction and tilt angle in the somatosensory control instruction includes:
converting the tilt angle into a specified motion amplitude of the object to be tamed;
and moving the object to be tamed in the tilt direction until its motion amplitude reaches the specified motion amplitude.
In an embodiment, the first relative position includes a relative distance and a relative direction between a designated part of the object to be tamed and a center position of the game character;
and determining the deformation mode of the preset positional relationship component according to the first relative position includes:
converting the relative distance into a deformation tilt angle of the positional relationship component;
and projecting, onto the graphical user interface, the image of the positional relationship component tilted in the relative direction by the deformation tilt angle, and taking the projected image as the deformation mode.
In an embodiment, displaying the target icon according to the second relative position includes:
if the relative distance between the target icon and the safe-area identifier in the second relative position is zero, displaying the target icon in a first designated color corresponding to a tamed state.
In an embodiment, displaying the target icon according to the second relative position includes:
if the relative distance between the target icon and the safe-area identifier in the second relative position is not zero, displaying the target icon in a second designated color corresponding to an untamed state.
In an embodiment, the method further comprises:
if the relative distance between the target icon and the safe-area identifier in the second relative position is zero, outputting vibration prompt information corresponding to a state change when that relative distance changes.
In an embodiment, the method further comprises:
if the duration for which the relative distance between the target icon and the safe-area identifier in the second relative position remains zero reaches a preset duration threshold, outputting prompt information corresponding to a successful taming.
In an embodiment, the method further comprises:
starting a timing function when the game character rides on the object to be tamed;
and when a preset duration threshold is reached, outputting prompt information corresponding to a successful taming if the relative distance between the target icon and the safe-area identifier in the second relative position is zero.
In an embodiment, the method further comprises:
displaying, on the graphical user interface, the motion picture corresponding to the object to be tamed during the motion of the object to be tamed.
On the other hand, the present application also provides an information prompting apparatus in a game, the game including a game scene, a game character and an object to be tamed, the apparatus including:
a control module, configured to control the game character to ride the object to be tamed in response to a taming trigger instruction, and to control the object to be tamed to move according to preset motion logic;
a first determining module, configured to determine a first relative position between the object to be tamed and the game character during the motion of the object to be tamed;
a conversion module, configured to determine a deformation mode of a preset positional relationship component according to the first relative position, wherein the positional relationship component comprises a safe-area identifier;
a second determining module, configured to determine a second relative position between a target icon corresponding to the object to be tamed and the safe-area identifier according to the first relative position;
and a display module, configured to display the positional relationship component on a graphical user interface in the deformation mode, and to display the target icon according to the second relative position.
Further, the present application also provides an electronic device, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the information prompting method in the game.
In addition, the application also provides a computer readable storage medium, wherein the storage medium stores a computer program, and the computer program can be executed by a processor to complete the information prompting method in the game.
According to the above scheme, in response to a taming trigger instruction, the game character is controlled to ride the object to be tamed and the object to be tamed is controlled to move according to preset motion logic. During the motion of the object to be tamed, the deformation mode of the positional relationship component is determined according to the first relative position between the object to be tamed and the game character, and the second relative position between the target icon corresponding to the object to be tamed and the safe-area identifier in the positional relationship component is determined. The positional relationship component is then displayed on the graphical user interface in the deformation mode, and the target icon is displayed according to the second relative position. This creates a sense of depth, achieves a pseudo-3D visual effect, presents the taming process intuitively, and enhances the sense of immersion during taming.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required to be used in the embodiments of the present application will be briefly described below.
FIG. 1 is a schematic diagram of a graphical user interface for a taming task in the related art;
fig. 2 is a schematic view of an application scenario of an information prompting method in a game according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating an in-game information prompting method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a graphical user interface for a taming task provided by an embodiment of the present application;
FIGS. 6a-6b are schematic diagrams of a deformed positional relationship component provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of the tilting process of a positional relationship component according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a graphical user interface for a taming task provided by another embodiment of the present application;
fig. 9 is a block diagram of an information presentation device in a game according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Fig. 2 is a schematic view of an application scenario of an information prompting method in a game according to an embodiment of the present application. As shown in fig. 2, the application scenario includes a somatosensory recognition device 20 and a user terminal 30. The somatosensory recognition device 20 is configured to recognize a somatosensory control instruction and send it to the user terminal 30. The user terminal 30 may be a game console, mobile phone, tablet computer, or other device, and is configured to execute the corresponding in-game operation according to the somatosensory control instruction; it may also execute the in-game information prompting task. Furthermore, in an embodiment, the user terminal 30 itself may serve as the somatosensory recognition device.
As shown in fig. 3, the present embodiment provides an electronic device 1 including at least one processor 11 and a memory 12; one processor 11 is taken as an example in fig. 3. The processor 11 and the memory 12 are connected by a bus 10. The memory 12 stores instructions executable by the processor 11, and when the instructions are executed by the processor 11, the electronic device 1 can execute all or part of the flow of the method in the embodiments described below. In an embodiment, the electronic device 1 may be the user terminal 30 described above, configured to execute the in-game information prompting method.
The memory 12 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
The present application also provides a computer-readable storage medium storing a computer program executable by the processor 11 to perform the in-game information presentation method provided by the present application.
Referring to fig. 4, which illustrates a flow chart of an information prompting method in a game provided by an embodiment of the present application, the game includes a game scene, a game character and an object to be tamed. As shown in fig. 4, the method may include the following steps 410 to 450.
Step 410: in response to a taming trigger instruction, control the game character to ride on the object to be tamed, and control the object to be tamed to move according to preset motion logic.
The game character may be a virtual character in the game that is controlled by the player. The object to be tamed may be a rideable animal, such as a horse, cow, or tiger.
The taming trigger instruction triggers the taming task; it may be generated by the player tapping a taming control, or by a preset game event (for example, the game character entering a taming-task instance).
The motion logic may include motion patterns such as bucking back and forth, swaying side to side, running, jumping, and head shaking.
After detecting the taming trigger instruction, the user terminal may control the game character to ride on the object to be tamed and control the object to be tamed to move according to the motion logic, thereby representing the object's struggle to break free from the game character's control during taming.
Step 420: during the motion of the object to be tamed, determine a first relative position between the object to be tamed and the game character.
The first relative position may include the relative distance and relative direction of the object to be tamed in the game scene with respect to the game character. Since the game character rides on the object to be tamed during taming, in order to describe the movement tendency of the object to be tamed relative to the game character more accurately, in an embodiment the first relative position may include the relative distance and relative direction between a designated part of the object to be tamed and the center position of the game character. Here, the designated part of the object to be tamed may be its head, and the center position of the game character may be the position of the game character's head.
During the motion of the object to be tamed, the user terminal may determine the first relative position between the object to be tamed and the game character.
Step 430: determine a deformation mode of a preset positional relationship component according to the first relative position, wherein the positional relationship component comprises a safe-area identifier.
Step 440: determine a second relative position between a target icon corresponding to the object to be tamed and the safe-area identifier according to the first relative position.
The positional relationship component indicates the positional relationship between the game character and the object to be tamed. It may include a safe-area identifier that maps a safe area in the game scene. When the object to be tamed is in the tamed state, it is within the safe area in the game scene, and its corresponding target icon is located within the safe-area identifier. In an embodiment, the positional relationship component may include a closed figure and a safe-area identifier; the safe-area identifier may be located inside the closed figure, and the closed figure may be a circle, a polygon, an ellipse, or the like.
Referring to fig. 5, a schematic diagram of a graphical user interface for a taming task provided in an embodiment of the present application: as shown in fig. 5, the circular component at the top of the interface is the positional relationship component, the gray area within it is the safe-area identifier, and the icon inside the safe-area identifier is the target icon. Here the target icon lies within the safe-area identifier, indicating that the object to be tamed is in the tamed state.
The user terminal may determine the deformation mode of the positional relationship component based on the first relative position, so that the deformation describes the first relative position of the object to be tamed relative to the game character during the motion.
Within the positional relationship component, the second relative position between the target icon and the safe-area identifier may be determined from the above first relative position in the game scene.
Illustratively, if the object to be tamed is calm and motionless, the relative distance between its designated part and the center position of the game character can be determined and denoted d; if the object to be tamed struggles violently, the maximum relative distance between its designated part and the center position of the game character can be determined and denoted D. A circular safe area in the game scene can then be defined with the center position as its center and the distance d as its radius, and the maximum motion area of the object to be tamed relative to the game character can be defined with the center position as its center and the distance D as its radius.
If the positional relationship component is circular, its center corresponds to the center position of the game character, and the radius r of the safe area within the component can be determined from the component radius R by scaling the distances d and D in equal proportion. For example, once the component radius R and the distances d and D are known, d can be scaled by the ratio of R to D to obtain the radius r.
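The equal-proportion scaling above reduces to a single ratio. A sketch under the assumption of a circular component (the function and parameter names are illustrative):

```python
def safe_area_radius(component_radius_R, max_distance_D, calm_distance_d):
    """Map the calm-state distance d to the component's safe-area radius r
    using the ratio that sends the maximum struggle distance D onto the
    component radius R, i.e. r = d * R / D."""
    if max_distance_D <= 0:
        raise ValueError("maximum struggle distance D must be positive")
    return calm_distance_d * component_radius_R / max_distance_D
```

For example, with R = 100, D = 4 and d = 1, the safe-area radius comes out as r = 25, i.e. the safe area occupies a quarter of the component's radius.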
After determining the first relative position, the user terminal may determine the distance between the target icon and the center of the positional relationship component according to the relative distance in the first relative position. If that distance is greater than the radius of the safe area, the target icon lies outside the safe area; otherwise, it lies inside. Further, the relative direction in the first relative position is the same as the relative direction of the target icon, in the second relative position, with respect to the center of the safe area. Illustratively, if the designated part of the object to be tamed is its head and the center position of the game character is the position of the character's head, then when the object to be tamed lurches to the character's front left, the target icon in the positional relationship component should be located to the upper left of the component's center.
Once the distance between the target icon and the center of the positional relationship component, and the relative direction of the target icon with respect to that center, have been determined, the second relative position between the target icon and the safe area can be determined.
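Putting the above together, the icon's second relative position follows from the first relative position by the same scaling. A hedged sketch (the names, and the clamping of the icon to the component's rim, are assumptions for illustration):

```python
import math

def icon_offset(first_distance, first_direction, component_radius_R, max_distance_D):
    """Offset of the target icon from the component's center: the scene
    distance is scaled by R / D so the maximum struggle distance D lands
    on the rim, and the relative direction is kept unchanged."""
    scaled = min(first_distance, max_distance_D) * component_radius_R / max_distance_D
    return (first_direction[0] * scaled, first_direction[1] * scaled)

def inside_safe_area(offset, safe_radius_r):
    """The icon counts as inside the safe area when its offset from the
    component's center does not exceed the safe-area radius r."""
    return math.hypot(offset[0], offset[1]) <= safe_radius_r
```

Because the direction is carried over unchanged, a lurch to the character's front left places the icon to the upper left of the component's center, as in the example above.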
Step 450: display the positional relationship component on the graphical user interface in the deformation mode, and display the target icon according to the second relative position.
After determining the deformation mode of the positional relationship component and the second relative position between the target icon and the safe area, the user terminal may display the positional relationship component according to the deformation mode and display the target icon on the positional relationship component at the second relative position.
Through the above measures, after the game character rides on the object to be tamed in the game scene, the deformation mode of the positional relationship component and the second relative position between the target icon corresponding to the object to be tamed and the safe area are determined according to the first relative position between the object to be tamed and the game character as the object struggles. Displaying the positional relationship component in the deformation mode achieves a pseudo-3D presentation effect and intuitively shows the intensity of the confrontation with the object to be tamed during taming, while displaying the target icon at the second relative position shows the goal of the taming task more clearly.
In an embodiment, the closed figure in the positional relationship component is a circular color block, the safe-area identifier is located at the center of the circular color block, and the safe-area identifier and the closed figure may be concentric circles.
Referring to figs. 6a-6b, schematic diagrams of the deformed positional relationship component according to an embodiment of the present application: as shown in fig. 6a, when the designated part of the object to be tamed is to the front left of the game character's center position, the component is compressed from its upper left toward its lower right, presenting a pseudo-3D visual effect of tilting toward the upper left; as shown in fig. 6b, when the designated part is to the front right of the center position, the component is compressed from its upper right toward its lower left, presenting a pseudo-3D visual effect of tilting toward the upper right.
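One plausible way to obtain this compressed look, assuming the component is rendered as a tilted circle projected onto the screen (the application does not fix the exact projection), is to foreshorten the axis along the tilt direction by the cosine of the deformation tilt angle:

```python
import math

def deformed_axes(component_radius_R, tilt_deg):
    """Screen-space semi-axes of a circle of radius R tilted by tilt_deg:
    the axis along the tilt direction shrinks to R * cos(tilt), while the
    perpendicular axis keeps its full length R, yielding an ellipse."""
    foreshortened = component_radius_R * math.cos(math.radians(tilt_deg))
    return component_radius_R, foreshortened
```

A tilt of 60 degrees halves the foreshortened axis, which reads on screen as a strongly tilted, pseudo-3D disc; at zero tilt the component stays a full circle.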
In an embodiment, after the user terminal displays the positional relationship component and the target icon, it can acquire a somatosensory control instruction. The somatosensory control instruction may be generated by a somatosensory recognition device connected to the user terminal, or by the user terminal itself, and may include a tilt direction and a tilt angle for the positional relationship component. In an embodiment, the instruction may be generated when the player tilts the user terminal, in which case the tilt direction and tilt angle are those of the user terminal during the tilt operation; the user terminal can obtain them from its local gyroscope and construct the somatosensory control instruction.
In response to the somatosensory control instruction, the user terminal can adjust the motion of the object to be tamed based on the tilt direction and tilt angle in the instruction. In the ideal case, the tilt direction is exactly opposite to the relative direction in the first relative position, and the tilt angle can be converted into the amplitude of the object's motion. After the user terminal moves the designated part of the object to be tamed according to the tilt direction and tilt angle, the motion state of the object to be tamed is adjusted so as to counteract its original motion tendency, presenting the taming process with high realism.
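The counteracting adjustment can be sketched as below; the linear gain from tilt angle to motion amplitude is an assumed tuning constant, not something the application specifies:

```python
def adjust_struggle(part_offset, tilt_direction, tilt_angle_deg, gain=0.1):
    """Move the designated part of the object to be tamed along the device's
    tilt direction (a unit vector) by an amplitude converted from the tilt
    angle. When the tilt opposes the struggle, the offset from the
    character's center shrinks, pulling the icon back toward the safe area."""
    amplitude = tilt_angle_deg * gain  # scene units per degree (assumed)
    return (part_offset[0] + tilt_direction[0] * amplitude,
            part_offset[1] + tilt_direction[1] * amplitude)
```

For instance, a part that has lurched 2 units to the right is pulled back to 1 unit by a 10-degree tilt to the left under this gain.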
After the object to be tamed is adjusted, the user terminal may determine the first relative position between the adjusted object and the game character, i.e., re-determine the first relative position between the adjusted designated part of the object to be tamed and the center position of the game character.
After determining the first relative position, the user terminal may return to step 430 and re-perform steps 430 through 450.
Through these measures, after the relative position relation between the object to be tamed and the game character is displayed via the deformed position relation component and the target icon, the pseudo-3D presentation of the deformed component intuitively guides the somatosensory operation; the somatosensory control instruction generated by that operation is then acquired, and the motion of the object to be tamed is adjusted accordingly, improving the playability and immersion of the taming task.
In an embodiment, when the user terminal adjusts the motion of the object to be tamed based on the tilt direction and tilt angle in the somatosensory control instruction, it may convert the tilt angle into a designated motion amplitude of the object to be tamed. Here, the designated motion amplitude is the movement distance or movement angle through which the designated part of the object moves in response to the instruction. The movement distance may represent the distance that the designated part of the object moves in the horizontal direction relative to the center position of the game character. The movement angle may be the angle swept in the vertical direction by the line connecting the designated part of the object and the center position of the game character as the designated part moves.
Referring to fig. 7, which is a schematic diagram of the tilting process of the position relation component provided in an embodiment of the present application: as shown in fig. 7, the target icon is located at the upper left of the safety zone identifier in the component, and the component presents a pseudo-3D effect tilted in the direction indicated by the solid arrow. The tilt direction in the somatosensory control instruction, indicated by the dashed arrow in fig. 7, is opposite to the component's current tilt direction as indicated by the pseudo-3D effect. The tilt angle in the instruction is the angle through which the plane of the component tilts in that direction. In fig. 7, the "initial plane" is the plane of the component before adjustment based on the somatosensory control instruction, the "inclined rear plane" is its plane after adjustment, and the two planes form the tilt angle.
The maximum tilt angle of the position relation component may be preconfigured; for example, it may be 30 degrees. In an embodiment, if the somatosensory control instruction is generated when the player tilts the user terminal, and the tilt angle of the terminal exceeds the maximum tilt angle allowed by the component, the tilt angle in the somatosensory control instruction may be taken as that maximum tilt angle.
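The clamping behavior described above can be sketched as follows; the function name and the 30-degree constant are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch: clamp the device tilt angle reported by the gyroscope
# to the maximum tilt angle allowed for the position relation component.
MAX_COMPONENT_TILT_DEG = 30.0  # preconfigured maximum, as in the example above

def effective_tilt_angle(device_tilt_deg: float) -> float:
    """Return the tilt angle actually applied to the component."""
    return min(abs(device_tilt_deg), MAX_COMPONENT_TILT_DEG)

print(effective_tilt_angle(45.0))  # a 45-degree device tilt is treated as 30.0
```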
In the horizontal direction, the designated part of the object to be tamed lies, in the game scene, outside a circular safe area centered on the center position of the game character with radius d, and shakes within a maximum movement area centered on the same position with radius D; within this range the position relation component can deform. The maximum movement distance of the designated part is therefore D - d, and once the maximum tilt angle Q of the component has been determined, (D - d)/Q can be taken as the movement distance per degree of tilt, i.e., the conversion ratio between movement distance and tilt angle.
In the vertical direction, moving the designated part of the object to be tamed causes the line connecting the designated part and the center position of the game character to sweep through a movement angle. The maximum movement angle may be preset to q, and once the maximum tilt angle Q of the component has been determined, q/Q can be taken as the movement angle per degree of tilt, i.e., the conversion ratio between movement angle and tilt angle.
The user terminal may convert the tilt angle into the designated motion amplitude of the object to be tamed according to the conversion ratio, and then move the object in the tilt direction of the somatosensory control instruction until the motion amplitude reaches the designated value. On one hand, if the object moves in the horizontal direction, the designated motion amplitude is a movement distance: the user terminal converts the tilt angle into a designated movement distance and controls the object to move in the tilt direction until that distance is reached. On the other hand, if the object moves in the vertical direction, the designated motion amplitude is a movement angle: the user terminal converts the tilt angle into a designated movement angle and controls the object to swing in the vertical plane of the tilt direction until that angle is reached.
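The horizontal and vertical conversions above can be sketched as follows; D, d, Q, and q follow the notation of the preceding paragraphs, while the function names and numeric values are illustrative assumptions:

```python
# Hypothetical sketch of the two conversion ratios described above.
# D: radius of the maximum movement area; d: radius of the safe area;
# Q: maximum tilt angle of the component; q: maximum movement angle.

def horizontal_distance(tilt_deg: float, D: float, d: float, Q: float) -> float:
    """Convert a tilt angle into a horizontal movement distance via (D - d) / Q."""
    return tilt_deg * (D - d) / Q

def vertical_angle(tilt_deg: float, q: float, Q: float) -> float:
    """Convert a tilt angle into a vertical movement angle via q / Q."""
    return tilt_deg * q / Q

# A 15-degree tilt with D=10, d=4, Q=30 moves the designated part 3 units.
print(horizontal_distance(15.0, D=10.0, d=4.0, Q=30.0))  # 3.0
print(vertical_angle(15.0, q=20.0, Q=30.0))              # 10.0
```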
In an embodiment, when the user terminal determines the deformation mode of the position relation component according to the first relative position, it may convert the relative distance in the first relative position into a deformation tilt angle of the component, i.e., the tilt angle at which the component presents its pseudo-3D effect. As shown in fig. 7, the deformation tilt angle is the angle through which the component tilts in the direction indicated by the solid arrow.
The user terminal may convert the relative distance into the deformation tilt angle according to the conversion ratio between relative distance and tilt angle.
After determining the deformation tilt angle, the user terminal may determine the image that the position relation component projects onto the graphical user interface when the component tilts toward the relative direction in the first relative position by the deformation tilt angle, and use this projection image as the deformation mode of the component.
Through these measures, a pseudo-3D effect can be presented via the projection image, visually conveying the tilted state of the position relation component.
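Using the same (D - d)/Q notation as above, the conversion from relative distance to deformation tilt angle can be sketched as follows (the function name and numeric values are illustrative assumptions):

```python
# Hypothetical sketch: derive the component's deformation tilt angle from the
# relative distance in the first relative position, using the (D - d) / Q
# conversion ratio in reverse and capping at the maximum tilt angle Q.

def deformation_tilt(relative_distance: float, D: float, d: float, Q: float) -> float:
    """Map a relative distance in [0, D - d] to a tilt angle in [0, Q]."""
    ratio = (D - d) / Q  # movement distance per degree of tilt
    return min(relative_distance / ratio, Q)

print(deformation_tilt(3.0, D=10.0, d=4.0, Q=30.0))  # 15.0
```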
In one embodiment, when the user terminal displays the target icon according to the second relative position, if the relative distance between the target icon and the safety zone identifier in the second relative position is zero, the user terminal may display the target icon in a first designated color corresponding to the tamed state. The first designated color may be preconfigured; for example, it may be similar to the color of the safety zone identifier: if the identifier is green, the first designated color may be dark green.
When the target icon lies within the safety zone identifier, rendering it in the first designated color intuitively indicates that the object to be tamed is in the tamed state.
In one embodiment, when the user terminal displays the target icon according to the second relative position, if the relative distance between the target icon and the safety zone identifier in the second relative position is not zero, in other words, the target icon is located outside the safety zone identifier, the user terminal may display the target icon in a second designated color corresponding to the untamed state. The second designated color may be preconfigured; for example, it may be distinct from the color of the safety zone identifier: if the identifier is green, the second designated color may be red.
When the target icon lies outside the safety zone identifier, rendering it in the second designated color intuitively indicates that the object to be tamed is in the untamed state.
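The color selection described in the two embodiments above can be sketched as follows; the color values and names are illustrative assumptions, since the disclosure only requires the two designated colors to be preconfigured:

```python
# Hypothetical sketch: pick the target icon's color from the relative distance
# between the icon and the safety zone identifier in the second relative position.
TAMED_COLOR = "dark_green"  # first designated color (tamed state)
UNTAMED_COLOR = "red"       # second designated color (untamed state)

def icon_color(relative_distance: float) -> str:
    """Zero distance to the safety zone identifier means the tamed state."""
    return TAMED_COLOR if relative_distance == 0 else UNTAMED_COLOR

print(icon_color(0.0))  # dark_green
print(icon_color(1.5))  # red
```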
In one embodiment, if the relative distance between the target icon and the safety zone identifier in the second relative position is zero, then when that distance changes, vibration prompt information corresponding to the state change may be output. The vibration prompt information may be generated by vibrating the user terminal and indicates that the target icon has left the safety zone identifier, i.e., that the object to be tamed has changed from the tamed state to the untamed state.
By this measure, when the object to be tamed starts shaking and struggling in the game scene, the vibration prompt information intuitively conveys the intensity of the taming task, improving the immersion and realism of the game.
In an embodiment, during the taming process, if the duration for which the relative distance between the target icon and the safety zone identifier in the second relative position remains zero reaches a preset duration threshold, the user terminal may output prompt information indicating that the taming succeeded. In other words, when the target icon has stayed within the safety zone identifier for the threshold duration, the user terminal may prompt that the taming is successful. The duration threshold may be preconfigured; for example, when the target icon has remained within the safety zone identifier for 5 seconds, the user terminal may prompt success in the form of text, voice, a short video, and the like.
For example, during the taming process, if the tilt direction in the somatosensory control instruction is close enough to the opposite of the relative direction in the first relative position, and the movement distance converted from the tilt angle approximates the relative distance in the first relative position, the motion trend of the object to be tamed relative to the game character gradually flattens, and the tilt angle required by subsequent somatosensory control instructions gradually decreases. When the object no longer shakes, the relative distance between the target icon and the safety zone identifier in the second relative position becomes zero; if that distance remains zero for the duration threshold, the user terminal may determine that the taming succeeded and output the corresponding prompt information.
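The duration-based success check above can be sketched as follows; the class and method names, and the 5-second threshold, are illustrative assumptions:

```python
# Hypothetical sketch: the taming succeeds once the target icon has stayed
# inside the safety zone identifier (relative distance zero) for a preset
# duration, resetting whenever the icon leaves the safety zone.
DURATION_THRESHOLD = 5.0  # seconds, preconfigured

class TamingTimer:
    """Track how long the relative distance has continuously been zero."""
    def __init__(self):
        self.zero_since = None  # timestamp when the distance last became zero

    def update(self, now: float, relative_distance: float) -> bool:
        if relative_distance == 0:
            if self.zero_since is None:
                self.zero_since = now
            return now - self.zero_since >= DURATION_THRESHOLD
        self.zero_since = None  # icon left the safety zone; reset the timer
        return False

timer = TamingTimer()
print(timer.update(0.0, 0))  # False: just entered the safety zone
print(timer.update(5.0, 0))  # True: zero distance held for 5 seconds
```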
In one embodiment, during the taming process, the user terminal may start a timing function when the game character rides on the object to be tamed, and may, for example, present a timing control in the graphical user interface. When a preset timing duration threshold is reached, if the relative distance between the target icon and the safety zone identifier in the second relative position is zero, the user terminal may output prompt information indicating that the taming succeeded. The timing duration threshold may be the duration of the taming task; for example, after the game character rides on the object, the timing function starts, the task ends when the timer reaches 30 seconds, and if the target icon lies within the safety zone identifier at that moment, the user terminal may prompt success in the form of text, voice, a short video, and the like.
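The fixed-length variant above can be sketched in the same vein; the function name and the 30-second value are illustrative assumptions:

```python
# Hypothetical sketch of the alternative check described above: a fixed-length
# taming task that succeeds only if the target icon is inside the safety zone
# identifier (relative distance zero) when the timer expires.
TASK_DURATION = 30.0  # seconds the taming task lasts

def taming_result(elapsed: float, relative_distance: float) -> str:
    if elapsed < TASK_DURATION:
        return "in_progress"
    return "success" if relative_distance == 0 else "failure"

print(taming_result(10.0, 2.0))  # in_progress
print(taming_result(30.0, 0.0))  # success
```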
In one embodiment, after the game character rides on the object to be tamed, the user terminal may display a moving picture of the object on the graphical user interface while the object moves. Referring to fig. 8, a schematic diagram of a graphical user interface for the taming task according to another embodiment of the present application: as shown in fig. 8, the interface displays the deformed position relation component together with a moving picture of the object violently resisting the taming, which improves the immersion of the taming task.
Fig. 9 shows an in-game information prompting apparatus according to an embodiment of the present application, where the game includes a game scene, and a game character and an object to be tamed located in the game scene. As shown in fig. 9, the apparatus may include:
the control module 910 is configured to, in response to a taming trigger instruction, control a game character to ride on the object to be taminated, and control the object to be taminated to move according to a preset motion logic;
a first determining module 920, configured to determine a first relative position between the object to be disciplined and the game character during the motion of the object to be disciplined;
a conversion module 930, configured to determine a deformation mode of the preset position relationship component according to the first relative position; wherein the position relation component comprises a safety region identifier;
a second determining module 940, configured to determine, according to the first relative position, a second relative position between the target icon corresponding to the object to be tamed and the safety area identifier;
a display module 950, configured to display the position relation component on a graphical user interface in the deformation manner, and display the target icon according to the second relative position.
The implementation of the functions and actions of each module in the apparatus is described in the implementation of the corresponding steps of the in-game information prompting method and is not repeated here.
In the embodiments provided in the present application, the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.

Claims (15)

1. An information prompting method in a game, the game comprising a game scene and a game character and an object to be tamed located in the game scene, wherein the method comprises:
responding to a taming trigger instruction, controlling the game character to ride on the object to be tamed, and controlling the object to be tamed to move according to preset motion logic;
determining a first relative position between the object to be tamed and the game character during the movement of the object to be tamed;
determining a deformation mode of a preset position relation component according to the first relative position; wherein the position relation component comprises a safety zone identifier;
determining a second relative position between a target icon corresponding to the object to be tamed and the safety zone identifier according to the first relative position;
and displaying the position relation component on a graphical user interface in the deformation mode, and displaying the target icon according to the second relative position.
2. The method of claim 1, wherein the positional relationship component comprises a closed graph and a safety zone identifier, the safety zone identifier being located inside the closed graph.
3. The method of claim 2, wherein the closed figure is a circular patch and the safety zone indicator is located in the center of the circular patch.
4. The method of claim 1, wherein after said displaying the positional relationship component in the morphed manner and displaying the target icon in accordance with the second relative position, the method further comprises:
responding to a somatosensory control instruction, and adjusting the motion of the object to be tamed based on the tilt direction and the tilt angle in the somatosensory control instruction;
determining a first relative position between the adjusted object to be tamed and the game character;
and returning to the step of determining the deformation mode of the preset position relation component according to the first relative position.
5. The method of claim 4, wherein the adjusting the motion of the object to be tamed based on the tilt direction and the tilt angle in the somatosensory control instruction comprises:
converting the tilt angle into a designated motion amplitude of the object to be tamed;
and moving the object to be tamed in the tilt direction until the motion amplitude reaches the designated motion amplitude.
6. The method according to claim 1, wherein the first relative position comprises a relative distance and a relative direction between a designated part of the object to be tamed and a center position of the game character;
the method for determining the deformation mode of the preset position relation component according to the first relative position comprises the following steps:
converting the deformation inclination angle of the position relation component according to the relative distance;
and when the position relation component is determined to incline towards the relative direction to reach the deformation inclination angle, projecting an image on the graphical user interface, and taking the projected image as the deformation mode.
7. The method of claim 1, wherein said presenting the target icon in accordance with the second relative position comprises:
if the relative distance between the target icon and the safety zone identifier in the second relative position is zero, the target icon is shown in a first designated color corresponding to a tamed state.
8. The method of claim 1, wherein said presenting the target icon in accordance with the second relative position comprises:
if the relative distance between the target icon and the safety zone identifier in the second relative position is not zero, the target icon is shown in a second designated color corresponding to an untamed state.
9. The method of claim 1, further comprising:
and if the relative distance between the target icon and the safety zone identifier in the second relative position is zero, outputting vibration prompt information corresponding to the state change when the relative distance between the target icon and the safety zone identifier changes.
10. The method of claim 1, further comprising:
and if the duration for which the relative distance between the target icon and the safety zone identifier in the second relative position is zero reaches a preset duration threshold, outputting prompt information corresponding to successful taming.
11. The method of claim 1, further comprising:
starting a timing function when the game character rides on the object to be tamed;
and when a preset timing duration threshold is reached, outputting prompt information corresponding to successful taming if the relative distance between the target icon and the safety zone identifier in the second relative position is zero.
12. The method of claim 1, further comprising:
and displaying the moving picture corresponding to the object to be tamed on the graphical user interface in the moving process of the object to be tamed.
13. An information prompting apparatus in a game, the game comprising a game scene and a game character and an object to be tamed located in the game scene, the apparatus comprising:
the control module, configured to, in response to a taming trigger instruction, control the game character to ride on the object to be tamed and control the object to be tamed to move according to preset motion logic;
a first determining module, configured to determine a first relative position between the object to be tamed and the game character during the motion of the object to be tamed;
a conversion module, configured to determine a deformation mode of a preset position relation component according to the first relative position; wherein the position relation component comprises a safety zone identifier;
a second determining module, configured to determine, according to the first relative position, a second relative position between a target icon corresponding to the object to be tamed and the safety zone identifier;
and a display module, configured to display the position relation component on a graphical user interface in the deformation mode, and display the target icon according to the second relative position.
14. An electronic device, characterized in that the electronic device comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the in-game information prompting method of any one of claims 1-12.
15. A computer-readable storage medium, wherein the storage medium stores a computer program executable by a processor to perform the in-game information prompting method according to any one of claims 1 to 12.
CN202110870953.3A 2021-07-30 2021-07-30 Information prompting method and device in game, electronic equipment and storage medium Active CN113559509B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110870953.3A CN113559509B (en) 2021-07-30 2021-07-30 Information prompting method and device in game, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113559509A true CN113559509A (en) 2021-10-29
CN113559509B CN113559509B (en) 2024-04-16

Family

ID=78169466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110870953.3A Active CN113559509B (en) 2021-07-30 2021-07-30 Information prompting method and device in game, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113559509B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050043097A1 (en) * 2003-08-21 2005-02-24 Spidermonk Entertainment, Llc Interrelated game and information portals provided within the context of an encompassing virtual world
CN1607022A (en) * 2003-10-14 2005-04-20 阿鲁策株式会社 Game system, gaming machine used in such game system, and gaming method
KR101304379B1 (en) * 2012-04-02 2013-09-11 최영동 Method and system for remote domestication of the animals oninternet
CN107694085A (en) * 2017-10-24 2018-02-16 网易(杭州)网络有限公司 The control method and device of game role and equipment, touch apparatus, storage medium
JP6418299B1 (en) * 2017-09-15 2018-11-07 株式会社セガゲームス Information processing apparatus and program
CN110694266A (en) * 2019-10-23 2020-01-17 网易(杭州)网络有限公司 Game state synchronization method, game state display method and game state display device
WO2020258225A1 (en) * 2019-06-28 2020-12-30 瑞声声学科技(深圳)有限公司 Gamepad and gamepad vibration method and apparatus
CN112561113A (en) * 2019-09-25 2021-03-26 华为技术有限公司 Dangerous scene early warning method and terminal equipment
CN112604289A (en) * 2020-12-16 2021-04-06 深圳中清龙图网络技术有限公司 Game map generation method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN113559509B (en) 2024-04-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant