CN114028803A - Feedback control method and device for NPC (game non-player character) and storage medium - Google Patents


Info

Publication number
CN114028803A
Authority
CN
China
Prior art keywords
touch
target
npc
interaction
progress value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111425724.7A
Other languages
Chinese (zh)
Inventor
方盛元
杨靖民
张威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202111425724.7A priority Critical patent/CN114028803A/en
Publication of CN114028803A publication Critical patent/CN114028803A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25: Output arrangements for video game devices
    • A63F13/28: Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285: Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a feedback control method and device for a game NPC (non-player character), a storage medium and computer equipment, wherein the method comprises the following steps: responding to a touch mode trigger signal of a three-dimensional game, and displaying a target game NPC at a preset touch shooting visual angle, wherein the target game NPC is provided with at least one interaction area, each interaction area is configured with a collision body, and the preset touch shooting visual angle is a first-person visual angle; determining a touch position of the touch body based on control operation data of the touch body, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body; and acquiring a target touch feedback behavior corresponding to the target collision body, and controlling the target game NPC to execute the target touch feedback behavior. The application helps improve the realism of the touch gameplay in games.

Description

Feedback control method and device for NPC (game non-player character) and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a feedback control method and apparatus for a game non-player character (NPC), a storage medium, and a computer device.
Background
MMORPGs (Massively Multiplayer Online Role-Playing Games) often deploy a large number of NPCs (non-player characters) to populate the game world, so as to provide players with a more real and interesting game experience and make the game world feel more alive. Interactions with NPCs in a game generally include item giving, ensemble performance, accompanying photography, two-person action interaction, and topic dialogue.
At present, some two-dimensional mobile games provide a touch (stroking) gameplay in which the NPC feeds back the stroking operations of the player character: based on the player's touch operations on the screen, a stroking action on the NPC is simulated, and the NPC is controlled to respond. However, this touch gameplay in two-dimensional games lacks realism and can hardly meet players' ever-increasing expectations of games.
Disclosure of Invention
In view of this, the present application provides a feedback control method and apparatus for a game non-player character NPC, a storage medium, and a computer device, which help improve the realism of the touch gameplay.
According to an aspect of the present application, there is provided a feedback control method of a game non-player character NPC, including:
responding to a touch mode trigger signal of a three-dimensional game, and displaying a target game NPC at a preset touch shooting visual angle, wherein the target game NPC is provided with at least one interaction area, each interaction area is configured with a collision body, and the preset touch shooting visual angle is a first-person visual angle;
determining a touch position of the touch body based on control operation data of the touch body, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body;
and acquiring a target touch feedback behavior corresponding to the target collision body, and controlling the target game NPC to execute the target touch feedback behavior.
Optionally, after the target game NPC is displayed at the preset touch shooting visual angle, the method further includes:
displaying at least one visual angle adjusting option, determining a first shooting visual angle corresponding to the selected visual angle adjusting option based on the selection operation on the visual angle adjusting option, and displaying the target game NPC at the first shooting visual angle, wherein the visual angle adjusting option comprises at least one distance adjusting option and/or at least one direction adjusting option; alternatively,
and determining a second shooting visual angle based on the free adjustment operation of the shooting visual angle, and displaying the target game NPC at the second shooting visual angle.
Optionally, the method further comprises:
and replacing a cursor in a display interface with a preset touch body icon in response to a touch mode trigger signal of the three-dimensional game, wherein the display position of the preset touch body icon is associated with the touch position of the touch body.
Optionally, the obtaining of the target touch feedback behavior corresponding to the target collision body specifically includes:
obtaining feedback behavior influence information, wherein the feedback behavior influence information comprises at least one of a stroking mode corresponding to the control operation data, intimacy between the target game NPC and a stroking character, a real-time game environment, a first real-time state of the target game NPC, a second real-time state of the stroking character and a real-time interaction progress value between the target game NPC and the stroking character;
and determining a target touch feedback behavior corresponding to the target collision body according to the feedback behavior influence information.
Optionally, after determining a target touch feedback behavior corresponding to the target collision volume according to the feedback behavior influence information, the method further includes:
and acquiring a progress value change amount corresponding to the target touch feedback behavior, and updating the real-time interaction progress value according to the progress value change amount.
Optionally, after the target game NPC is displayed at the preset touch shooting visual angle, the method further includes:
and displaying interaction state dynamic information between the target game NPC and the stroking character, wherein the interaction state dynamic information is dynamically changed based on a real-time interaction progress value between the target game NPC and the stroking character.
Optionally, the interaction state dynamic information includes an interaction state progress bar, the interaction state progress bar is attached with a level feature of an interaction state level corresponding to the real-time interaction progress value, and the level feature includes a color feature and/or a special effect feature.
Optionally, after the displaying of the interaction state dynamic information between the target game NPC and the stroking character, the method further includes:
calculating the real-time interaction progress value according to a preset progress value change rule, and dynamically updating the interaction state dynamic information, wherein the preset progress value change rule comprises a progress value descending speed corresponding to at least one interaction progress value interval, and each interaction progress value interval corresponds to a respective interaction state level;
accordingly, the method further comprises:
executing a first exit operation of exiting the touch mode when the real-time interaction progress value is lower than a preset lowest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the lowest interaction state level exceeds a preset first duration;
and when the real-time interaction progress value is higher than a preset highest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the highest interaction state level exceeds a preset second duration, executing a second exit operation of exiting the touch mode.
Optionally, the method further comprises:
updating the intimacy between the target game NPC and the stroking character based on the first exit operation and/or the second exit operation.
Optionally, the method further comprises:
and displaying stroking and touching guide information based on the real-time interaction progress value, wherein the stroking and touching guide information comprises a guide stroking and touching mode for improving the real-time interaction progress value.
According to another aspect of the present application, there is provided a feedback control apparatus of a game NPC, including:
the display module is used for responding to a touch mode trigger signal of the three-dimensional game and displaying a target game NPC at a preset touch shooting visual angle, wherein the target game NPC is provided with at least one interaction area, each interaction area is provided with a collision body, and the preset touch shooting visual angle is a first-person visual angle;
the collision detection module is used for determining the touch position of the touch body based on control operation data of the touch body, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body;
and the feedback control module is used for acquiring a target touch feedback behavior corresponding to the target collision body and controlling the target game NPC to execute the target touch feedback behavior.
Optionally, the apparatus further comprises:
the visual angle adjusting module is used for displaying at least one visual angle adjusting option after the target game NPC is displayed at a preset touch shooting visual angle, determining a first shooting visual angle corresponding to the selected visual angle adjusting option based on the selection operation of the visual angle adjusting option, and displaying the target game NPC at the first shooting visual angle, wherein the visual angle adjusting option comprises at least one distance adjusting option and/or at least one direction adjusting option; or, a second shooting angle of view is determined based on the free adjustment operation of the shooting angle of view, and the target game NPC is displayed at the second shooting angle of view.
Optionally, the display module is further configured to replace a cursor in a display interface with a preset touch body icon in response to a touch mode trigger signal of the three-dimensional game, where a display position of the preset touch body icon is associated with a touch position of the touch body.
Optionally, the feedback control module is specifically configured to:
obtaining feedback behavior influence information, wherein the feedback behavior influence information comprises at least one of a stroking mode corresponding to the control operation data, intimacy between the target game NPC and a stroking character, a real-time game environment, a first real-time state of the target game NPC, a second real-time state of the stroking character and a real-time interaction progress value between the target game NPC and the stroking character;
and determining a target touch feedback behavior corresponding to the target collision body according to the feedback behavior influence information.
Optionally, the apparatus further comprises:
and the calculation module is used for acquiring a progress value change amount corresponding to the target touch feedback behavior after determining the target touch feedback behavior corresponding to the target collision body according to the feedback behavior influence information, and updating the real-time interaction progress value according to the progress value change amount.
Optionally, the display module is further configured to display interaction state dynamic information between the target game NPC and the stroking character after the target game NPC is displayed at the preset touch shooting visual angle, where the interaction state dynamic information is dynamically changed based on a real-time interaction progress value between the target game NPC and the stroking character.
Optionally, the interaction state dynamic information includes an interaction state progress bar, the interaction state progress bar is attached with a level feature of an interaction state level corresponding to the real-time interaction progress value, and the level feature includes a color feature and/or a special effect feature.
Optionally, the computing module is further configured to:
after the interaction state dynamic information between the target game NPC and the stroking character is displayed, calculating the real-time interaction progress value according to a preset progress value change rule, and dynamically updating the interaction state dynamic information, wherein the preset progress value change rule comprises a progress value descending speed corresponding to at least one interaction progress value interval, and each interaction progress value interval corresponds to a respective interaction state level;
accordingly, the feedback control module is further configured to:
executing a first exit operation of exiting the touch mode when the real-time interaction progress value is lower than a preset lowest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the lowest interaction state level exceeds a preset first duration;
and when the real-time interaction progress value is higher than a preset highest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the highest interaction state level exceeds a preset second duration, executing a second exit operation of exiting the touch mode.
Optionally, the apparatus further comprises:
and the intimacy updating module is used for updating the intimacy between the target game NPC and the stroking character based on the first exit operation and/or the second exit operation.
Optionally, the apparatus further comprises:
and the stroking and touching guide module is used for displaying stroking and touching guide information based on the real-time interaction progress value, wherein the stroking and touching guide information comprises a guide stroking and touching mode for improving the real-time interaction progress value.
According to yet another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the feedback control method of the above-described game non-player character NPC.
According to yet another aspect of the present application, there is provided a computer apparatus comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, the processor implementing the feedback control method of the game non-player character NPC described above when executing the program.
By means of the above technical scheme, during the running of a three-dimensional game, the feedback control method and device for a game non-player character NPC, the storage medium and the computer equipment enter the touch mode of the game in response to a touch mode trigger signal, and the target game NPC is displayed, in the touch mode, at a preset first-person touch shooting visual angle. Each interaction area of the target game NPC is configured with a corresponding collision body; when, under the player's control operation, the touch body collides with a collision body of the target game NPC, a target touch feedback behavior corresponding to the collided target collision body is obtained, and the target game NPC is controlled to execute the target touch feedback behavior, so that the target game NPC gives feedback to the player's touch. The embodiment of the application fills the gap that existing three-dimensional games lack a touch gameplay. By configuring collision bodies in different interaction areas of the game NPC, when the player controls the touch body to simulate stroking the game NPC, the game NPC is controlled to give corresponding feedback depending on which position of the game NPC the player touches, which enriches the game playing methods, improves the realism of the game, and brings an immersive game experience to the player.
The foregoing description is only an overview of the technical solutions of the present application, and the present application can be implemented according to the content of the description in order to make the technical means of the present application more clearly understood, and the following detailed description of the present application is given in order to make the above and other objects, features, and advantages of the present application more clearly understandable.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow chart illustrating a feedback control method for a NPC (non-player character) in a game according to an embodiment of the present application;
FIG. 2 is a flow chart illustrating another method for controlling feedback of an NPC (non-player character) provided by an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating a display effect of a game NPC provided by an embodiment of the present application;
fig. 4 is a schematic structural diagram of a feedback control device of another game NPC provided in the embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In the present embodiment, a feedback control method of an NPC of a non-player character of a game is provided, as shown in fig. 1, the method including:
step 101, responding to a touch mode trigger signal of a three-dimensional game, and displaying a target game NPC at a preset touch shooting visual angle, wherein the target game NPC is provided with at least one interaction area, each interaction area is configured with a collision body, and the preset touch shooting visual angle is a first person visual angle.
The embodiment of the application can be applied to three-dimensional games and provides a touch gameplay for a three-dimensional game NPC. Specifically, the game can enter the touch mode through an active selection operation by the player, or when the player character moves into the touch mode trigger range of the target NPC.
During the running of the three-dimensional game, in response to a touch mode trigger signal in the game, the touch mode of the target game NPC is entered, and the shooting visual angle of the game virtual camera is switched to a preset touch shooting visual angle, so that the virtual camera shoots the target game NPC at that angle and the game display interface shows the game picture, shot by the virtual camera, containing the target game NPC, wherein the preset touch shooting visual angle is a first-person visual angle. In this way, the player looks at the target game NPC from the first-person perspective in the game picture, and the virtual camera can shoot from a position closer to the target game NPC. The target game NPC providing the touch gameplay is configured with a plurality of interaction areas in advance. For example, if the target game NPC is a certain character in the game, the interaction areas may include the head, shoulders, back, left arm, right arm, left hand, right hand and other parts of the character. Each interaction area is configured with a respective collision body, which may be a bounding box wrapping that interaction area on the target game NPC model; for example, the collision body corresponding to the head is a bounding box containing the head model. Collision detection on an interaction area is realized through its collision body: if a collision with a certain collision body is detected, the interaction area corresponding to that collision body has been touched. In addition, the fineness of the interaction area division may differ between game NPCs.
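The per-area collision bodies described here can be sketched as a mapping from interaction areas to axis-aligned bounding boxes. This is a minimal illustration only: the area names follow the character example above, while the box extents and class names are invented for the sketch, and a real engine would use its own collider primitives.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box standing in for a collision body."""
    min_corner: tuple  # (x, y, z)
    max_corner: tuple  # (x, y, z)

    def contains(self, point):
        # A point is inside the box when it lies within all three axis ranges.
        return all(lo <= p <= hi
                   for lo, p, hi in zip(self.min_corner, point, self.max_corner))

# One collision body per interaction area of the target game NPC.
# Extents are illustrative, not taken from the patent.
NPC_COLLIDERS = {
    "head":      AABB((-0.15, 1.55, -0.15), (0.15, 1.85, 0.15)),
    "shoulders": AABB((-0.30, 1.35, -0.15), (0.30, 1.55, 0.15)),
    "back":      AABB((-0.25, 0.90, -0.25), (0.25, 1.35, -0.10)),
    "left_hand": AABB((-0.45, 0.70, -0.05), (-0.30, 0.90, 0.05)),
}
```

A touch at a given position is then attributed to whichever area's box contains it, which is the correspondence between collision bodies and interaction areas that the detection step relies on.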
Step 102, determining the touch position of the touch body based on the control operation data of the touch body, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body.
The touch body is the medium through which the target game NPC is stroked in the touch gameplay. The touch body may be a hand model of the player character, simulating a hand stroking a virtual human, or a cat-teaser model, simulating a cat-teaser stick stroking a virtual cat; the specific form of the touch body is not limited in the embodiment of the application. The player may input control operation data through input devices such as a mouse and a keyboard to control the touch body in the game; specifically, the touch position of the touch body in the game can be changed through mouse movement, keyboard input and the like. Meanwhile, as the touch position of the touch body changes continuously, collision detection between the touch body and the collision bodies is performed in real time. When the touch body collides with a collision body, the touch body touches a certain interaction area of the target game NPC, and the target collision body colliding with the touch body is obtained. Based on the correspondence between each collision body and the interaction areas of the target game NPC, the interaction area where the target collision body is located can be determined, that is, it can be determined which position of the target game NPC the touch body has touched.
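The real-time detection described above can be reduced, for illustration, to a point-in-box query run each time the touch position changes. The sketch assumes the touch body is approximated by its touch position; the collider data and function name are hypothetical.

```python
# Each collision body is an axis-aligned box: ((min_x, min_y, min_z),
# (max_x, max_y, max_z)). Area names and extents are illustrative.
COLLIDERS = {
    "head": ((-0.15, 1.55, -0.15), (0.15, 1.85, 0.15)),
    "back": ((-0.25, 0.90, -0.25), (0.25, 1.35, -0.10)),
}

def detect_target_collider(touch_pos, colliders=COLLIDERS):
    """Return the interaction area whose collision body contains the
    current touch position, or None if no collision occurs."""
    for area, (lo, hi) in colliders.items():
        if all(l <= p <= h for l, p, h in zip(lo, touch_pos, hi)):
            return area
    return None
```

Calling this on every touch-position update yields the target collision body, from which the touched interaction area follows directly.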
Step 103, obtaining a target touch feedback behavior corresponding to the target collision body, and controlling the target game NPC to execute the target touch feedback behavior.
For the player's strokes on different interaction areas of the target game NPC, the target game NPC may give different feedback, and the feedback of the NPC includes dialog text, bubble text, voice, a one-time expression action, a continuous expression action (e.g., blushing), a model special effect (e.g., a visual effect appearing on the NPC's body), and the like. For example, if it is detected that the head collision body of a virtual dog collides with the touch body, it is determined that the player stroked the head of the virtual dog, and the virtual dog may give feedback by wagging its tail; if it is detected that the front paw collision body of the virtual dog collides with the touch body, it is determined that the player stroked the front paw of the virtual dog, and the virtual dog may give feedback by shaking hands with the stroking character.
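The per-area feedback amounts to a lookup from the collided interaction area to a feedback behavior. The table below follows the virtual-dog example in the text; the behavior identifiers and the default entry are assumptions for the sketch.

```python
# Feedback behaviors per interaction area of a virtual dog, mirroring
# the example above ("head" -> wag tail, "front_paw" -> shake hands).
FEEDBACK_TABLE = {
    "head":      "wag_tail",
    "front_paw": "shake_hands",
}

def target_feedback(area, table=FEEDBACK_TABLE, default="idle_glance"):
    """Map the target collision body's interaction area to the touch
    feedback behavior the NPC should perform (the default is assumed)."""
    return table.get(area, default)
```

In a full implementation the lookup would also take the feedback behavior influence information (stroking mode, intimacy, real-time states, and so on) into account, as described in the optional refinements above.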
By applying the technical scheme of this embodiment, during the running of the three-dimensional game, the touch mode of the game is entered in response to a touch mode trigger signal, and the target game NPC is displayed at a preset first-person touch shooting visual angle in the touch mode. Each interaction area of the target game NPC is configured with a corresponding collision body; when the touch body, under the player's control operation, collides with a collision body in the game, a target touch feedback behavior corresponding to the collided target collision body is obtained, and the target game NPC is controlled to execute the target touch feedback behavior, so that the target game NPC gives feedback to the player's touch. The embodiment of the application fills the gap that existing three-dimensional games lack a touch gameplay. By configuring collision bodies in different interaction areas of the game NPC, when the player controls the touch body to simulate stroking the game NPC, the game NPC is controlled to give corresponding feedback depending on which position of the game NPC the player touches, which enriches the game playing methods, improves the realism of the game, and brings an immersive game experience to the player.
Further, as a refinement and an extension of the specific implementation of the above embodiment, in order to fully illustrate the implementation process of the embodiment, another feedback control method for the NPC of the non-player character of the game is provided, as shown in fig. 2, the method includes:
step 201, responding to a touch mode touch signal of a three-dimensional game, and displaying a target game NPC at a preset touch shooting visual angle, wherein the target game NPC is provided with at least one interaction area, each interaction area is configured with a collision body, and the preset touch shooting visual angle is a first-person visual angle; replacing a cursor in a display interface with a preset touch body icon, wherein the display position of the preset touch body icon is associated with the touch position of the touch body;
in the embodiment of the application, after entering the touch mode, the target game NPC is displayed at a preset touch shooting view angle, a cursor in a display interface is replaced by a preset touch body icon, and the display position of the preset touch body icon is the touch position of the touch body. The effect that the first person looks at the target game NPC from the visual angle and the touch body enters the visual line range is simulated, so that the simulated play method is more real, for example, when the game enters a touch mode, the cursor is automatically replaced by an icon of a hand, a player can select the touch body to be used in the touch mode or before the game enters the touch mode, and the icon is displayed based on the touch body selected by the player in the touch mode.
Step 202, displaying at least one visual angle adjusting option, and determining a first shooting visual angle corresponding to the selected visual angle adjusting option based on the selection operation of the visual angle adjusting option, and displaying the target game NPC at the first shooting visual angle, wherein the visual angle adjusting option comprises at least one distance adjusting option and/or at least one direction adjusting option; or, a second shooting angle of view is determined based on the free adjustment operation of the shooting angle of view, and the target game NPC is displayed at the second shooting angle of view.
In the above embodiment, in the touch mode, the shooting angle of view of the virtual camera may be adjusted, specifically, a plurality of adjustment options may be preset, and the shooting angle of view of the virtual camera is adjusted based on the selection of the adjustment options by the player. For example, three distance adjustment options of far, middle and near, and four direction adjustment options of front, back, left and right are set, and the player can realize the switching of 12 fixed shooting visual angles by selecting the distance adjustment option and the direction adjustment option. As shown in fig. 3, the short-distance, medium-distance, and long-distance forward display screens of the target game NPC and the medium-distance backward display screen of the target game NPC are respectively shown. In addition, the shooting visual angle of the virtual camera can be freely adjusted, and a player can input instructions through a mouse and a keyboard to freely adjust the shooting visual angle. Of course, adjustment options and free adjustment may be combined, and the player may first determine a rough shooting angle of view by selecting an adjustment option and then perform fine angle adjustment by free adjustment.
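The 12 fixed shooting visual angles (three distances times four directions) can be sketched as camera placement on a circle around the NPC. The distance values, the y-up coordinate frame, and the function name are assumptions for illustration.

```python
import math

DISTANCES = {"near": 1.0, "middle": 2.5, "far": 5.0}              # illustrative
DIRECTIONS = {"front": 0, "right": 90, "back": 180, "left": 270}  # degrees

def camera_position(distance_option, direction_option, npc_pos=(0.0, 1.5, 0.0)):
    """Place the virtual camera at the chosen distance and direction
    on a circle around the NPC, in a y-up coordinate frame."""
    r = DISTANCES[distance_option]
    a = math.radians(DIRECTIONS[direction_option])
    # front = +z, right = +x, so the four directions are 90 degrees apart.
    return (npc_pos[0] + r * math.sin(a), npc_pos[1], npc_pos[2] + r * math.cos(a))
```

Selecting one of the three distance options and one of the four direction options thus yields one of 3 x 4 = 12 fixed viewpoints; free adjustment would instead vary the radius and angle continuously.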
Step 203, displaying interaction state dynamic information between the target game NPC and the stroking character, wherein the interaction state dynamic information changes dynamically based on a real-time interaction progress value between the target game NPC and the stroking character.
Step 204, calculating the real-time interaction progress value according to a preset progress value change rule, and dynamically updating the interaction state dynamic information, wherein the preset progress value change rule includes a progress value descending speed corresponding to at least one interaction progress value interval, and each interaction progress value interval corresponds to a respective interaction state level.
In the above embodiment, in the touch mode, the display interface may further display dynamic information of the interaction state between the target game NPC and the stroking character (i.e., the player character), dynamically prompting the player about the real-time interaction state of the target game NPC; the information is displayed based on a real-time interaction progress value between the target game NPC and the stroking character. The initial interaction state dynamic information can be determined based on a preset initial interaction progress value. After the touch-mode gameplay starts, the real-time interaction progress value can change based on the player's touch operations on the target game NPC, and can also change over time. In an actual application scenario, a plurality of interaction state levels can be preset, and the interaction progress value intervals corresponding to the interaction state levels are divided. For example, the interaction states are divided into three levels representing, from high to low, the excited state, the normal state and the indifferent state of the NPC; the interaction progress value interval corresponding to the excited state is 100 to 200, that corresponding to the normal state is 0 to 100, and that corresponding to the indifferent state is -100 to 0. In the touch mode, if the stroking character performs no touch for a period of time, the interaction progress value may decrease at a certain speed, and the decrease speeds corresponding to different interaction state levels may differ; for example, the decrease speed in the excited state is 10 points/second, and the decrease speed in the normal state is 1 point/second.
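The interval/decay scheme in the example above can be sketched as a small rule table. The thresholds and the 10- and 1-point decay rates follow the example; the English level names, the indifferent-state decay rate, and all function names are assumptions for illustration:

```python
# Sketch of per-level intervals and idle decay rates from the example.
# (name, lower bound, upper bound, decay per second when idle)
LEVELS = [
    ("indifferent", -100, 0, 0.5),   # decay rate here is an assumed value
    ("normal", 0, 100, 1.0),
    ("excited", 100, 200, 10.0),
]

def level_of(progress):
    """Map a real-time interaction progress value to its state level name."""
    for name, lo, hi, _ in LEVELS:
        if lo <= progress < hi:
            return name
    # values at/above the ceiling count as the top level, below the floor as the bottom
    return LEVELS[-1][0] if progress >= LEVELS[-1][2] else LEVELS[0][0]

def decay(progress, idle_seconds):
    """Apply the per-level decay rate while the player is not stroking."""
    for _ in range(idle_seconds):
        rate = next(r for n, lo, hi, r in LEVELS if n == level_of(progress))
        progress = max(progress - rate, LEVELS[0][1])  # clamp at the floor
    return progress
```

Because the rate is re-read each second, a value that drops out of the excited state automatically continues decaying at the slower normal-state rate.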
Optionally, the interaction state dynamic information includes an interaction state progress bar, the interaction state progress bar is attached with a level feature of an interaction state level corresponding to the real-time interaction progress value, and the level feature includes a color feature and/or a special effect feature.
The interaction state progress bar displayed in the display interface can show a level feature matched with the interaction state level corresponding to the real-time interaction progress value. For example, the three interaction state levels are displayed, from low to high, as a red progress bar, a pink progress bar and a blue progress bar respectively. As another example, the progress bar corresponding to a high-level interaction state is accompanied by a heart-shaped special effect, and the progress bar corresponding to a low-level interaction state is accompanied by a cold-air special effect.
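Such a level-to-feature binding is naturally a lookup table. The mapping below mirrors the colors and effects named in the example; the level names and structure are assumed:

```python
# Minimal sketch of the level-feature mapping: color and special effect per
# interaction state level. Names are illustrative, values follow the example.
LEVEL_FEATURES = {
    "indifferent": {"color": "red", "effect": "cold_air"},
    "normal": {"color": "pink", "effect": None},
    "excited": {"color": "blue", "effect": "hearts"},
}

def progress_bar_style(level_name):
    """Return the color/effect the interaction-state progress bar should show."""
    return LEVEL_FEATURES[level_name]
```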
Step 205, determining the touch position of the touch body based on the control operation data of the touch body, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body.
Step 206, obtaining feedback behavior influence information, where the feedback behavior influence information includes at least one of a stroking manner corresponding to the control operation data, an intimacy between the target game NPC and a stroking character, a real-time game environment, a first real-time state of the target game NPC, a second real-time state of the stroking character, and a real-time interaction progress value between the target game NPC and the stroking character.
Step 207, determining a target touch feedback behavior corresponding to the target collision body according to the feedback behavior influence information, and controlling the target game NPC to execute the target touch feedback behavior.
In the above embodiment, in addition to the target game NPC generating different feedback when different parts of it are touched, other information that may affect the NPC feedback may be considered. After detecting that the touch body collides with the target collision body under the player's control operation, feedback behavior influence information affecting the feedback behavior of the target collision body is acquired, and the target touch feedback behavior of the target game NPC is determined based on this information. In a specific application scenario, the feedback behavior influence information may include one or more types. The feedback behavior influence information can include the stroking manner in which the player touches the target collision body; the stroking manner can specifically include stroking force, stroking count, stroking gesture, etc. For example, the player can stroke the NPC in different manners by clicking the mouse, double-clicking the mouse, long-pressing the mouse, clicking and dragging, etc.; a single click indicates a light touch, a double click indicates continuous stroking, and so on. The feedback behavior influence information may further include the intimacy between the stroking character and the target game NPC; for example, when the intimacy between the stroking character and a virtual dog is low, the dog may dodge when the player strokes its back, and when the intimacy is high, the dog may wag its tail when the player strokes its back. The feedback behavior influence information may further include the real-time game environment, which may specifically include the in-game time, the weather, whether the current game scenario progress falls on a specific festival, and the like.
The feedback behavior influence information may further include a first real-time state of the target game NPC, which may specifically include the NPC's dress and makeup, buff states attached to the NPC, and the like. The feedback behavior influence information may also include a second real-time state of the stroking character, which may specifically include the stroking character's dress and makeup, buff states attached to the stroking character, and the like. The feedback behavior influence information may further include the real-time interaction progress value between the target game NPC and the stroking character; for example, if the interaction state corresponding to the virtual dog's real-time interaction progress value is the excited state, the dog may lie on the ground and look comfortable when the player touches its abdomen, whereas if the virtual dog is in the indifferent state, it may bark when the player touches its abdomen. In addition, the feedback behavior of a game NPC is also related to the NPC's character setting; for the same feedback behavior influence information, game NPCs with different character settings may exhibit different touch feedback behaviors for the same collision body.
Further, after the target touch feedback behavior is determined based on the one or more pieces of feedback behavior influence information and the target interaction area corresponding to the target collision body, the target game NPC is controlled to execute the corresponding feedback behavior, so that the game NPC can give different feedback for different stroked parts under different conditions.
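The selection described in steps 206-207 can be sketched as a rule table keyed on the hit collider plus the influence information. The rules reuse the virtual-dog examples above (intimacy, interaction state); the function, the intimacy threshold of 50, and the behavior names are all illustrative assumptions:

```python
# Sketch of steps 206-207: pick a touch feedback behavior for the hit
# collider from the feedback behavior influence information.
def choose_feedback(collider, intimacy, state_level):
    """collider: name of the target collision body that was hit.
    intimacy: NPC<->stroking-character intimacy (0-100 assumed).
    state_level: current interaction state level name."""
    if collider == "back":
        # low intimacy -> the dog dodges; high intimacy -> it wags its tail
        return "wag_tail" if intimacy >= 50 else "dodge"
    if collider == "belly":
        # excited dogs enjoy a belly rub; indifferent ones bark
        return "lie_down_relaxed" if state_level == "excited" else "bark"
    return "idle"
```

A production system would likely make this table data-driven per NPC character setting, since the text notes that NPCs with different character settings may react differently to the same influence information.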
In this embodiment of the present application, optionally, step 103 or step 205 specifically includes: determining a touch position of a touch body based on control operation data of the touch body, correcting the touch position based on sound wave input data, performing collision detection on the touch body and the collision body, and determining a target collision body colliding with the touch body; alternatively,
and determining the touch position of the touch body based on the sound wave input data, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body.
In this embodiment, the touch body in the game can be controlled through input data of an input device such as a mouse or a keyboard; alternatively, sound wave data can be acquired by a sound receiving device, the player's intention can be analyzed by performing voice semantic analysis on the sound wave data, and the touch position of the touch body can be controlled directly based on that intention, or the touch position can be corrected based on the acquired sound wave data after being determined from the player's control operation data on the touch body. For example, the player controls the touch body to move to a certain position through mouse and keyboard input and the touch position is then corrected in a fine-grained manner through voice input, or the touch body is controlled to move to the touch position directly based on the player's voice input.
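The collision detection in steps 103/205 (testing the touch position against the colliders configured on the interaction areas) might be sketched as follows. A real engine would use a physics raycast; here the colliders are modeled as simple spheres, and all names and coordinates are illustrative assumptions:

```python
# Hedged sketch of collision detection: find which interaction-area collider
# the touch body's position falls inside. Sphere colliders are an assumption.
import math

def detect_target_collider(touch_pos, colliders):
    """colliders: list of (area_name, center_xyz, radius).
    Returns the name of the first collider containing touch_pos, else None."""
    for name, center, radius in colliders:
        if math.dist(touch_pos, center) <= radius:
            return name
    return None

# Example colliders for a virtual dog (coordinates are made up)
dog_colliders = [
    ("head", (0.0, 1.0, 0.5), 0.3),
    ("back", (0.0, 0.8, 0.0), 0.5),
    ("belly", (0.0, 0.4, 0.0), 0.4),
]
```

The voice-correction variant above would simply adjust `touch_pos` before this test is run.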
Optionally, the stroking pattern is determined based on the control operation data or acoustic input data. The player can also control the touch mode of the touch body to the target game NPC through a voice input mode.
Step 208, obtaining a progress value change amount corresponding to the target touch feedback behavior, and updating the real-time interaction progress value according to the progress value change amount.
In the above embodiment, while the target game NPC executes the target touch feedback behavior, the real-time interaction progress value of the target game NPC may also change correspondingly. Specifically, by presetting progress value change amounts corresponding to different touch feedback behaviors, the real-time interaction progress value of the NPC can be changed synchronously after the NPC executes a touch feedback behavior. For example, the progress value change amount corresponding to the virtual dog's tail-wagging feedback behavior adds 10 points of progress, and the progress value change amount corresponding to the virtual dog's dodging feedback behavior deducts 10 points of progress. The progress value change amount may also be set according to the interaction state level; for example, the tail-wagging feedback of the virtual dog in the excited state adds 50 points of progress, while the same feedback in the normal state adds 10 points. In addition, after the real-time interaction progress value is updated, the interaction state dynamic information in the display interface changes accordingly.
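Step 208 then reduces to a delta table keyed on (behavior, state level), clamped to the progress value range. The delta values follow the examples above; the structure, the `None` fallback key, and the clamp bounds are illustrative assumptions:

```python
# Sketch of step 208: per-behavior progress deltas, optionally dependent on
# the current interaction state level, clamped to the example's value range.
PROGRESS_DELTA = {
    ("wag_tail", "excited"): +50,
    ("wag_tail", "normal"): +10,
    ("dodge", None): -10,          # state-independent entry (assumed)
}

def update_progress(progress, behavior, state_level):
    """Apply the behavior's delta; fall back to a state-independent entry."""
    delta = PROGRESS_DELTA.get((behavior, state_level),
                               PROGRESS_DELTA.get((behavior, None), 0))
    return max(-100, min(200, progress + delta))
```

After this returns, the interaction state dynamic information (progress bar color/effect) would be refreshed from the new value.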
Step 209, when the real-time interaction progress value is lower than a preset lowest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the lowest interaction state level exceeds a preset first duration, executing a first exit operation of exiting the touch mode;
step 210, when the real-time interaction progress value is higher than a preset highest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the highest interaction state level exceeds a preset second duration, executing a second exit operation of exiting the touch mode.
In the above embodiment, the touch mode may be ended based on the player's active quit operation, or may be ended automatically based on a change in the real-time interaction progress value. If the real-time interaction progress value is lower than a preset lowest interaction progress value, or the interaction state stays at the lowest interaction state level for a long time, or both conditions are met, the touch mode can be exited, the touch gameplay ends, and a first exit operation is executed; for example, when the target game NPC has been in the indifferent state for longer than a preset first duration, an animation of the target game NPC driving away the stroking character is played. Correspondingly, if the real-time interaction progress value is higher than a preset highest interaction progress value, or the interaction state stays at the highest interaction state level for a long time, or both conditions are met, the touch mode can be exited, the touch gameplay ends, and a second exit operation is executed; for example, when the real-time interaction progress value of the target game NPC exceeds 200 points, an animation in which the target game NPC is satisfied and the touch mode ends is played. In addition, after the touch mode is exited, the virtual camera can be switched to a third-person perspective to shoot the stroking character.
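The two automatic exit checks of steps 209-210 can be sketched as one predicate. The -100 floor and 200 ceiling mirror the example values; the duration thresholds and all names are assumptions:

```python
# Sketch of steps 209-210: automatic exit conditions for the touch mode.
MIN_PROGRESS, MAX_PROGRESS = -100, 200
FIRST_DURATION, SECOND_DURATION = 30.0, 30.0  # assumed values, in seconds

def check_exit(progress, seconds_at_lowest, seconds_at_highest):
    """Return 'first_exit', 'second_exit', or None (stay in touch mode)."""
    if progress < MIN_PROGRESS or seconds_at_lowest > FIRST_DURATION:
        return "first_exit"    # negative ending: NPC drives the player away
    if progress > MAX_PROGRESS or seconds_at_highest > SECOND_DURATION:
        return "second_exit"   # positive ending: NPC happily ends the session
    return None
```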
In the embodiment of the present application, optionally, the method further includes: updating the intimacy between the target game NPC and the stroking character based on the first exit operation and/or the second exit operation.
In the above embodiment, the player's stroking of the target game NPC can cultivate the intimacy between the stroking character and the target game NPC, expanding into nurturing-type gameplay. Based on the first exit operation and the second exit operation in the game, the intimacy between the stroking character and the target game NPC can be changed; specifically, the intimacy may be reduced based on the first exit operation and increased based on the second exit operation. In this way, the stroking gameplay is not an isolated gameplay in the game but can be combined with other gameplay (e.g., intimacy); in a specific game scenario, the stroking gameplay can be expanded into the player's nurturing gameplay for game NPCs, for example, relationship cultivation with a virtual dog, relationship cultivation with a virtual character, and the like.
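The exit-driven intimacy update is a small state transition. The direction of change (first exit lowers intimacy, second raises it) follows the text; the +/-5 amounts and the zero floor are assumptions:

```python
# Minimal sketch of the intimacy update tied to the two exit operations.
def update_intimacy(intimacy, exit_kind):
    """exit_kind: 'first_exit' (negative ending) or 'second_exit' (positive)."""
    if exit_kind == "first_exit":
        return max(0, intimacy - 5)   # penalty amount is assumed
    if exit_kind == "second_exit":
        return intimacy + 5           # reward amount is assumed
    return intimacy
```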
In the embodiment of the present application, optionally, the method further includes: displaying stroking guide information based on the real-time interaction progress value, wherein the stroking guide information includes a guided stroking manner for improving the real-time interaction progress value.
In the above embodiment, in the touch mode, operation guidance may be provided to the player to prevent unfamiliarity with the gameplay from degrading the game experience. Specifically, stroking guide information may be displayed in the display interface, the stroking guide information including a guided stroking manner for improving the real-time interaction progress value. For example, a touch position and a stroking manner capable of improving the interaction progress value of the target game NPC can be determined according to information such as the intimacy between the target game NPC and the stroking character, the interaction state level of the target game NPC, the real-time state of the stroking character, and the real-time game environment, so as to display the interaction area corresponding to the touch position and the operation manner of an input device such as a mouse or keyboard corresponding to the stroking manner. Prompted by the stroking guide information, the player can stroke more efficiently in a manner the target game NPC "likes".
In addition, in the non-touch mode of the game NPC, the method further includes: determining a target behavior point of the target game NPC based on at least one of real-time game environment information, behavior information of a preset game NPC, state information of the preset game NPC, behavior information of the player character, state information of the player character, and interaction information between the player character and the target game NPC; and controlling the target game NPC to move to the target behavior point along a preset route or by automatic pathfinding, and controlling the target game NPC to execute the target behavior corresponding to the target behavior point.
Further, as a specific implementation of the method in fig. 1, an embodiment of the present application provides a feedback control apparatus for a game NPC, as shown in fig. 4, the apparatus including:
the display module is used for responding to a touch mode trigger signal of the three-dimensional game and displaying a target game NPC at a preset touch shooting visual angle, wherein the target game NPC is provided with at least one interaction area, each interaction area is configured with a collision body, and the preset touch shooting visual angle is a first-person visual angle;
the collision detection module is used for determining the touch position of the touch body based on control operation data of the touch body, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body;
and the feedback control module is used for acquiring a target touch feedback behavior corresponding to the target collision body and controlling the target game NPC to execute the target touch feedback behavior.
Optionally, the apparatus further comprises:
the visual angle adjusting module is used for displaying at least one visual angle adjusting option after the target game NPC is displayed at a preset touch shooting visual angle, determining a first shooting visual angle corresponding to the selected visual angle adjusting option based on the selection operation of the visual angle adjusting option, and displaying the target game NPC at the first shooting visual angle, wherein the visual angle adjusting option comprises at least one distance adjusting option and/or at least one direction adjusting option; or, a second shooting angle of view is determined based on the free adjustment operation of the shooting angle of view, and the target game NPC is displayed at the second shooting angle of view.
Optionally, the display module is further configured to replace a cursor in a display interface with a preset touch body icon in response to a touch mode trigger signal of the three-dimensional game, where a display position of the preset touch body icon is associated with a touch position of the touch body.
Optionally, the feedback control module is specifically configured to:
obtaining feedback behavior influence information, wherein the feedback behavior influence information comprises at least one of a stroking mode corresponding to the control operation data, intimacy between the target game NPC and a stroking character, a real-time game environment, a first real-time state of the target game NPC, a second real-time state of the stroking character and a real-time interaction progress value between the target game NPC and the stroking character;
and determining a target touch feedback behavior corresponding to the target collision body according to the feedback behavior influence information.
Optionally, the apparatus further comprises:
and the calculation module is used for acquiring a progress value change amount corresponding to the target touch feedback behavior after determining the target touch feedback behavior corresponding to the target collision body according to the feedback behavior influence information, and updating the real-time interaction progress value according to the progress value change amount.
Optionally, the display module is further configured to display interaction state dynamic information between the target game NPC and the stroking character after the target game NPC is displayed at the preset touch shooting visual angle, wherein the interaction state dynamic information changes dynamically based on a real-time interaction progress value between the target game NPC and the stroking character.
Optionally, the interaction state dynamic information includes an interaction state progress bar, the interaction state progress bar is attached with a level feature of an interaction state level corresponding to the real-time interaction progress value, and the level feature includes a color feature and/or a special effect feature.
Optionally, the computing module is further configured to:
after the interaction state dynamic information between the target game NPC and the stroking character is displayed, calculating the real-time interaction progress value according to a preset progress value change rule, and dynamically updating the interaction state dynamic information, wherein the preset progress value change rule comprises a progress value descending speed corresponding to at least one interaction progress value interval, and each interaction progress value interval corresponds to a respective interaction state level;
accordingly, the feedback control module is further configured to:
executing a first exit operation of exiting the touch mode when the real-time interaction progress value is lower than a preset lowest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the lowest interaction state level exceeds a preset first duration;
and when the real-time interaction progress value is higher than a preset highest interaction progress value and/or the duration of the real-time interaction progress value kept in an interaction progress value interval corresponding to the highest interaction state level exceeds a preset second duration, executing a second exit operation of exiting the touch mode.
Optionally, the apparatus further comprises:
and the intimacy updating module is used for updating the intimacy between the target game NPC and the stroking character based on the first exit operation and/or the second exit operation.
Optionally, the apparatus further comprises:
and the stroking guide module is used for displaying stroking guide information based on the real-time interaction progress value, wherein the stroking guide information includes a guided stroking manner for improving the real-time interaction progress value.
It should be noted that other corresponding descriptions of the functional units related to the feedback control device of the game NPC provided in the embodiment of the present application may refer to the corresponding descriptions in the methods in fig. 1 to fig. 2, and are not repeated herein.
Based on the method shown in fig. 1 to 2, correspondingly, the embodiment of the present application further provides a storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the feedback control method for the non-player character NPC shown in fig. 1 to 2.
Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.), and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the implementation scenarios of the present application.
Based on the above methods shown in fig. 1 to fig. 2 and the virtual device embodiment shown in fig. 4, in order to achieve the above object, an embodiment of the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, and the like, where the computer device includes a storage medium and a processor; a storage medium for storing a computer program; a processor for executing a computer program to implement the feedback control method of the game non-player character NPC as described above with reference to fig. 1 to 2.
Optionally, the computer device may also include a user interface, a network interface, a camera, Radio Frequency (RF) circuitry, sensors, audio circuitry, a WI-FI module, and so forth. The user interface may include a Display screen (Display), an input unit such as a keypad (Keyboard), etc., and the optional user interface may also include a USB interface, a card reader interface, etc. The network interface may optionally include a standard wired interface, a wireless interface (e.g., a bluetooth interface, WI-FI interface), etc.
It will be appreciated by those skilled in the art that the present embodiment provides a computer device architecture that is not limiting of the computer device, and that may include more or fewer components, or some components in combination, or a different arrangement of components.
The storage medium may further include an operating system and a network communication module. An operating system is a program that manages and maintains the hardware and software resources of a computer device, supporting the operation of information handling programs, as well as other software and/or programs. The network communication module is used for realizing communication among components in the storage medium and other hardware and software in the entity device.
Through the description of the above embodiments, those skilled in the art can clearly understand that the present application may be implemented by software plus a necessary general hardware platform, or by hardware. During the running of a three-dimensional game, the touch mode of the game is entered in response to a touch mode trigger signal; in the touch mode, a target game NPC is displayed at a preset touch shooting view angle of a first-person perspective, each interaction area of the target game NPC is configured with a corresponding collision body, and when the touch body collides with one of the collision bodies under the player's control operation, a target touch feedback behavior corresponding to the collided target collision body is acquired and the target game NPC is controlled to execute the target touch feedback behavior, so that the target game NPC generates feedback to the player's touch. The embodiment of the application fills the gap that existing three-dimensional games lack touch gameplay; by configuring collision bodies in different interaction areas of the game NPC, when the player controls the touch body to simulate stroking the game NPC, the game NPC is controlled to give corresponding feedback based on the part of the game NPC touched by the player, which enriches the gameplay, improves the game's realism, and brings an immersive game experience to the player.
Those skilled in the art will appreciate that the figures are merely schematic representations of one preferred implementation scenario and that the blocks or flow diagrams in the figures are not necessarily required to practice the present application. Those skilled in the art will appreciate that the modules in the devices in the implementation scenario may be distributed in the devices in the implementation scenario according to the description of the implementation scenario, or may be located in one or more devices different from the present implementation scenario with corresponding changes. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above application serial numbers are for description purposes only and do not represent the superiority or inferiority of the implementation scenarios. The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.

Claims (13)

1. A feedback control method for an NPC of a non-player character, comprising:
responding to a touch mode trigger signal of a three-dimensional game, and displaying a target game NPC at a preset touch shooting visual angle, wherein the target game NPC is provided with at least one interaction area, each interaction area is configured with a collision body, and the preset touch shooting visual angle is a first-person visual angle;
determining a touch position of the touch body based on control operation data of the touch body, performing collision detection on the touch body and the collision body, and determining a target collision body which collides with the touch body;
and acquiring a target touch feedback behavior corresponding to the target collision body, and controlling the target game NPC to execute the target touch feedback behavior.
2. The method of claim 1, wherein after displaying the target game NPC at the preset tactual shooting perspective, the method further comprises:
displaying at least one visual angle adjusting option, determining a first shooting visual angle corresponding to the selected visual angle adjusting option based on the selection operation of the visual angle adjusting option, and displaying the target game NPC at the first shooting visual angle, wherein the visual angle adjusting option comprises at least one distance adjusting option and/or at least one direction adjusting option; or, alternatively,
and determining a second shooting visual angle based on the free adjustment operation of the shooting visual angle, and displaying the target game NPC at the second shooting visual angle.
3. The method of claim 1, further comprising:
and replacing a cursor in a display interface with a preset touch body icon in response to a touch mode trigger signal of the three-dimensional game, wherein the display position of the preset touch body icon is associated with the touch position of the touch body.
4. The method according to claim 1, wherein the obtaining of the target touch feedback behavior corresponding to the target collision volume specifically comprises:
obtaining feedback behavior influence information, wherein the feedback behavior influence information comprises at least one of a stroking mode corresponding to the control operation data, intimacy between the target game NPC and a stroking character, a real-time game environment, a first real-time state of the target game NPC, a second real-time state of the stroking character and a real-time interaction progress value between the target game NPC and the stroking character;
and determining a target touch feedback behavior corresponding to the target collision body according to the feedback behavior influence information.
5. The method of claim 4, wherein after determining a target-touching feedback behavior corresponding to the target collision volume from the feedback behavior impact information, the method further comprises:
and acquiring a progress value change amount corresponding to the target touch feedback behavior, and updating the real-time interaction progress value according to the progress value change amount.
6. The method of claim 5, wherein after the displaying the target game NPC at the preset touch shooting perspective, the method further comprises:
displaying interaction state dynamic information between the target game NPC and the stroking character, wherein the interaction state dynamic information changes dynamically based on a real-time interaction progress value between the target game NPC and the stroking character.
7. The method of claim 6, wherein
the interaction state dynamic information comprises an interaction state progress bar, the interaction state progress bar carries a level feature of the interaction state level corresponding to the real-time interaction progress value, and the level feature comprises a color feature and/or a special effect feature.
8. The method of claim 6, wherein after the displaying interaction state dynamic information between the target game NPC and the stroking character, the method further comprises:
calculating the real-time interaction progress value according to a preset progress value change rule, and dynamically updating the interaction state dynamic information, wherein the preset progress value change rule comprises a progress value decrease rate corresponding to at least one interaction progress value interval, and each interaction progress value interval corresponds to a respective interaction state level;
accordingly, the method further comprises:
executing a first exit operation of exiting the touch mode when the real-time interaction progress value is lower than a preset minimum interaction progress value and/or when the duration for which the real-time interaction progress value remains in the interaction progress value interval corresponding to the lowest interaction state level exceeds a preset first duration;
and executing a second exit operation of exiting the touch mode when the real-time interaction progress value is higher than a preset maximum interaction progress value and/or when the duration for which the real-time interaction progress value remains in the interaction progress value interval corresponding to the highest interaction state level exceeds a preset second duration.
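Claim 8's change rule and the two exit conditions can be sketched together. The interval boundaries, decrease rates, thresholds, and durations below are illustrative assumptions; the claims only require that each interval has its own decrease rate and state level.

```python
# Hypothetical intervals: (low, high, decrease rate per second); each
# interval doubles as an interaction state level, lowest level first.
INTERVALS = [(0, 30, 4.0), (30, 70, 2.0), (70, 100, 1.0)]
MIN_PROGRESS, MAX_PROGRESS = 5.0, 95.0
FIRST_DURATION, SECOND_DURATION = 10.0, 10.0  # seconds

def decay(progress, dt):
    """Apply the interval-specific progress value decrease rate."""
    for low, high, rate in INTERVALS:
        if low <= progress < high or (high == 100 and progress == 100):
            return max(0.0, progress - rate * dt)
    return progress

def exit_action(progress, time_in_lowest, time_in_highest):
    """Return which exit operation fires, if any: the first exit for the
    low end, the second exit for the high end."""
    if progress < MIN_PROGRESS or time_in_lowest > FIRST_DURATION:
        return "first_exit"
    if progress > MAX_PROGRESS or time_in_highest > SECOND_DURATION:
        return "second_exit"
    return None
```

Because the decrease rate is higher in the low intervals here, an idle player slides toward the first exit faster the worse the interaction is going; that asymmetry is a design choice of this sketch, not of the claims.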
9. The method of claim 8, further comprising:
updating the affinity between the target game NPC and the stroking character based on the first exit operation and/or the second exit operation.
10. The method of claim 5, further comprising:
displaying stroking guide information based on the real-time interaction progress value, wherein the stroking guide information comprises a guided stroking manner for increasing the real-time interaction progress value.
11. A feedback control apparatus for a game NPC, comprising:
a display module configured to display a target game NPC at a preset touch shooting perspective in response to a touch mode trigger signal of a three-dimensional game, wherein the target game NPC is provided with at least one interaction area, each interaction area is provided with a collision body, and the preset touch shooting perspective is a first-person perspective;
a collision detection module configured to determine a touch position of a touch body based on control operation data of the touch body, perform collision detection between the touch body and the collision bodies, and determine a target collision body that collides with the touch body;
and a feedback control module configured to obtain a target touch feedback behavior corresponding to the target collision body and to control the target game NPC to execute the target touch feedback behavior.
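The collision detection module's job can be sketched as a point-versus-sphere test over the NPC's per-area collision bodies. Sphere colliders, the area names, and the nearest-center tie-break are assumptions of this sketch; the claims do not fix a collider shape or a tie-breaking rule.

```python
import math

def detect_target_collision_body(touch_pos, collision_bodies):
    """Map the touch body's touch position to the collision body it hits.
    If the position lies inside several overlapping bodies, the one with
    the nearest center is chosen as the target collision body.
    collision_bodies: {area_name: ((x, y, z) center, radius)}."""
    hit, best = None, float("inf")
    for area, (center, radius) in collision_bodies.items():
        d = math.dist(touch_pos, center)
        if d <= radius and d < best:
            hit, best = area, d
    return hit

# Hypothetical interaction areas on the target game NPC.
bodies = {
    "head": ((0.0, 1.6, 0.0), 0.25),
    "back": ((0.0, 1.0, -0.2), 0.40),
}
```

A `None` result means the touch position missed every interaction area, in which case no feedback behavior would be triggered.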
12. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 10.
13. A computer device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, wherein the processor implements the method of any one of claims 1 to 10 when executing the computer program.
CN202111425724.7A 2021-11-26 2021-11-26 Feedback control method and device for NPC (game non-player character) and storage medium Pending CN114028803A (en)


Publications (1)

Publication Number Publication Date
CN114028803A 2022-02-11

Family

ID=80145702




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination