CN114210063A - Interaction method, device, equipment, medium and program product between virtual objects - Google Patents


Info

Publication number
CN114210063A
CN114210063A (application CN202111531757.XA)
Authority
CN
China
Prior art keywords
virtual object
balance
control
virtual
stage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111531757.XA
Other languages
Chinese (zh)
Inventor
叶成豪
王子奕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111531757.XA priority Critical patent/CN114210063A/en
Publication of CN114210063A publication Critical patent/CN114210063A/en
Pending legal-status Critical Current

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 — Controlling game characters or game objects based on the game progress
    • A63F 13/50 — Controlling the output signals based on the game progress
    • A63F 13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525 — Changing parameters of virtual cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interaction method, apparatus, device, medium, and program product between virtual objects, relating to the field of computer technology. The method comprises: displaying a first virtual object and a second virtual object in a virtual scene; receiving a first control operation on the second virtual object in a first stage of a target interaction event; displaying a visual-angle turning action of the first virtual object in a second stage of the target interaction event; and displaying a staged event execution result based on the real-time observation sight range of the first virtual object and the execution of the interactive action by the second virtual object in the second stage. In this way, when the second virtual object adjusts its interactive action according to the observation sight range, the execution of the interactive action is displayed according to the state of the second virtual object to determine the staged execution result, so that the game outcome is not decided by the visual-angle turning action alone, which improves the entertainment value of the game.

Description

Interaction method, device, equipment, medium and program product between virtual objects
Technical Field
Embodiments of the present disclosure relate to the field of virtual environments, and in particular, to a method, an apparatus, a device, a medium, and a program product for interaction between virtual objects.
Background
As living standards and cultural entertainment improve, people's expectations of virtual-world experiences keep rising, and games, as one expression of the virtual world, have become a channel through which many people relieve stress. Current game applications offer a variety of interaction modes, which greatly increase players' enjoyment during play.
In the related art, when a player controls a virtual character in a wooden-person game (a "Red Light, Green Light"-style game), the player controls the character's movement according to a cue tone played in the virtual scene and the turning motion of the wooden-person character, for example, making the virtual character run or stop running in the virtual scene.
However, in the above process the player only needs to keep adjusting the virtual character's movement state according to the cue tone and the wooden person's action: when the player's pause operation is received, the virtual character stops moving immediately, and the game result is judged solely from that stop. The judgment condition is therefore single, and the entertainment value of the game is low.
Disclosure of Invention
The embodiments of the present application provide an interaction method, apparatus, device, medium, and program product between virtual objects, which conduct the game process by combining the observation sight range of a first virtual object with the action state of a second virtual object, improving the entertainment value of the game. The technical solution is as follows.
In one aspect, a method for interaction between virtual objects is provided, the method including:
displaying a first virtual object and a second virtual object in a virtual scene, wherein the first virtual object is used for providing a staged prompt in a target interaction event, and the second virtual object is a virtual object participating in the target interaction event;
receiving a first control operation on the second virtual object in a first stage of the target interaction event, wherein the first control operation is used for controlling the second virtual object to execute an interactive action corresponding to the target interaction event;
displaying a visual-angle turning action of the first virtual object in a second stage of the target interaction event, wherein the visual-angle turning action is used for adjusting the observation sight range of the first virtual object; and
displaying a staged event execution result of the second virtual object in the second stage based on the real-time observation sight range of the first virtual object and the execution of the interactive action by the second virtual object in the second stage.
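As an illustrative sketch only (not part of the patent's disclosure), the second-stage judgment described above — combining the first object's real-time observation sight range with the second object's action state — can be modeled as a simple 2D field-of-view check. All names and the field-of-view width here are assumptions:

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    x: float
    y: float
    facing_deg: float = 0.0   # direction the object is looking, in degrees
    moving: bool = False      # whether an interactive action is in progress

def in_sight_range(observer: VirtualObject, target: VirtualObject,
                   fov_deg: float = 90.0) -> bool:
    """True if `target` falls inside the observer's field of view."""
    angle = math.degrees(math.atan2(target.y - observer.y,
                                    target.x - observer.x)) % 360
    # smallest signed difference between the bearing and the facing direction
    diff = abs((angle - observer.facing_deg + 180) % 360 - 180)
    return diff <= fov_deg / 2

def stage_result(first: VirtualObject, second: VirtualObject) -> str:
    """Second-stage judgment: a moving player seen by the first object fails."""
    if in_sight_range(first, second) and second.moving:
        return "eliminated"
    return "safe"
```

With this sketch, a player still in motion inside the observer's sight range fails the stage, while one behind the observer or standing still does not.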
In another aspect, an apparatus for interaction between virtual objects is provided, the apparatus comprising modules configured to:
display a first virtual object and a second virtual object in a virtual scene, wherein the first virtual object is used for providing a staged prompt in a target interaction event, and the second virtual object is a virtual object participating in the target interaction event;
receive a first control operation on the second virtual object in a first stage of the target interaction event, wherein the first control operation is used for controlling the second virtual object to execute an interactive action corresponding to the target interaction event;
display a visual-angle turning action of the first virtual object in a second stage of the target interaction event, wherein the visual-angle turning action is used for adjusting the observation sight range of the first virtual object; and
display a staged event execution result of the second virtual object in the second stage based on the real-time observation sight range of the first virtual object and the execution of the interactive action by the second virtual object in the second stage.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the method for interaction between virtual objects according to any of the embodiments of the present application.
In another aspect, there is provided a computer readable storage medium having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by a processor to implement a method of interaction between virtual objects as described in any of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the interaction method between the virtual objects according to any of the above embodiments.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
When the target interaction event is conducted, the second virtual object performs an interactive action based on the first control operation and adjusts that action according to the staged prompt provided by the first virtual object, and the staged execution result of the second virtual object is finally determined. With this method, during the target interaction event the second virtual object must both perform the interactive action in the virtual scene based on the first control operation and adjust that action according to the observation sight range of the first virtual object; the execution of the interactive action is then displayed based on the adjustment result and the action state of the second virtual object, so as to determine the staged execution result. This prevents the game outcome from being judged by the single factor of the first virtual object's visual-angle turning action, adds more success-or-failure factors and competitive excitement, and effectively improves the interest of the game.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a block diagram of an electronic device provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of interaction between virtual objects provided by an exemplary embodiment of the present application;
FIG. 4 is an interface diagram of a method of interaction between virtual objects provided by an exemplary embodiment of the present application;
FIG. 5 is an interface diagram of a method of interaction between virtual objects provided by another exemplary embodiment of the present application;
FIG. 6 is an interface diagram of a method of interaction between virtual objects provided by another exemplary embodiment of the present application;
FIG. 7 is an interface diagram of a method of interaction between virtual objects provided by another exemplary embodiment of the present application;
FIG. 8 is a flowchart of a method of interaction between virtual objects provided by another exemplary embodiment of the present application;
FIG. 9 is a graphical illustration of a difficulty of balancing coefficient provided by an exemplary embodiment of the present application;
FIG. 10 is a flowchart illustrating a second control operation performed by different balance control screens according to an exemplary embodiment of the present application;
FIG. 11 is an interface diagram of a control distribution screen provided by an exemplary embodiment of the present application;
FIG. 12 is an interface diagram of a pointer control screen provided by an exemplary embodiment of the present application;
FIG. 13 is an interface diagram of a movement control screen provided by an exemplary embodiment of the present application;
FIG. 14 is a flowchart of a method of interaction between virtual objects provided by another exemplary embodiment of the present application;
FIG. 15 is a block diagram illustrating an interaction apparatus between virtual objects according to an exemplary embodiment of the present disclosure;
FIG. 16 is a block diagram of an interaction device between virtual objects according to another exemplary embodiment of the present application;
fig. 17 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application will be briefly described.
Virtual world: is a virtual world that is displayed (or provided) when an application program runs on a terminal. The virtual world may be a simulated world of a real world, a semi-simulated semi-fictional world, or a purely fictional world. The virtual world may be any one of a two-dimensional virtual world, a 2.5-dimensional virtual world, and a three-dimensional virtual world, which is not limited in this application. The following embodiments are exemplified in the case where the virtual world is a three-dimensional virtual world.
Virtual model: a model in the virtual world that mimics the real world. Illustratively, a virtual model occupies a certain volume in the virtual world. Illustratively, virtual models include terrain models, building models, animal and plant models, virtual prop models, virtual vehicle models, and virtual character models. For example, terrain models include the ground, mountains, water flows, stones, steps, and the like; building models include houses, enclosures, containers, and fixed facilities inside buildings such as tables, chairs, cabinets, and beds; animal and plant models include trees, flowers, plants, birds, and the like; virtual prop models include firearms, medicine boxes, air drops, and the like; virtual vehicle models include automobiles, boats, helicopters, and the like; virtual character models include humans, animals, cartoon characters, and the like.
Virtual character: refers to a movable object in the virtual world. The movable object may be a virtual person, a virtual animal, an animation character, or the like, such as characters, animals, plants, oil drums, walls, and stones displayed in a three-dimensional virtual world. Optionally, the virtual character is a three-dimensional model created based on skeletal animation technology. Each virtual character has its own shape and volume in the three-dimensional virtual world and occupies part of the space in the three-dimensional virtual world.
In the related art, when a player controls a virtual character in a wooden-person game, the player controls the character's movement according to a cue tone played in the virtual scene and the turning motion of the wooden-person character, for example, making the virtual character run or stop running in the virtual scene. However, in the above process the player only needs to watch for the cue tone and the wooden person's action to control the character's movement state; when the player's pause operation is received, the virtual character stops moving instantly. In a game whose emphasis is on remaining still, this weakens the difficulty, so the entertainment value of the game is low.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and the like. The terminal has installed and runs an application program supporting a virtual environment, such as an application supporting a three-dimensional virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, a Third-Person Shooter (TPS) game, a First-Person Shooter (FPS) game, and a Multiplayer Online Battle Arena (MOBA) game. Alternatively, the application may be a stand-alone application, such as a stand-alone three-dimensional game program, or a networked online application.
Fig. 1 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 100 includes: an operating system 120 and application programs 122.
Operating system 120 is the base software that provides applications 122 with secure access to computer hardware.
Application 122 is an application that supports a virtual environment. Optionally, application 122 is an application that supports a three-dimensional virtual environment. The application 122 may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The application 122 may be a stand-alone application, such as a stand-alone three-dimensional game program, or a networked application.
Fig. 2 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 200 includes: a first device 220, a server 240, and a second device 260.
The first device 220 is installed and operated with an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a TPS game, an FPS game, an MOBA game and a multi-player gunfight survival game. The first device 220 is a device used by a first user who uses the first device 220 to control a first virtual object located in a virtual environment for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona.
The first device 220 is connected to the server 240 through a wireless network or a wired network.
The server 240 includes at least one of a single server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 240 is used to provide background services for applications that support a three-dimensional virtual environment. Alternatively, the server 240 undertakes primary computing work while the first device 220 and the second device 260 undertake secondary computing work; alternatively, the server 240 undertakes secondary computing work while the first device 220 and the second device 260 undertake primary computing work; alternatively, the server 240, the first device 220, and the second device 260 perform cooperative computing using a distributed computing architecture.
The second device 260 is installed and operated with an application program supporting a virtual environment. The application program may be any one of a virtual reality application program, a three-dimensional map program, an FPS game, an MOBA game, and a multi-player gunfight type live game. The second device 260 is a device used by a second user who uses the second device 260 to control a second virtual object located in the virtual environment to perform activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated character or an animated character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. The first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights; alternatively, they may belong to different teams, different organizations, or two mutually hostile groups.
Optionally, the applications installed on the first device 220 and the second device 260 are the same, or are the same type of application on different control-system platforms. The first device 220 and the second device 260 may each generally refer to one of a plurality of devices; this embodiment uses only the first device 220 and the second device 260 as examples. The first device 220 and the second device 260 may be of the same or different device types, including at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop computer. The following embodiments are illustrated with the device being a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or fewer. For example, the number of the devices may be only one, or several tens or hundreds, or more. The number and the type of the devices are not limited in the embodiments of the present application.
It should be noted that the server 240 may be implemented as a physical server or as a cloud server. Cloud technology refers to a hosting technology that unifies hardware, software, network, and other resources in a wide area network or a local area network to realize the computation, storage, processing, and sharing of data. Cloud technology is a general term for the network, information, integration, management-platform, and application technologies applied in the cloud-computing business model; it can form a resource pool that is used on demand, flexibly, and conveniently. Cloud computing is becoming an important support: background services of technical network systems, such as video websites, picture websites, and other web portals, require large amounts of computing and storage resources. With the rapid development of the internet industry, each item may come to have its own identification mark that must be transmitted to a background system for logical processing; data of different levels are processed separately, and all kinds of industrial data need strong system background support, which can only be realized through cloud computing.
In some embodiments, the method provided by the embodiment of the application can be applied to a cloud game scene, so that the data logic calculation in the game process is completed through the cloud server, and the terminal is responsible for displaying the game interface.
In some embodiments, the server 240 may also be implemented as a node in a blockchain system. A blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. Essentially a decentralized database, a blockchain is a chain of data blocks linked by cryptographic methods; each data block contains information about a batch of network transactions and is used to verify the validity (anti-counterfeiting) of that information and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product service layer, and an application service layer.
With reference to the above noun introduction and application scenario, a method for interaction between virtual objects provided in the present application is described, and for example, the method is applied to a terminal, as shown in fig. 3, the method includes the following steps.
Step 310, displaying a first virtual object and a second virtual object in a virtual scene.
Optionally, the virtual scene is a scene displayed when the application runs on the terminal device. The virtual scene may be a simulated environment of the real world, for example, a scene obtained by modeling the real world; it may be a semi-simulated, semi-fictional scene, for example, one that contains things existing in the real world (such as traffic and roads) together with fictional elements (such as an island A or a volcano B that does not exist in the real world); or it may be a purely fictional scene, in which the scenes and things (such as monsters, queens, and the like) have no counterpart in the real world.
Optionally, the virtual scene is at least one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene. In the process of loading the virtual scene, there may be a process of converting a two-dimensional virtual scene into a three-dimensional virtual scene, or there may be a process of switching a three-dimensional virtual scene into a two-dimensional virtual scene, and the like. Illustratively, the virtual scene includes sky, land, sea, etc., the land may include environmental elements such as grassland, desert, city, etc., and the user may control the virtual object to move in the virtual scene.
Optionally, a virtual object refers to the image of any person or thing that can interact in the virtual scene, or a movable object in the virtual scene. The movable object may be a virtual person, a virtual animal, an animation character, or the like. The virtual object may also be an avatar that represents a player in the virtual environment. Illustratively, the virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene and occupying part of the space in the virtual scene. For example, the virtual objects are monsters, world bosses, and the like.
Optionally, the virtual environment includes a plurality of virtual objects, and the role types of different virtual objects in the virtual environment may be the same or different.
Illustratively, in an attack-avoidance-type game, the virtual objects are 50 virtual characters controlled by 50 players respectively, and the role types of the 50 virtual characters in the game are the same, for example: each virtual character's role type is a soldier who needs to collect various resources in the game and use them against virtual characters controlled by other players to win the game.
Alternatively, in a story-driven game, the virtual objects include both virtual characters controlled by players and virtual characters configured in the game that are not manipulated by real players, such as Non-Player Characters (NPCs). The role types of the player-controlled characters and the NPCs differ, for example: the NPC's role type is a guide, whose role is to lead the player characters to a destination point or through a target task, while the player character's role type is a participant, whose role is to complete the tasks assigned by the game, accumulate experience points, and so on.
The above description is only exemplary, and the present invention is not limited to the above description.
In an alternative embodiment, the virtual scene includes a first virtual object and a second virtual object, and the first virtual object and the second virtual object have different roles in the game.
The first virtual object is used for carrying out periodic prompt in the target interaction event, and the second virtual object is a virtual object participating in the target interaction event.
Optionally, the virtual scene corresponds to a scene in a game, and the game is an interactive game; alternatively, an interactive mini-game or the like may be built into the game. Illustratively, the virtual scene is the scene corresponding to an interactive mini-game, and the target interaction event is the form of interaction in that mini-game, used to instruct different virtual objects to interact with each other, where each virtual object has a certain influence on the result of the target interaction event. Illustratively, virtual object A and virtual object B participate in the target interaction event; while the event is in progress, the motion of either object may affect it, for example, the action of virtual object A or virtual object B may cause the target interaction event to end.
The staged prompt indicates stage-related prompt information issued by the first virtual object. Optionally, the staged prompt takes at least one of the following forms.
(1) The staged prompts are action prompts.
Optionally, the first virtual object is an NPC; during the target interaction event, the first virtual object performs a turn-around action, and the corresponding staged prompt is the angle of the first virtual object's turn. For example, the first virtual object "turns around" by 180° during one stage of the target interaction event, where the direction of the turn may be clockwise or counterclockwise. Illustratively, the first virtual object performs the "turn-around" action in the clockwise direction at an angular velocity of 30° per second, which indicates the stage speed or stage duration in the target interaction event.
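For illustration only, a 180° turn at 30° per second implies a stage duration of 180 / 30 = 6 seconds. A minimal sketch of the facing angle over time, with hypothetical function names not taken from the patent:

```python
def turn_duration(turn_deg: float, angular_speed: float) -> float:
    """Seconds the turning stage lasts, e.g. 180 / 30 = 6 s."""
    return turn_deg / angular_speed

def facing_at(start_deg: float, turn_deg: float,
              angular_speed: float, t: float,
              clockwise: bool = True) -> float:
    """Facing angle (degrees, wrapped to [0, 360)) t seconds into the turn."""
    progress = min(t * angular_speed, turn_deg)  # clamp once the turn completes
    sign = -1.0 if clockwise else 1.0            # clockwise decreases the angle here
    return (start_deg + sign * progress) % 360
```

For example, two seconds into a clockwise 180° turn at 30°/s, an object that started facing 0° faces 300°; after six seconds it has completed the turn and faces 180°.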
(2) The staged prompts are voice prompts.
Optionally, the first virtual object is a player-controlled virtual object, and the staged cue of the first virtual object is a "special sound effect" during the target interaction event. For example: the first virtual object plays the role of 'king' during one target interaction event, the 'king' role is configured with 'special sound effects' (such as 'biting' sound, music sound and the like), and the 'special sound effects' are used as stage prompt information.
Optionally, the first virtual object is an NPC, and the second virtual object is a player-controlled character in the course of a target interaction event with the NPC; alternatively, the first virtual object and the second virtual object are both player-controlled characters during the target interaction event. The above description is only exemplary, and the present invention is not limited to the above description.
In step 320, a first control operation on the second virtual object is received in the first stage of the target interaction event.
Illustratively, the target interaction event includes a plurality of stages, and the first virtual object and the second virtual object perform the same or different operations in different stages. Optionally, in the first stage of the target interaction event, the player performs a first control operation on the second virtual object, where the first control operation is used to control the second virtual object to execute an interaction action corresponding to the target interaction event, and the interaction action includes at least one of the following forms.
(1) The interaction action is performing a specified movement.
Illustratively, the second virtual object is a player-controlled virtual object, and the first control operation controls the movement the second virtual object performs, which may be an action carried out on land, such as walking or running; an action carried out in water, such as swimming or diving; or an action carried out in the air, such as flying or parachuting. The specified movement may differ according to the position of the second virtual object in the virtual scene. For example: at some moment the second virtual object is located on land in the virtual scene, and the player may control the second virtual object to run forward on that land.
Optionally, the player may perform the control operation on the second virtual object by sliding on the screen of the terminal; alternatively, the player may perform the control operation on the second virtual object through a peripheral device connected to the terminal, such as a joystick or a keyboard.
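The touch control described above can be sketched as a mapping from a swipe vector to a movement command; the function name, direction names, and dead-zone threshold are illustrative assumptions, not details from the application:

```python
def swipe_to_move(dx, dy, dead_zone=10.0):
    """Map a screen swipe vector (in pixels) to a movement command for the
    second virtual object; the dead zone filters out accidental touches.
    Screen coordinates are assumed: +x right, +y down."""
    if dx * dx + dy * dy < dead_zone * dead_zone:
        return None                      # too small a swipe: no movement
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

A gamepad path would replace the swipe vector with the joystick axis values but keep the same dead-zone idea.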
(2) The interaction action is completing a specified action.
Illustratively, the second virtual object is a player-controlled virtual object, and the first control operation controls the second virtual object to complete a specified action. For example: the specified action is a gesture action image displayed in the virtual scene, and the player needs to control the second virtual object to adopt the corresponding posture according to the gesture action image, so as to complete the specified action.
Optionally, in the first stage of the target interaction event, the first virtual object is in a static state. For example: the first virtual object is an NPC character and the second virtual object is a player-controlled virtual object; the NPC character remains static in the first stage of the target interaction event, while the second virtual object performs the corresponding action based on the player's first control operation.
Step 330, displaying the perspective turning action of the first virtual object in the second stage of the target interaction event.
Illustratively, the first virtual object is an NPC character; it is in a static state in the first stage of the target interaction event and starts to move in the second stage. Optionally, the movement performed by the first virtual object is a "turn-around" movement, and the perspective of the first virtual object rotates as it performs the "turn-around" movement.
The viewing angle is used to indicate the angular range over which images can be observed. Optionally, in this embodiment of the present application, the viewing angle indicates the angular range over which the first virtual object can observe the virtual scene. Illustratively, the viewing angle range of the first virtual object is 100°, indicating that the first virtual object can observe the scene within 100° of the virtual scene, for example: with the head of the first virtual object facing straight ahead, the angle from the leftmost end to the rightmost end of the sight line within which the first virtual object can observe the scene is at most 100°.
Optionally, in the target interaction event, the viewing angle range of the first virtual object is fixed, and the area covered by that range rotates correspondingly as the head of the first virtual object rotates; alternatively, the covered area rotates correspondingly as the body of the first virtual object rotates. Illustratively, as shown in fig. 4, the first virtual object is a wooden person 410 that can rotate in the virtual scene, where the rotation of the wooden person 410 includes both body rotation and head rotation. For example: the head of the wooden person 410 may rotate 180° clockwise or 180° counterclockwise; alternatively, the body of the wooden person 410 may rotate 360° clockwise or 360° counterclockwise.
Optionally, the wooden person 410 may rotate at a constant speed or at a variable speed. For example: the wooden person 410 rotates 180° clockwise at an angular velocity of 10°/s; alternatively, the wooden person 410 rotates 360° counterclockwise with an initial speed of 5°/s and an acceleration of 2°/s².
The above description is merely exemplary, and the present application is not limited thereto.
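The constant-speed and variable-speed turns described above reduce to simple angular kinematics. A minimal sketch (the function and parameter names are illustrative assumptions, not from the application):

```python
def turn_angle(t, omega0, alpha=0.0, max_angle=360.0):
    """Angle in degrees swept by the turn-around action after t seconds.

    omega0: initial angular speed in deg/s.
    alpha:  angular acceleration in deg/s^2 (0 for a constant-speed turn).
    The result is clamped to the configured maximum turn magnitude
    (e.g. 180 or 360 degrees)."""
    angle = omega0 * t + 0.5 * alpha * t * t
    return min(angle, max_angle)
```

For the 30°/s constant-speed example, `turn_angle(3, 30, 0, 180)` gives the 90° swept after 3 seconds; the 5°/s, 2°/s² variable-speed example is `turn_angle(t, 5, 2)`.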
Optionally, the perspective turning action is used to adjust the observation sight range of the first virtual object.
The observation sight range of the first virtual object changes correspondingly with the perspective turning action. Schematically, as shown in fig. 4, the sight-line orientation 420 of the first virtual object is straight ahead, where the sight-line orientation 420 is indicated by an arrow. The observation sight range 430 of the first virtual object indicates the area covered by the first virtual object's viewing angle range.
Optionally, in the target interaction event, the sight-line orientation 420 serves as the center line of the viewing angle range of the first virtual object, and the head orientation of the wooden person 410 is the same as the sight-line orientation 420 of the first virtual object. For example: with the sight-line orientation 420 (head orientation) as the center, the coverage area of the first virtual object's viewing angle range extends 60° to the left and 60° to the right of the sight-line orientation 420.
Optionally, the perspective turning action of the first virtual object is performed as the head of the first virtual object rotates (e.g., the sight-line orientation 420 is the same as the head orientation). Illustratively, when the head of the first virtual object rotates clockwise, the perspective turning action of the first virtual object also rotates clockwise. Optionally, the magnitude of the perspective turning action of the first virtual object is the same as the magnitude of its head rotation. Illustratively, when the head of the first virtual object rotates 30° clockwise, the perspective turning action of the first virtual object is also 30°.
Illustratively, the sight-line orientation 420 of the first virtual object is the same as its head orientation; when the head of the first virtual object rotates 90° clockwise, as shown in fig. 5, the sight-line orientation 510 of the first virtual object also rotates 90° clockwise, and the observation sight range 520 rotates 90° accordingly.
Illustratively, the observation sight range of the first virtual object is determined according to the perspective turning action; when a specified observation sight range needs to be reached, the perspective turning action of the first virtual object is adjusted. For example: when the specified observation sight range is the designated region 530 in fig. 5, the perspective turning action of the first virtual object is a rotation of 180° counterclockwise or 180° clockwise. The above description is merely exemplary, and the present application is not limited thereto.
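The observation sight range described above is an angular sector centred on the sight-line orientation (e.g. 60° to each side for a 120° viewing angle). Whether a character falls inside it is a sector-membership test; a sketch assuming bearings are measured in degrees, with names chosen for illustration:

```python
def in_viewing_range(gaze_deg, char_deg, fov_deg=120.0):
    """True if a character at bearing char_deg lies inside the first
    virtual object's observation sight range: a sector fov_deg wide
    centred on the sight-line orientation gaze_deg."""
    # smallest signed difference between the two bearings, in (-180, 180]
    diff = (char_deg - gaze_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

The modular difference handles wrap-around, so a gaze at 350° still sees a character at 10°.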
Step 340, displaying the staged event execution result of the second virtual object in the second stage, based on the real-time observation sight range of the first virtual object in the second stage and the execution situation of the interaction action of the second virtual object.
Optionally, the observation sight range of the first virtual object changes in real time, that is, it can cover different areas of the virtual scene at different times. Illustratively, the speed of change of the observation sight range of the first virtual object includes at least one of the following modes.
(1) The observation sight range of the first virtual object changes at a constant speed.
Optionally, the first virtual object performs a "turn-around" motion in the virtual scene, and the magnitude of the motion is the same over equal time intervals. For example: the first virtual object "turns around" at 10° per second, indicating that it rotates 10° in each one-second interval (the first second, the second second, and so on). The observation sight range of the first virtual object changes with the "turn-around" motion, for example: the area covered by the observation sight range at the initial time (second 0) is the first quadrant, and the area covered at second 9 is the second quadrant.
(2) The observation sight range of the first virtual object changes at a non-constant speed.
Optionally, the first virtual object performs a "turn-around" motion in the virtual scene, and the magnitude of the motion differs over equal time intervals; this includes uniformly accelerated motion, motion with variable acceleration, and the like, where the acceleration may be positive or negative. For example: the first virtual object performs the "turn-around" motion with an acceleration of 10°/s², so that it rotates 10° in the first one-second interval, 20° in the second, and so on; the observation sight range of the first virtual object changes as the "turn-around" motion proceeds, and the change gradually accelerates. Optionally, the "turn-around" magnitude of the first virtual object may also change irregularly from slow to fast or from fast to slow, which likewise satisfies the non-uniform change.
It should be noted that the above is merely an illustrative example, and the present application is not limited thereto.
Illustratively, the viewing angle of the first virtual object is 120°, and in the second stage of the target interaction event the first virtual object rotates clockwise at a speed of 30°/s within a rotation range of 180°. As shown in fig. 6, at second 0 the sight-line orientation of the first virtual object is point A and its observation sight range is the first area 610; at second 3 the sight-line orientation is point B and the observation sight range is the second area 620; at second 6 the sight-line orientation is point C and the observation sight range is the third area 630. The angle of the observation sight range corresponding to each of the first area 610, the second area 620, and the third area 630 is 120°, i.e., the viewing angle of the first virtual object. The progression from the first area 610 to the second area 620, and then from the second area 620 to the third area 630, is the real-time change process of the observation sight range of the first virtual object, and this change process corresponds to time.
Optionally, when the first virtual object rotates to the preset maximum angle, for example to point C, the first virtual object may stop at the position corresponding to point C (accordingly, the observation sight range of the first virtual object stops at the third area 630), or may rotate counterclockwise back to the initial position at the same speed. For example: the first virtual object rotates at 30°/s from the position corresponding to point C back to the position corresponding to point A (accordingly, the observation sight range of the first virtual object rotates from the third area 630 back through the second area 620).
It should be noted that the above is merely an illustrative example of the real-time observation sight range; this embodiment of the present application does not limit the direction, angle, area, and the like of the observation sight range during the real-time process.
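The sweep described above (A at second 0, B at second 3, C at second 6, then either stopping at the maximum angle or rotating back at the same speed) can be sketched as a time-to-bearing function; the `"stop"`/`"reverse"` mode names are illustrative assumptions:

```python
def gaze_at(t, speed=30.0, max_angle=180.0, mode="stop"):
    """Gaze bearing (degrees clockwise from the starting orientation) at
    time t, for a turn that sweeps to max_angle and then either stays
    there ("stop") or rotates back at the same speed ("reverse")."""
    sweep = speed * t
    if mode == "stop":
        return min(sweep, max_angle)
    # "reverse": ping-pong between 0 and max_angle at constant speed
    period = 2.0 * max_angle
    phase = sweep % period
    return phase if phase <= max_angle else period - phase
```

With the figure's numbers, `gaze_at(3)` is point B (90°) and `gaze_at(6)` is point C (180°); in `"reverse"` mode the bearing then decreases again.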
Optionally, in the first stage of the target interaction event, the second virtual object moves in the virtual scene; when the target interaction event enters the second stage, the first virtual object starts to perform the perspective turning action, and the second virtual object needs to adjust its movement manner according to the perspective turning motion situation of the first virtual object in the second stage, where the perspective turning motion situation includes the rotation direction, the rotation speed, and the like.
Optionally, in the second stage of the target interaction event, the first virtual object completes a 180° clockwise rotation within 1 second; this rotation process is the perspective turning action of the first virtual object. The second virtual object needs to adjust its movement manner according to the perspective turning motion situation of the first virtual object in the second stage.
Optionally, the game is configured such that the second virtual object must stop moving after the warning tone of the first stage stops; otherwise, the second virtual object is regarded as eliminated.
Illustratively, in the first stage of the target interaction event, the player may control the second virtual object to move in the virtual scene, and during the first stage a warning tone is present in the virtual scene, such as music or an alarm sound. Optionally, the stopping of the warning tone means that the first stage has ended and the second stage has started: the first virtual object begins the perspective turning motion, and the player needs to control the second virtual object to stop moving in the virtual scene. Alternatively, the stopping of the warning tone means that the first stage has ended but the second stage, in which the first virtual object begins the perspective turning motion, has not yet started; the player nevertheless needs to control the second virtual object to stop moving according to the warning tone. That is: whether or not the second virtual object is within the viewing angle range of the first virtual object, when the warning tone stops, the second virtual object must stop its movement action.
Optionally, the game is configured such that the second virtual object must stop moving after the warning tone of the second stage starts; otherwise, it is regarded as eliminated.
Illustratively, the player may control the second virtual object to move in the virtual scene during the first stage of the target interaction event, during which no warning tone is present in the virtual scene (either there is no background sound, or the background sound is a non-warning tone). Optionally, in the second stage, a warning tone is played in the virtual scene; playing the warning tone indicates that the target interaction event has entered the second stage, the first virtual object begins the perspective turning motion, and the player needs to control the second virtual object to stop moving in the virtual scene. Alternatively, playing the warning tone means that the first virtual object is about to begin the perspective turning motion, and the second virtual object must stop its movement action while the warning tone lasts. For example: the warning tone is a beep lasting 1 second; after the warning tone ends, the first virtual object begins the perspective turning motion, and the second virtual object should have stopped moving in the virtual scene. That is: whether or not the second virtual object is within the viewing angle range of the first virtual object, when the warning tone stops, the second virtual object must stop its movement action.
Optionally, the game is configured such that, after the perspective turning action of the first virtual object's sight line is completed, a second virtual object located within the observation sight range of the first virtual object must have stopped moving; otherwise, the second virtual object is regarded as eliminated.
Schematically, as shown in fig. 4, the first virtual object is the wooden person 410, with the sight-line orientation of the wooden person 410 taken as the forward direction; the second virtual objects are a plurality of virtual characters (moved under player control) located behind the wooden person 410, and the virtual characters located in the X area 440 need to stop moving according to the requirements of the target interaction event. If the wooden person 410 rotates clockwise, the time for the virtual characters in the left area to stop moving is slightly longer than that for the virtual characters in the right area, that is: the virtual characters in the left area have a longer adjustment time for stopping movement than those in the right area. The left area and the right area are relative divisions within the X area 440.
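The point that characters farther along the rotation direction get more time to stop can be made concrete: for a clockwise turn whose gaze starts at bearing 0°, the time before a character's bearing enters the sight range grows with its clockwise offset. A sketch with illustrative names and the 120° viewing angle assumed above:

```python
def time_until_seen(char_bearing, speed=30.0, fov_deg=120.0):
    """Seconds until a clockwise turn starting with its gaze at bearing 0
    brings a character at char_bearing (degrees clockwise, in [0, 360))
    inside the observation sight range. The leading edge of the range is
    at gaze + fov_deg/2, so characters already within the half-angle are
    seen immediately."""
    entry = char_bearing - fov_deg / 2.0   # gaze angle at which the
    return max(entry, 0.0) / speed         # character enters the range
```

A character at 150° clockwise is reached after 3 seconds at 30°/s, while one at 30° is inside the range from the start, so it has no adjustment time at all.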
In an optional embodiment, in response to the second virtual object being within the real-time observation sight range of the first virtual object in the second stage, the execution situation of the interaction action by the second virtual object is determined; and the staged event execution result of the second virtual object in the second stage is displayed based on the execution situation of the interaction action.
Optionally, based on the result of the second virtual object adjusting its movement manner, the execution situation of the interaction action of the second virtual object can be determined. Illustratively, the execution situation of the interaction action of the second virtual object is the execution situation of the target interaction event by the second virtual object.
Optionally, after the real-time observation sight range of the first virtual object in the second stage and the execution situation of the interaction action of the second virtual object are determined, the staged event execution result of the second virtual object in the second stage is displayed. The staged event execution result includes at least one of the following situations.
(1) The result of the execution of the staged event is successful execution.
In an alternative embodiment, the staged success result of the second virtual object in the second stage is displayed in response to the second virtual object pausing execution of the interaction action within the real-time observation sight range of the first virtual object.
Illustratively, as shown in fig. 7, after the wooden person 710 rotates clockwise by a certain angle, the plurality of virtual characters 720 that are in a stopped state within the observation sight range of the wooden person 710 are regarded as having executed the staged event successfully; the plurality of virtual characters 730 located outside the observation sight range of the wooden person 710 need to be in a stationary state by the time the observation sight range of the wooden person 710 moves to the area where they are located. If the wooden person stops after rotating 180° clockwise, the virtual characters in a stopped state within the observation sight range of the wooden person, and the virtual characters in a stopped state outside that range, are all regarded as having executed the staged event successfully.
Alternatively, while the wooden person 710 rotates clockwise by a certain angle, the virtual characters 730 outside the observation sight range of the wooden person 710 may continue to move, but should be ready to stop moving at any time. If the wooden person stops after rotating 180° clockwise, the virtual characters in a stopped state within the observation sight range of the wooden person, and the virtual characters in a stopped state outside that range, are regarded as having executed the staged event successfully.
It should be noted that the above is merely an illustrative example, and the present application is not limited thereto.
(2) The result of the execution of the staged event is an execution failure.
In an alternative embodiment, the staged failure result of the second virtual object in the second stage is displayed in response to the second virtual object continuing to perform the interaction action within the real-time observation sight range of the first virtual object.
Illustratively, as shown in fig. 7, after the wooden person 710 rotates clockwise by a certain angle, a virtual character 720 that is located within the observation sight range of the wooden person 710 but has not stopped moving is regarded as having failed the staged event; if a virtual character 730 outside the observation sight range of the wooden person 710 is still moving when the observation sight range of the wooden person 710 moves to the area where it is located, that virtual character is likewise regarded as having failed the staged event.
Alternatively, after the wooden person 710 rotates clockwise by a certain angle, the virtual characters 730 outside the observation sight range of the wooden person 710 may continue to move, but should be ready to stop moving at any time; if the wooden person 710 stops after rotating 180° clockwise, the virtual characters still in a moving state within the observation sight range of the wooden person, and those still in a moving state outside that range, are regarded as having failed the staged event. Optionally, a second virtual object that fails the staged event may be regarded as eliminated.
The above description is merely exemplary, and the present application is not limited thereto.
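The success and failure rules above can be summarised in one decision function. This is a sketch under the assumption (from the alternative embodiment) that a character outside the sight range may keep moving until the range reaches it, so its result is still pending; all names are illustrative:

```python
def staged_result(char_bearing, char_moving, gaze_deg, fov_deg=120.0):
    """Staged event result for one second virtual object.

    A character inside the real-time observation sight range that is still
    performing its interaction action fails the stage (is eliminated); a
    character that has stopped succeeds, whether inside or outside the
    range; a moving character outside the range is not yet decided."""
    diff = (char_bearing - gaze_deg + 180.0) % 360.0 - 180.0
    in_view = abs(diff) <= fov_deg / 2.0
    if in_view and char_moving:
        return "failure"
    return "success" if not char_moving else "pending"
```

Evaluating this each frame, and once more when the turn reaches its 180° endpoint, reproduces the outcomes described for fig. 7.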
In an alternative embodiment, in the second stage, a second control operation on the second virtual object is received.
Illustratively, in the second stage of the target interaction event, the second virtual object needs to adjust its movement manner according to the perspective turning motion situation of the first virtual object in the second stage. This adjustment serves as the second control operation, whose purpose is to control the second virtual object to pause execution of the interaction action and keep its balance.
Illustratively, the interaction action is a movement action of the second virtual object (e.g., walking, running, etc.), and performing the second control operation on the second virtual object instructs the second virtual object to stop the ongoing movement action. Optionally, depending on differences in the manner, magnitude, and the like of the interaction action, the difficulty of pausing the interaction action differs when the second virtual object is controlled to pause it.
For example: the interaction action of the second virtual object is slow walking; when the second virtual object is controlled to pause the interaction action, it needs to stop walking slowly, and because the action magnitude of this interaction action is small, the difficulty of pausing it is low. Alternatively, the interaction action of the second virtual object is sprint running; when the second virtual object is controlled to pause the interaction action, it needs to stop the sprinting action, and because the action magnitude of this interaction action is large, the difficulty of pausing it is high. The above description is merely exemplary, and the present application is not limited thereto.
In summary, while the target interaction event is in progress, the second virtual object performs an interaction action based on the first control operation and adjusts the interaction action according to the staged prompt provided by the first virtual object, so that the staged execution result of the second virtual object is finally determined. With this method, when the second virtual object participates in the target interaction event, it must perform the interaction action in the virtual scene based on the first control operation and adjust the interaction action according to the observation sight range of the first virtual object; the execution situation of the interaction action is then displayed based on the adjustment result and the action state of the second virtual object, thereby determining the staged execution result of the second virtual object. This avoids judging the game result solely by the single influence factor of the first virtual object's perspective turning action, increases the factors influencing success or failure as well as the competitive excitement points, and effectively improves the interest of the game.
In an alternative embodiment, in the second stage of the target interaction event, certain control operations need to be performed on the second virtual object according to the perspective turning action of the first virtual object. Illustratively, as shown in fig. 8, the second stage of the target interaction event further includes the following steps 810 to 840.
In step 810, in the second stage, a balance control screen is displayed.
Optionally, in the second stage of the target interaction event, the perspective turning action of the first virtual object is displayed. For example: the first virtual object is an NPC character configured in the game, and the second virtual object is a player-controlled virtual character; in the second stage of the target interaction event the first virtual object performs a turn-around action, and the player's game goal is to keep the second virtual object in a static state while it is within the viewing angle range of the first virtual object. Based on this game goal, the player needs to control the second virtual object to adjust the interaction action performed in the first stage. Optionally, the difficulty of achieving the game goal differs according to differences in the second virtual object's interaction action.
In an alternative embodiment, a pause control operation is received in the second stage. The pause control operation is used to control the second virtual object to pause execution of the interaction action.
Illustratively, the interaction action of the second virtual object is sprint running; when the second virtual object is controlled to pause the interaction action, it needs to stop the sprinting action, and because the action magnitude of the interaction action is large, the difficulty of pausing is high. When the interaction action is paused, the player can control the balance of the second virtual object so as to keep it in a paused state.
In an alternative embodiment, the staged failure result of the second virtual object in the second stage is determined in response to no pause control operation being received while the second virtual object is within the real-time observation sight range of the first virtual object.
Illustratively, after the first virtual object performs the "turn-around" motion, the observation sight range of the first virtual object changes along with the motion. When the second virtual object is still performing the interaction action within the real-time observation sight range of the first virtual object, the staged failure result of the second virtual object in the second stage is determined, where the staged failure result includes the second virtual object being eliminated. Alternatively, when the second virtual object pauses the interaction action within the real-time observation sight range of the first virtual object but is not in a balanced state, the staged failure result of the second virtual object in the second stage is likewise determined, including the second virtual object being eliminated.
In an alternative embodiment, in response to the second virtual object receiving the pause control operation within the real-time observation sight range of the first virtual object, the balance control screen is displayed based on the pause control operation.
Optionally, in response to receiving an instruction to pause the interaction action of the second virtual object, the balance control screen is displayed. For example: after the first virtual object begins the "turn-around" action, the player controls the second virtual object to pause its running. Illustratively, the player participates in the target interaction event through a mobile phone terminal: the second virtual object is controlled to run by sliding a direction control on the phone screen, and when the player needs to pause the running action, the pause is achieved by lifting the finger off the screen. Alternatively, the player participates in the target interaction event through a television terminal: the second virtual object is controlled to run by pushing a direction control on a gamepad connected to the television terminal, and when the player needs to pause the running action, the pause is achieved by stopping the control operation on the gamepad, and so on. After the mobile phone terminal, the television terminal, or the like receives the instruction to pause the interaction action, the balance control screen is displayed on the screen corresponding to that terminal. The balance control screen includes a balance control display element.
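The pause-control flow just described (releasing the direction control pauses the run and brings up the balance control screen) can be sketched as a tiny state machine; the class and method names are illustrative assumptions, not from the application:

```python
class SecondVirtualObject:
    """Minimal sketch of the pause control operation: pressing the
    direction control moves the object, and releasing it (finger lifted
    off the screen, or gamepad control released) pauses the interaction
    action and displays the balance control screen."""

    def __init__(self):
        self.moving = False
        self.balance_screen_shown = False

    def press_direction_control(self):
        self.moving = True

    def release_direction_control(self):
        # the pause control operation: stop the run, show the balance screen
        if self.moving:
            self.moving = False
            self.balance_screen_shown = True
```

The same two transitions cover both the touch-screen and the gamepad input paths; only the event source differs.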
Illustratively, the balance control display element is used to assist the second virtual object in entering a balanced state. For example, the player adjusts and determines the balance state of the second virtual object according to the balance control display element in the balance control screen.
In an alternative embodiment, the balance control display element in the balance control screen is determined based on the executed stage of the interaction action at the moment the second virtual object pauses the interaction action.
Illustratively, when the first virtual object adjusts its observation sight range, the player controls the second virtual object to pause the interactive action, for example the running action. Based on the pause operation, the second virtual object stops running in the virtual scene; this stopped state is the executed stage of the interactive action. Optionally, different balance control display elements are presented in the balance control screen according to the state of the second virtual object at the moment the interactive action is paused; that is, the balance control display element corresponds to the executed stage. The display of the balance control display element may therefore differ with the state of the second virtual object, and the relation between the state of the second virtual object and the display of the balance control display element optionally includes at least one of the following.
(1) Different balance control display elements are displayed according to the posture state of the second virtual object.
Optionally, when the interactive action of the second virtual object is complex or its amplitude is large, the difficulty of controlling the second virtual object to enter the balanced state is high; when the interactive action is simple or its amplitude is small, the difficulty is low. Illustratively, when the interactive action is a running action, fig. 9 shows the posture states of the second virtual object when the running action is paused, with the balance difficulty coefficient increasing from left to right. For example: the first posture 910 indicates that the second virtual object is in a running state standing on one foot; the second posture 920 indicates that one foot is slightly raised; the third posture 930 indicates that one foot takes a small step; the fourth posture 940 indicates that one foot takes a large step; and the fifth posture 950 indicates that one foot takes a very large step, and so on. Accordingly, the balance control screen displays balance control display elements matching the difficulty of the second virtual object entering the balanced state.
For example: the balance control display elements are displayed in the form of controls; the greater the difficulty of the second virtual object entering the balanced state, the greater the number of balance control display elements, and the player needs to trigger every balance control display element within a specified time. Alternatively, the greater the difficulty of the second virtual object entering the balanced state, the faster the balance control display elements appear, and the player must not miss any of them, and so on.
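The posture-based difficulty scaling described above can be sketched in code. This is a minimal illustration: the function name, counts, and intervals below are assumptions for illustration, not values from the embodiment.

```python
# Hypothetical sketch: map a posture-based balance difficulty (1 = easiest,
# 5 = hardest, loosely matching postures 910-950) to how many balance
# control display elements are shown and how quickly they appear.
# All names and constants here are illustrative assumptions.

def balance_display_params(difficulty: int) -> dict:
    """Return display parameters scaled with the difficulty of the
    second virtual object entering the balanced state."""
    if not 1 <= difficulty <= 5:
        raise ValueError("difficulty must be in 1..5")
    return {
        "control_count": 2 + difficulty,                       # more controls when harder
        "spawn_interval_s": round(0.6 - 0.1 * difficulty, 2),  # faster spawns when harder
    }
```

A harder posture thus yields both more elements and a faster appearance rate, matching the two variants described above.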
(2) Different balance control display elements are displayed according to the position state of the second virtual object.
Illustratively, different display states of the balance control display element are determined according to the positional relation between the second virtual object and the first virtual object. For example: the closer the position of the second virtual object is to the real-time observation sight range of the first virtual object, the shorter the time allowed for controlling the second virtual object to enter the balanced state, and the greater the difficulty. Accordingly, the balance control screen displays balance control display elements matching the difficulty associated with this time difference. For example: the balance control display element is displayed in the form of a pointer; the shorter the time allowed for the second virtual object to enter the balanced state, the smaller the range of the balance control display element, and the player needs to perform balance adjustment within that range. Alternatively, the shorter the time allowed, the higher the sensitivity of the balance control display element, and the player must keep it as steady as possible within its range.
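The position-based variant can be sketched the same way. The clamp range, time formula, and zone ratio below are illustrative assumptions, not values from the embodiment:

```python
# Hypothetical sketch: the closer the second virtual object is to the first
# virtual object's observation sight range, the shorter the time allowed to
# enter the balanced state and the narrower the pointer's balance area.
# Thresholds and formulas are illustrative assumptions.

def balance_window(distance_to_sight: float) -> dict:
    """distance_to_sight: distance (in scene units) between the second
    virtual object and the edge of the first object's observation range."""
    d = max(0.0, min(distance_to_sight, 10.0))   # clamp to 0..10 units
    return {
        "time_limit_s": 1.0 + 0.4 * d,           # less time when closer
        "balance_zone_ratio": 0.1 + 0.03 * d,    # narrower zone when closer
    }
```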
It should be noted that the above is only an illustrative example. The display differences of the balance control display elements may be applied separately, for example by differentiating only according to the posture state of the second virtual object or only according to its position state; they may also be applied in combination, for example by differentiating the display of the balance control display element according to both the posture state and the position state of the second virtual object. This is not limited in the embodiments of the present application.
In step 820, a second control operation is received on the balance control screen.
The second control operation is used for controlling the element display result corresponding to the balance control display element.
Optionally, the difficulty of performing the second control operation differs with the balance control display elements. When the difficulty of controlling the second virtual object to enter the balanced state is high, more balance control display elements are displayed on the balance control screen and they appear faster, so the second control operation by which the player controls the virtual object is more difficult and more complicated to perform.
Illustratively, the balance control display element displayed on the balance control screen is a balance control, and the player needs to trigger the balance controls in the order in which they appear. For example: as shown in fig. 9, when the second virtual object is in the first posture 910, few balance controls are displayed on the balance control screen, the controls appear and disappear slowly, and the player has more time to trigger them, thereby keeping the second virtual object balanced; when the second virtual object is in the fifth posture 950, many balance controls are displayed, the controls appear and disappear quickly, the time for triggering them is short, and the player can keep the second virtual object balanced only with high concentration and fast reactions.
Based on the balance control display element and the second control operation, an element display result corresponding to the balance control display element is determined. The element display result corresponds to the balance of the second virtual object.
Illustratively, the balance control display element is a balance control, and the element display result is a trigger result for the balance control; when the player's triggering of the balance control meets the standard specified by the game, the requirement for controlling the balance of the second virtual object is considered to be met. For example: the standard specified by the game is that all balance controls are clicked in order within a certain time. When the player performs effective click operations on all balance controls in order within the specified time, the requirement for controlling the balance of the second virtual object is considered met, that is, the second virtual object enters the balanced state in the virtual scene. When the player does not perform effective click operations on all balance controls in order within the specified time (for example, a click is missed or the clicks are out of order), the requirement is considered not met, that is, balance control of the second virtual object in the virtual scene fails. It should be noted that the above is merely an illustrative example; the balance control display elements may take various forms in different balance control screens, which is not limited in the embodiments of the present application.
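The trigger-result check described above (all controls clicked, in order, within the specified time) can be sketched as follows; the click representation and function name are illustrative assumptions:

```python
# Hypothetical sketch of the trigger-result check: the second virtual
# object enters the balanced state only if every balance control is
# clicked, in order, within the allowed time. The (id, timestamp)
# representation of clicks is an illustrative assumption.

def meets_balance_requirement(clicks, expected_order, time_limit_s):
    """clicks: list of (control_id, timestamp_s) in the order performed."""
    clicked_ids = [cid for cid, _ in clicks]
    if clicked_ids != list(expected_order):           # missed or out-of-order click
        return False
    return all(t <= time_limit_s for _, t in clicks)  # all within the time limit
```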
In summary, when the target interaction event is carried out, the second virtual object performs an interactive action based on the first control operation and adjusts the interactive action according to the staged prompt provided by the first virtual object, and the staged execution result of the second virtual object is finally determined. In this method, when the second virtual object takes part in the target interaction event, the staged execution result is determined based on both the adjustment result of the interactive action and the action state of the second virtual object, which effectively improves the interest of the game.
In the method provided by this embodiment of the application, while controlling the second virtual object to play the game, the player must consider not only the influence of the observation sight range of the first virtual object on the game result but also the influence of the balance of the second virtual object itself. For example: after the second virtual object is controlled to pause the interactive action, a balance control screen is displayed; the balance control display element in the balance control screen is related to the balance difficulty coefficient of the second virtual object, and the game result is obtained based on the second control operation on the balance control screen. This adds a judgment factor to the game result and effectively improves the entertainment of the game.
In an alternative embodiment, the balance control screen takes at least one of the following forms, depending on the display mode: (I) a control distribution screen; (II) a pointer control screen; and (III) a movement control screen. Different balance control screens correspond to different second control operations. Schematically, as shown in fig. 10, the second control operations corresponding to the different balance control screens differ: the second control operation corresponding to the control distribution screen may be implemented as steps 1010 to 1011 below; the second control operation corresponding to the pointer control screen may be implemented as steps 1020 to 1021 below; and the second control operation corresponding to the movement control screen may be implemented as steps 1030 to 1031 below.
(I) Control distribution screen
Step 1010, displaying a control distribution picture.
Optionally, the control distribution screen includes a plurality of balance controls, and the player controls the balance of the second virtual object in the virtual scene by performing trigger operations on the balance controls in the control distribution screen.
The balance controls may have the same size, shape, color, and the like, or different sizes, shapes, colors, and the like. Optionally, the control distribution screen includes, as balance control display elements, at least one balance control distributed in sequence. The sequential distribution of the balance controls includes at least one of the following.
(1) The order of the balance controls is determined according to the identification on the balance controls.
Optionally, in the control distribution screen, different balance controls are distinguished by identifiers on the controls. For example: the identifiers are English letters, each balance control is marked with one letter, and the order is from a to z; or the identifiers are colors, each balance control uses a different color, and the color order is red, orange, yellow, green, blue, purple; or the identifiers are Arabic numerals, each balance control is marked with a numeral, and the order is the order of the Arabic numerals.
Schematically, fig. 11 shows a control distribution screen. The balance controls are controls of the same size and shape, distinguished by the different Arabic numerals marked on them. Optionally, the order of the Arabic numerals indicates the order of the balance controls, for example: balance control 1110 numbered 1 indicates the first balance control, balance control 1120 numbered 2 indicates the second balance control, balance control 1130 numbered 3 indicates the third balance control, and so on.
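Determining the trigger order from the numeric identifiers can be sketched as a simple sort; the data layout below is an illustrative assumption:

```python
# Hypothetical sketch: when balance controls are labelled with Arabic
# numerals (as in fig. 11), the trigger order is the ascending order of
# the labels. Representing each control as a dict is an assumption.

def trigger_order(controls):
    """Return control ids in the order the player must trigger them."""
    return [c["id"] for c in sorted(controls, key=lambda c: c["label"])]
```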
(2) The order of the balance controls is determined according to the order in which the balance controls are displayed.
Optionally, in the control distribution screen, the style (size, shape, etc.) of the balance controls is not fixed; for example, the displayed balance controls may be circles, squares, triangles, irregular shapes, and the like. Illustratively, at most 1 balance control is displayed on the control distribution screen at a time, and its style is random: the balance control displayed first indicates the first balance control, and after it disappears, the balance control displayed next indicates the second balance control. Alternatively, at most 2 balance controls are displayed at a time, all being circular controls of the same size: the balance control displayed first indicates the first balance control, and while the first balance control has not yet been eliminated from the control distribution screen, the second balance control is displayed in another area of the screen. Optionally, the order in which the balance controls are displayed is the same as the order in which they disappear, for example: a first balance control and a second balance control are displayed on the control distribution screen; when a third balance control is displayed, the first balance control disappears, and when a fourth balance control is displayed, the second balance control disappears.
The above description is only exemplary, and the present invention is not limited to the above description.
Step 1011, a sequential clicking operation on the at least one balance control is received as the second control operation.
Optionally, the trigger operation on a balance control is realized by clicking it. For example: the player carries out the target interaction event on a mobile phone terminal, a control distribution screen is displayed on the phone screen, and the trigger operation on a balance control is realized by clicking the balance control in the control distribution screen. Illustratively, after a balance control is clicked, it disappears from the control distribution screen (slowly, suddenly, and the like); alternatively, the balance control turns from a color to gray (for example, from green to gray) in the control distribution screen.
After the order of the at least one balance control is determined, clicking the balance controls in that order serves as the second control operation.
Illustratively, as shown in fig. 11, the order of the balance controls is determined by the different Arabic numerals marked on them. For example: balance control 1110, numbered 1, indicates the first balance control, and balance control 1120, numbered 2, indicates the second balance control, and so on. The balance controls are clicked according to the order of the Arabic numerals displayed on them, for example in the order "balance control 1110-balance control 1120-balance control 1130-balance control 1140-balance control 1150".
Optionally, after a balance control is triggered, it disappears from the control distribution screen; a balance control that has not been triggered may also disappear automatically. Illustratively, the manner in which the balance controls are displayed and disappear in the control distribution screen includes at least one of the following.
(1) The balance controls are displayed on the control distribution screen at the same time and disappear in sequence.
Illustratively, n balance controls marked with different Arabic numerals are displayed simultaneously on the control distribution screen, and they disappear from the screen in sequence according to the Arabic numerals marked on them.
For example: 3 balance controls marked with Arabic numerals are displayed simultaneously on the control distribution screen, namely balance control 1, balance control 2 and balance control 3, where balance control 1 denotes the balance control marked with the Arabic numeral 1. Balance control 1 disappears 0.1 second after being displayed, balance control 2 disappears 0.1 second after balance control 1 disappears, and balance control 3 disappears 0.1 second after balance control 2 disappears; that is, the intervals at which different balance controls disappear are the same. Alternatively, balance control 1 disappears 0.1 second after being displayed, balance control 2 disappears 0.3 second after balance control 1 disappears, and balance control 3 disappears 0.5 second after balance control 2 disappears; that is, the intervals at which different balance controls disappear are different.
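The cumulative disappearance times in the two numeric examples above can be computed from the per-control intervals; this sketch is an illustrative assumption, not code from the embodiment:

```python
# Hypothetical sketch: given the interval before each balance control
# disappears (the first measured from when the screen appears, the rest
# from the previous disappearance), return the cumulative times.

def disappearance_times(intervals_s):
    """Return cumulative disappearance times, one per balance control."""
    times, t = [], 0.0
    for dt in intervals_s:
        t += dt
        times.append(round(t, 3))  # round to avoid float accumulation noise
    return times
```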
(2) The balance controls are sequentially displayed and sequentially disappear on the control distribution screen.
Optionally, during display, n balance controls marked with different Arabic numerals are sequentially displayed on the control distribution screen; they then disappear in order according to the Arabic numerals marked on them.
Illustratively, balance control 1 (the balance control marked with the Arabic numeral 1) is displayed first on the control distribution screen, followed by balance control 2, then balance control 3, and so on. The order in which the balance controls disappear is the same as the order in which they are displayed, that is: they disappear in the order balance control 1, balance control 2, balance control 3.
Optionally, the time interval at which the balance controls are sequentially displayed is the same as the time interval at which they sequentially disappear. For example, both intervals are 0.1 second: 0.1 second after balance control 1 is displayed, balance control 2 is displayed and balance control 1 disappears at the same time; 0.1 second after balance control 2 is displayed, balance control 3 is displayed and balance control 2 disappears; and so on. That is, when the display interval is the same as the disappearance interval, exactly 1 balance control is shown in the control display screen at a time, and the balance controls, distinguished by the Arabic numerals marked on them, are displayed and disappear in turn. The positions of the different balance controls in the control display screen may be the same or different.
Optionally, the time interval at which the balance controls are sequentially displayed differs from the time interval at which they sequentially disappear. For example, the display interval is 0.1 second and the disappearance interval is 0.2 second: 0.1 second after balance control 1 is displayed, balance control 2 is displayed; 0.1 second after balance control 2 is displayed, balance control 3 is displayed, and at the same time balance control 1 disappears; and so on. That is, when the display interval differs from the disappearance interval, the number of balance controls shown in the control display screen is not 1, and it depends on the display interval and the disappearance interval. The positions of the different balance controls in the control display screen may be the same or different.
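How many balance controls are visible at once, given a display interval and a disappearance schedule, can be sketched as a small simulation; the parameter names are illustrative assumptions:

```python
# Hypothetical sketch: control i (0-based) appears at i * show_dt (the
# first at t = 0) and disappears at first_hide + i * hide_dt. Counting
# both events gives the number of controls visible at time t.

def visible_controls(n, show_dt, hide_dt, first_hide, t):
    """Count balance controls visible at time t."""
    shown = sum(1 for i in range(n) if i * show_dt <= t)
    hidden = sum(1 for i in range(n) if first_hide + i * hide_dt <= t)
    return shown - hidden
```

With equal intervals (show and hide every 0.1 s, first disappearance at 0.1 s) exactly one control is visible at a time; with a 0.1 s display interval and a 0.2 s disappearance interval, more than one control can be on screen, matching the two paragraphs above.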
The above are merely exemplary, and the embodiments of the present application are not limited thereto.
In an alternative embodiment, the interval duration at which the balance controls disappear may be a preset fixed value or a value that varies with the difficulty of the game. For example: when it is more difficult for the second virtual object to keep its balance, the disappearance interval is shorter, that is, the balance controls disappear faster, and the balance controls that have not yet disappeared must be clicked more quickly to keep the second virtual object balanced.
(II) Pointer control screen
Step 1020, a pointer control screen is displayed.
The pointer control screen includes, as balance control display elements, a pointer in a moving state and a progress bar corresponding to the moving state.
Illustratively, as shown in fig. 12, the pointer control screen includes a pointer 1210 in a moving state and a progress bar 1220 corresponding to the moving state, and the area indicated by the pointer 1210 is an area on the progress bar 1220. Optionally, the progress bar 1220 includes two sections: a balance area 1230 corresponding to the balanced state, and a danger area 1240, which is the part of the progress bar 1220 other than the balance area 1230.
The balance area 1230 is used to indicate that, when the pointer 1210 is located in the balance area 1230, the balance of the second virtual object is in a controllable state; the danger area 1240 is the opposite of the balance area 1230: when the pointer 1210 is located in the danger area 1240, the balance of the second virtual object is in an uncontrollable state, that is, the second virtual object loses its balance in the virtual scene. Illustratively, when the pointer 1210 is located in the danger area 1240, the second virtual object cannot remain balanced and is eliminated.
Optionally, the balance area 1230 corresponds to the middle of the progress bar 1220, indicating that the left-right balance of the second virtual object needs to be controlled; the danger area 1240 corresponds to the two sides of the progress bar 1220, that is, the danger area 1240 includes two parts, an area on the left of the progress bar 1220 and an area on the right. Optionally, the balance area 1230 and the danger area 1240 of the progress bar 1220 are distinguished by different styles, for example: the balance area 1230 appears green and the danger area 1240 appears red.
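Classifying the pointer position into the balance area or the danger area can be sketched as follows; modelling the progress bar as positions 0..1 with a centred balance area is an illustrative assumption:

```python
# Hypothetical sketch: the progress bar is modelled as positions 0..1 with
# a centred balance area of configurable half-width. The pointer is in a
# controllable state only while it stays inside that area.

def pointer_zone(position: float, balance_half_width: float = 0.15) -> str:
    """Classify a pointer position on the progress bar
    (0 = far left, 1 = far right, 0.5 = centre of the balance area)."""
    if abs(position - 0.5) <= balance_half_width:
        return "balance"   # balance of the second virtual object is controllable
    return "danger"        # second virtual object loses balance / is eliminated
```

Shrinking `balance_half_width` reproduces the difficulty scaling described later: a narrower balance area is harder to stay inside.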
Step 1021, receiving a control operation for the pointer as a second control operation.
Optionally, the control operation on the pointer is used to keep the pointer pointing to the balance area, thereby controlling the second virtual object to keep its balance; this is the purpose of the second control operation. Illustratively, the control operation on the pointer includes at least one of the following.
(1) The control operation on the pointer is realized by triggering the pointer control screen.
Optionally, the pointer is tilted to the left by triggering a designated area on the left side of the pointer control screen, and tilted to the right by triggering a designated area on the right side. The designated area includes both a designated region of the pointer control screen (for example, the left half of the screen) and a designated control provided on the pointer control screen.
Illustratively, as shown in fig. 12, the pointer control screen includes two balance buttons 1250, one on each of its left and right sides. The pointer 1210 represents the swing state of the second virtual object 1260 in the pointer control screen, and the balance buttons 1250 are used to assist the second virtual object 1260 in adjusting the swing state so that the second virtual object 1260 is as balanced as possible, that is, so that the pointer 1210 stays in the balance area 1230 as much as possible. For example: when the pointer corresponding to the second virtual object 1260 tilts to the left in the pointer control screen, the player clicks the balance button 1250 on the right side to apply a rightward pull to the second virtual object 1260, which helps reduce the leftward tilt of the pointer 1210 so that the pointer 1210 stays in the balance area 1230 as much as possible.
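One tick of this balance-button interaction can be sketched as drift plus an opposing pull; the magnitudes and names are illustrative assumptions:

```python
# Hypothetical sketch: each tick the pointer drifts by some amount, and
# clicking the left/right balance button applies an opposing pull.
# Drift and pull magnitudes are illustrative assumptions.

def step_pointer(position, drift, button=None, pull=0.05):
    """Advance the pointer one tick; `button` is 'left', 'right' or None."""
    position += drift
    if button == "left":
        position -= pull   # left button pulls the pointer left
    elif button == "right":
        position += pull   # right button pulls the pointer right
    return min(1.0, max(0.0, position))  # clamp to the progress bar
```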
(2) The control operation on the pointer is realized by performing balance adjustment through an external device connected to the terminal displaying the pointer control screen.
Illustratively, the pointer control screen is the balance control screen corresponding to game x, and is used for controlling the second virtual object in game x to keep its balance. Game x can run on the television terminal: after the pointer control screen is displayed, the player adjusts the pointing direction of the pointer 1210 by operating a direction control on a gamepad connected to the television terminal, so that the pointer 1210 points to the balance area 1230 as much as possible. For example: when the pointer corresponding to the second virtual object 1260 tilts to the right in the pointer control screen, the player clicks the left direction control of the gamepad to apply a leftward pull to the second virtual object 1260, which helps reduce the rightward tilt of the pointer 1210 so that the pointer 1210 stays in the balance area 1230 as much as possible. The above description is only exemplary, and the present application is not limited thereto.
In an alternative embodiment, factors such as the sensitivity of the pointer in the pointer control screen and the proportion of the progress bar occupied by the balance area may vary with the difficulty of the game. For example: when it is more difficult for the second virtual object to keep its balance, the sensitivity of the pointer is higher, that is, the tilt amplitude of the pointer corresponding to the second virtual object is harder to adjust and the second virtual object is harder to control. Alternatively, when it is more difficult for the second virtual object to keep its balance, the balance area occupies a smaller proportion of the progress bar, that is, the pointer corresponding to the second virtual object is less likely to point to the balance area and harder to steer into it, and accordingly it is harder to keep the second virtual object balanced.
It should be noted that factors such as the size of the control area in the pointer control screen and the style (size, shape, etc.) of the balance buttons may also affect the difficulty of balance control of the second virtual object; that is, the difficulty is not related only to factors such as the posture and position of the second virtual object. The above is merely an illustrative example, and the present application is not limited thereto.
(III) Movement control screen
Step 1030, displaying a movement control screen.
The movement control screen includes, as balance control display elements, a spherical object in a moving state and a balance frame corresponding to the balanced state.
Schematically, as shown in fig. 13, the movement control screen includes a spherical object 1310 in a moving state and a balance frame 1320 corresponding to the balanced state. The balance frame 1320 is used to indicate that, when the spherical object 1310 is located within the balance frame 1320, the balance of the second virtual object is in a controllable state; when the spherical object 1310 is located outside the balance frame 1320, the balance of the second virtual object is in an uncontrollable state, that is, the second virtual object loses its balance in the virtual scene. Illustratively, when the spherical object 1310 is outside the balance frame 1320, the second virtual object cannot remain balanced and is eliminated.
Optionally, the balance frame 1320 has a regular shape, for example: the balance frame 1320 is rectangular; alternatively, the balance frame 1320 has an irregular shape, for example: a character shape corresponding to the second virtual object. Optionally, the areas inside and outside the balance frame 1320 are distinguished by different styles. For example: the area within the balance frame 1320 is green, indicating that the balance of the second virtual object is controllable while the spherical object 1310 is within it; the area outside the balance frame 1320 is red, indicating that the balance of the second virtual object is uncontrollable while the spherical object 1310 is within that area.
Optionally, the movement of the spherical object 1310 in the movement control screen is irregular. For example: the spherical object 1310 is not limited to moving in fixed directions such as up, down, left and right; it may also follow an irregular curved path in the movement control screen. The above description is only exemplary, and the present application is not limited thereto.
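The in-frame test for the spherical object can be sketched with an axis-aligned rectangle; the coordinate representation is an illustrative assumption (the embodiment also allows irregular frame shapes, which would need a different test):

```python
# Hypothetical sketch: the balance frame is modelled as an axis-aligned
# rectangle; the second virtual object stays controllable only while the
# spherical object's centre is inside the frame.

def ball_in_balance_frame(ball, frame):
    """ball: (x, y) centre; frame: (x_min, y_min, x_max, y_max)."""
    x, y = ball
    x_min, y_min, x_max, y_max = frame
    return x_min <= x <= x_max and y_min <= y <= y_max
```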
In step 1031, a movement control operation on the spherical object is received as the second control operation.
Optionally, the movement control operation on the spherical object is used to control the second virtual object to keep balance by keeping the spherical object within the balance frame, that is, it serves as the second control operation. Illustratively, the movement control operation on the spherical object includes at least one of the following.
(1) A movement operation on the movement joystick is received as the second control operation.
Optionally, the position of the spherical object in the movement control screen is controlled by triggering a movement joystick preset in the movement control screen. The position includes the spherical object being located within the balance frame in the movement control screen, and the spherical object being located outside the balance frame in the movement control screen.
Illustratively, as shown in fig. 13, the movement control screen includes a movement joystick 1330. The movement joystick 1330 is a ring joystick with "up, down, left, right" directions and is used to assist the second virtual object 1340 in adjusting its left-right swing state so that the second virtual object 1340 stays as balanced as possible. The player controls the spherical object 1310 by moving a finger within the effective range of the movement joystick 1330. For example, when the spherical object 1310 drifts to the left, the player slides to the right within the effective range of the movement joystick 1330, that is, the player performs an operation opposite to the moving direction of the spherical object 1310 so as to keep the spherical object 1310 within the balance frame 1320.
The effective range of the movement joystick 1330 may be the area occupied by the movement joystick 1330 in the movement control screen, or a preset range beyond that area. Optionally, the moving direction and moving distance of the spherical object 1310 determine the joystick operation required. For example, when the spherical object 1310 drifts along an irregular upward-inclined curve over a distance of 1 cm, keeping the spherical object 1310 within the balance frame 1320 requires moving the joystick in the direction opposite to the drift by a distance close to 1 cm, thereby achieving better balance control over the spherical object 1310. The above description is only exemplary, and the present invention is not limited thereto.
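The "opposite operation" described above reduces to adding the player's joystick input to the object's own drift each tick; an input equal and opposite to the drift holds the object in place. A minimal sketch, with illustrative function and parameter names:

```python
def apply_controls(position, drift, stick_input, dt=1.0):
    """Return the spherical object's new position after one tick.

    The object drifts on its own (drift); the player's joystick input
    (stick_input) is added on top, so an input opposite to the drift
    cancels it and keeps the object where it is."""
    x, y = position
    dx, dy = drift
    sx, sy = stick_input
    return (x + (dx + sx) * dt, y + (dy + sy) * dt)

# Object drifting left by 1 cm per tick; player pushes right by 1 cm.
pos = apply_controls((0.0, 0.0), drift=(-1.0, 0.0), stick_input=(1.0, 0.0))
print(pos)  # (0.0, 0.0) -> the opposite input holds the object in place
```

Combined with the containment test of the balance frame, this gives the per-tick update loop of the movement control screen.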
(2) A motion control operation on the current terminal is received as the second control operation.
The motion control operation controls the movement of the spherical object based on a motion sensor of the current terminal.
Optionally, the terminal displaying the movement control screen is configured with a motion sensor. Under the control of the player, the terminal can move within a spatial range, and this movement controls the movement of the spherical object in the movement control screen; the motion control operation serves as the second control operation for keeping the second virtual object balanced in the virtual scene. Illustratively, the movement control screen is a balance control screen corresponding to game x, which runs on a mobile phone terminal, and is used to keep the second virtual object in game x balanced. After the movement control screen is displayed, the player adjusts the position of the spherical object 1310 by controlling the tilt of the mobile phone terminal so that the spherical object 1310 stays within the balance frame 1320 as much as possible. For example, when the spherical object 1310 moves up and to the right in the movement control screen, the player tilts the mobile phone terminal down and to the left, thereby applying an opposite force that helps the spherical object 1310 return toward its original position and at least remain within the balance frame 1320.
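The tilt-based control can be sketched as a mapping from the terminal's pitch and roll angles (as reported by a motion sensor such as a gyroscope or accelerometer) to a velocity applied to the spherical object. The sign conventions and the linear scaling below are illustrative assumptions, not part of the embodiment:

```python
def tilt_to_velocity(pitch_deg, roll_deg, sensitivity=0.1):
    """Map terminal tilt to a velocity for the spherical object.

    Assumed conventions: rolling the terminal to the right pushes the
    object right; pitching the terminal forward/down pushes it down.
    Tilting down-left therefore opposes an up-right drift of the object."""
    vx = roll_deg * sensitivity
    vy = -pitch_deg * sensitivity
    return vx, vy

# Terminal pitched forward 10 degrees and rolled left 5 degrees:
print(tilt_to_velocity(10.0, -5.0))
```

On a real device these angles would come from the platform's sensor API; here they are plain function arguments so the mapping itself can be tested in isolation.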
In an alternative embodiment, factors such as the proportion of the movement control screen occupied by the balance frame and the sensitivity of the spherical object vary with the difficulty of the game. For example, the higher the difficulty for the second virtual object to keep balance, the higher the sensitivity of the spherical object in the movement control screen, that is, the harder the spherical object is to steer and the harder the second virtual object is to control. Alternatively, the higher the difficulty for the second virtual object to keep balance, the smaller the proportion of the movement control screen occupied by the balance frame, that is, the smaller the safe range within which the spherical object can move, the smaller the probability that the spherical object stays within the balance frame, and accordingly the higher the difficulty of keeping the second virtual object balanced.
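The two scaling rules above (smaller frame, higher sensitivity as difficulty rises) can be sketched together. The inverse/linear scaling laws below are illustrative assumptions; the embodiment only requires that the two quantities move in these directions as difficulty increases:

```python
def balance_parameters(difficulty, base_frame=1.0, base_sensitivity=1.0):
    """Derive the balance-frame half-width and the spherical object's
    drift sensitivity from a difficulty level (>= 1).

    Higher difficulty -> smaller frame (smaller safe range) and a more
    sensitive, faster-drifting spherical object. The exact formulas are
    assumptions made for this sketch."""
    if difficulty < 1:
        raise ValueError("difficulty must be >= 1")
    frame_half_width = base_frame / difficulty
    sensitivity = base_sensitivity * difficulty
    return frame_half_width, sensitivity

easy = balance_parameters(1)
hard = balance_parameters(2)
print(easy, hard)  # the harder setting has a smaller frame, higher sensitivity
```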
It should be noted that the above is only an illustrative example, and the present invention is not limited to this.
In summary, when the target interaction event is performed, the second virtual object performs an interaction action based on the first control operation, and adjusts the interaction action according to the staged prompt provided by the first virtual object, so as to finally determine the staged execution result of the second virtual object. By the method, when the second virtual object carries out the target interaction event, the staged execution result needs to be determined based on the adjustment result of the interaction action and the action state of the second virtual object, so that the interest of the game is effectively improved.
In the method provided in the embodiment of the present application, different balance control screens are analyzed respectively, and the differences of the second control operation under different balance control screens are introduced. For example, when the balance control screen is a control distribution screen, the second control operation is a sequential click operation on the balance controls; when the balance control screen is a pointer control screen, the second control operation is a control operation on the pointer; when the balance control screen is a movement control screen, the second control operation is a movement control operation on the spherical object. Optionally, the difficulty of the second control operation is related to the difficulty with which the second virtual object enters the balanced state: the harder it is for the second virtual object to enter the balanced state, the harder the second control operation. In this way, the entertainment of the game process can be effectively improved; by increasing the influence factors of game success or failure and the competitive excitation points, the realism and interest of the game are improved.
In an alternative embodiment, the above-mentioned interaction method between virtual objects is applied to a wooden person game, where the first virtual object is a wooden person and the second virtual object is a virtual character controlled by a player. Illustratively, as shown in fig. 14, the interaction method between the virtual objects can also be implemented as the following step 1410 to step 1480.
At step 1410, a prompt tone is played, and the player is allowed to move while the prompt tone plays.
Illustratively, after the game starts, a prompt tone is played in the virtual scene; the prompt tone may take the form of music, a beep, white noise, and the like. Optionally, the player sets the prompt tone, such as a whistle, before entering the game. While the prompt tone plays, the player can move in ways including walking, running, crawling, and the like. For example, in the wooden-person game, when the wooden person is not turned around, the prompt tone plays in the virtual scene, and the virtual character controlled by the player runs toward a specified area (such as the area where the wooden person is located) during the playing of the prompt tone.
At step 1420, it is determined whether all players have reached the endpoint.
During the playing of the prompt tone, the player controls the virtual character to move toward the specified area. Illustratively, the terminal or the server determines the position state of each player-controlled virtual character (second virtual object) and determines whether all virtual characters have reached the specified area. When all virtual characters have reached the specified area, the game ends; when the virtual characters have not all reached the specified area, the game continues.
At step 1430, the wooden person turns its head after the prompt tone finishes playing.
Illustratively, the time interval between the start and stop of the prompt tone is random. For example, a whistle is played as the prompt tone when the game starts and stops after 5 seconds; alternatively, a song is played at the beginning of the game and stops at its climax, and so on, that is, the playing duration of the prompt tone is random.
Optionally, the prompt tone is used to prompt the player to control the motion state of the virtual character; when the prompt tone stops, the wooden person starts turning around, and the area covered by the observation sight range of the wooden person changes accordingly.
In step 1440, it is determined whether the player has moved.
Illustratively, while the wooden person turns its head, the field of view of the wooden person sweeps through a certain angle, and the area covered by the field of view changes in real time. Players within the wooden person's field of view should be in a balanced state, that is, players within the wooden person's field of view should not move. Optionally, movement indicates that the virtual object is in an advancing state such as walking, running, or crawling. A virtual character that moves within the wooden person's field of view fails the game challenge, that is, the player controlling that virtual character is eliminated.
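The elimination rule of step 1440 can be sketched as a single filter over the players: anyone who is both inside the wooden person's current field of view and still moving is eliminated. The data shapes below (a name-to-moving dict and an `in_view` predicate) are illustrative simplifications:

```python
def check_eliminations(players, in_view):
    """Return the set of players eliminated during one turn-around phase.

    players: dict mapping player name -> whether the character is moving.
    in_view: predicate saying whether a player is inside the wooden
             person's real-time field of view."""
    return {name for name, moving in players.items()
            if moving and in_view(name)}

# A and C are moving, but only A and B are inside the field of view,
# so only A is eliminated.
out = check_eliminations({"A": True, "B": False, "C": True},
                         in_view=lambda name: name != "C")
print(sorted(out))  # ['A']
```

In the real game, `in_view` would be a geometric test against the wooden person's sweeping sight range, evaluated each frame while the head turns.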
At step 1450, a player's difficulty coefficient of balance is determined.
Optionally, when the player controls the virtual character to stop moving, the difficulty of balance control differs according to the posture of the virtual character. Illustratively, the movement of the virtual character in the game approximates a looped 3D skeletal animation, and each time point of the animation is bound to a balance difficulty coefficient, so that the balance difficulty can be matched to the animation frame, yielding the balance difficulty coefficient diagram shown in fig. 9, in which the balance difficulty coefficients of the first posture 910 through the fifth posture 950 increase progressively; the difficulty of controlling the virtual character in the first posture 910 is lower than that in the second posture 920.
Illustratively, the pose of the stationary virtual character is compared with the pose shown in FIG. 9 to determine the equilibrium difficulty factor for each stationary virtual character that is closest to the pose, thereby determining the equilibrium difficulty factor for each stationary player in the virtual scene.
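The nearest-pose lookup described above can be sketched as follows. Poses are reduced here to a single animation-time value; a real implementation would compare skeletal data against the reference postures of fig. 9. The reference values are illustrative assumptions:

```python
def balance_difficulty(pose_time, reference_poses):
    """Return the difficulty coefficient of the reference pose whose
    animation time is closest to the character's frozen pose.

    reference_poses: list of (animation_time, difficulty) pairs,
    standing in for the five postures 910-950 of fig. 9."""
    nearest = min(reference_poses, key=lambda ref: abs(ref[0] - pose_time))
    return nearest[1]

# Assumed (animation_time, difficulty) pairs for postures 910..950.
refs = [(0.0, 1), (0.2, 2), (0.4, 3), (0.6, 4), (0.8, 5)]
print(balance_difficulty(0.45, refs))  # 3 -> closest to the third posture
```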
At step 1460, a balance phase operating regime is assigned.
Optionally, according to the balance difficulty coefficient, different virtual characters are assigned operation modes of different difficulties for controlling balance. Illustratively, different types of balance screens are displayed; for example, the control distribution screen is displayed for a lower balance difficulty coefficient, and the movement control screen is displayed for a higher balance difficulty coefficient.
Alternatively, the same type of balance screen is displayed, but balance control display elements of different difficulties are shown on the screen according to the balance difficulty coefficient. For example, the balance screens displayed for different players are all control distribution screens, but the higher the balance difficulty coefficient, the faster the balance controls appear and disappear; the lower the coefficient, the slower they appear and disappear, and so on.
At step 1470, it is determined whether there is a player operation failure.
Optionally, after the wooden person turns around, the virtual character within the wooden person's observation sight range should be in a static state. To maintain balance in the static state, the player completes a balance control task (a task matching the balance difficulty coefficient) within a specified time according to the assigned operation mode, thereby maintaining the balance state of the virtual character. Illustratively, the specified time may be a determined time or an undetermined time. The determined time indicates that the balance control task is to be completed within a preset time interval; for example, the balance state of the virtual character is maintained when the task is completed within 5 seconds. The undetermined time indicates that the task is to be completed within a random time interval; for example, virtual character A maintains its balance state by completing the task within 7 seconds, while virtual character B maintains its balance state by completing the task within 15 seconds.
Illustratively, a player who completes the balance control task within the specified time maintains the balance state of the virtual character; a player who does not complete the task within the specified time cannot maintain the balance state, and the player corresponding to a virtual character that fails to maintain balance is eliminated.
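The determined-versus-undetermined deadline and the pass/fail rule can be sketched together; the concrete time bounds and the per-player random draw are illustrative assumptions matching the 5 s / 7 s / 15 s examples above:

```python
import random

def deadline_for(player, mode="fixed", fixed_seconds=5.0,
                 low=5.0, high=15.0, rng=None):
    """Time allowed to finish the balance control task.

    'fixed' (determined time) gives every player the same window;
    'random' (undetermined time) draws a per-player window in [low, high],
    as in the 7 s / 15 s example."""
    if mode == "fixed":
        return fixed_seconds
    rng = rng or random.Random(hash(player))
    return rng.uniform(low, high)

def survives(completion_time, deadline):
    # Finishing within the deadline keeps the character balanced;
    # otherwise the character loses balance and the player is eliminated.
    return completion_time <= deadline

print(survives(4.0, deadline_for("A", mode="fixed")))  # True
print(survives(6.0, deadline_for("A", mode="fixed")))  # False
```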
At step 1480, the wooden person turns its head back.
Illustratively, after one turn-around of the wooden person, if the number of virtual characters remaining on the game field is 0, that is, no playable virtual character remains, the game ends.
Alternatively, if playable virtual characters remain on the game field, after each virtual character completes the balance control task (i.e., reaches the preset balance stage time), the wooden person "turns around" back to the initial state, and when the prompt tone plays again, the player controls the virtual character to move toward the specified area. Steps 1410 to 1480 are repeated until all players on the field pass or are eliminated, and the game ends.
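One "cue tone, turn-around" round of steps 1410-1480 can be sketched as a filter that drops players who either moved while watched or failed the balance control task. The predicate-based interface is an illustrative simplification:

```python
def play_round(players, moved_during_turn, completed_task):
    """Run one round of the wooden-person game and return the survivors.

    players: names still in the game.
    moved_during_turn: predicate - did this player move while inside the
                       wooden person's field of view (step 1440)?
    completed_task: predicate - did this player finish the balance
                    control task in time (step 1470)?"""
    survivors = []
    for p in players:
        if moved_during_turn(p):   # moved while watched -> eliminated
            continue
        if not completed_task(p):  # failed the balance task -> eliminated
            continue
        survivors.append(p)
    return survivors

left = play_round(["A", "B", "C"],
                  moved_during_turn=lambda p: p == "B",
                  completed_task=lambda p: p != "C")
print(left)  # ['A']
```

Repeating `play_round` until the survivor list is empty, or until everyone has reached the specified area, mirrors the loop over steps 1410-1480.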
In summary, when the target interaction event is performed, the second virtual object performs an interactive action based on the first control operation and adjusts the interactive action according to the staged prompt provided by the first virtual object, so as to finally determine the staged execution result of the second virtual object. With this method, when the second virtual object participates in the target interaction event, it performs the interactive action in the virtual scene based on the first control operation and adjusts the interactive action according to the observation sight range of the first virtual object; the execution situation of the interactive action is then displayed based on the adjustment result and the action state of the second virtual object, thereby determining the staged execution result of the second virtual object. This avoids judging the game result solely on the single influence factor of the view turning action of the first virtual object, increases the influence factors of game success or failure and the competitive excitation points, and effectively improves the interest of the game.
Fig. 15 is a block diagram of an operation control apparatus according to an exemplary embodiment of the present application, and as shown in fig. 15, the apparatus includes the following components:
an object display module 1510, configured to display a first virtual object and a second virtual object in a virtual scene, where the first virtual object is used to perform a periodic prompt in a target interaction event, and the second virtual object is a virtual object participating in the target interaction event;
an operation receiving module 1520, configured to receive, at a first stage of the target interaction event, a first control operation on the second virtual object, where the first control operation is used to control the second virtual object to execute an interaction action corresponding to the target interaction event;
an action display module 1530, configured to display, at the second stage of the target interaction event, a view turning action of the first virtual object, where the view turning action is used to adjust an observation view range of the first virtual object;
a result displaying module 1540, configured to display a staged event execution result of the second virtual object in the second stage based on the execution situation of the interaction between the real-time observation view range of the first virtual object and the second virtual object in the second stage.
In an alternative embodiment, as shown in fig. 16, the result display module 1540 is further configured to determine the performance of the interactive action of the second virtual object in response to the second virtual object being within the real-time viewing line of sight of the first virtual object in the second stage; and displaying a periodic event execution result of the second virtual object in the second stage based on the interactive action execution condition.
In an optional embodiment, the apparatus is further configured to, in the second stage, receive a second control operation on the second virtual object, where the second control operation is configured to control the second virtual object to suspend the execution of the interactive action and maintain balance.
In an optional embodiment, the apparatus further comprises:
a picture display module 1550, configured to display a balance control picture in the second stage, where the balance control picture includes a balance control display element;
a control receiving module 1560, configured to receive the second control operation on the balance control screen, where the second control operation is used to control an element display result corresponding to the balance control display element, and the element display result corresponds to a balance condition of the second virtual object.
In an optional embodiment, the apparatus is further configured to display a stage failure result of the second virtual object in the second stage in response to the second virtual object continuing to perform the interactive action within the real-time viewing line of sight of the first virtual object; and in response to the second virtual object pausing the execution of the interactive action within the real-time viewing line of sight of the first virtual object, displaying a staged success result of the second virtual object at the second stage.
In an optional embodiment, the screen display module 1550 is further configured to receive a pause control operation in the second stage, where the pause control operation is configured to control the second virtual object to pause the execution of the interactive action; in response to the second virtual object not receiving the pause control operation within the real-time viewing line of sight of the first virtual object, determining a stage-wise failure result of the second virtual object at the second stage; in response to the second virtual object receiving the pause control operation within the real-time viewing gaze of the first virtual object, displaying the balance control screen based on the pause control operation.
In an optional embodiment, the apparatus is further configured to determine the balance control display element in the balance control screen based on an executed stage of the interactive action when the second virtual object suspends the execution of the interactive action, where the balance control display element has a corresponding relationship with the executed stage.
In an optional embodiment, the screen display module 1550 is further configured to display a control distribution screen, where the control distribution screen includes at least one sequentially distributed balance control as the balance control display element;
the control receiving module 1560 is further configured to receive a sequential click operation on the at least one balance control as the second control operation.
In an optional embodiment, the screen display module 1550 is further configured to display a pointer control screen, where the pointer control screen includes a pointer in a moving state and a progress bar corresponding to the moving state as the balance control display element, and the progress bar includes a balance area corresponding to a balance state;
the control receiving module 1560 is also used to receive a control operation on the pointer as the second control operation.
In an optional embodiment, the screen display module 1550 is further configured to display a movement control screen, where the movement control screen includes a spherical object in a moving state and a balance frame corresponding to the balance state as the balance control display element;
the control receiving module 1560 is also used to receive a movement control operation on the spherical object as the second control operation.
In an alternative embodiment, the control receiving module 1560 is further configured to receive a movement operation on the movement rocker as the second control operation; or, receiving, as the second control operation, a motion control operation for the current terminal, the motion control operation being used to control the movement of the spherical object based on a motion sensor of the current terminal.
It should be noted that: the interaction device between virtual objects provided in the above embodiments is only illustrated by the division of the functional modules, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the interaction device between virtual objects and the interaction method between virtual objects provided in the above embodiments belong to the same concept, and specific implementation processes thereof are described in detail in the method embodiments and are not described herein again.
Fig. 17 shows a block diagram of an electronic device 1700 according to an exemplary embodiment of the present application. The electronic device 1700 may be a portable mobile terminal such as a smart phone, a vehicle-mounted terminal, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. The electronic device 1700 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
Generally, electronic device 1700 includes: a processor 1701 and a memory 1702.
The processor 1701 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1701 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1701 may also include a main processor and a coprocessor; the main processor, also called a CPU (Central Processing Unit), is a processor for processing data in an awake state, and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1701 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1701 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1702 may include one or more computer-readable storage media, which may be non-transitory. The memory 1702 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1702 is used to store at least one instruction for execution by the processor 1701 to implement the method of interaction between virtual objects provided by the method embodiments of the present application.
In some embodiments, the electronic device 1700 also includes one or more sensors. The one or more sensors include, but are not limited to: proximity sensors, gyroscope sensors, pressure sensors.
Proximity sensors, also known as distance sensors, are typically disposed on the front panel of the electronic device 1700. The proximity sensor is used to capture the distance between the user and the front of the electronic device 1700.
The gyro sensor may detect the body direction and rotation angle of the electronic device 1700, and may cooperate with an acceleration sensor to acquire the user's 3D motion on the electronic device 1700. Based on the data collected by the gyro sensor, the processor 1701 may implement the following functions: motion sensing (such as changing the UI according to the user's tilting operation), image stabilization during photographing, game control, and inertial navigation.
The pressure sensors may be disposed on the side bezel of the electronic device 1700 and/or on the lower layer of the display screen. When the pressure sensor is disposed on the side frame of the electronic device 1700, the user's grip signal on the electronic device 1700 can be detected, and the processor 1701 performs left-right hand recognition or shortcut operation according to the grip signal acquired by the pressure sensor. When the pressure sensor is disposed at the lower layer of the display screen, the processor 1701 controls the operability control on the UI interface according to the pressure operation of the user on the display screen. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
In some embodiments, electronic device 1700 also includes other component parts, and those skilled in the art will appreciate that the structure shown in FIG. 17 is not intended to be limiting of electronic device 1700, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
Embodiments of the present application also provide a computer device, which may be implemented as a terminal or a server as shown in fig. 2. The computer device comprises a processor and a memory, wherein at least one instruction, at least one program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to realize the interaction method between the virtual objects provided by the method embodiments.
Embodiments of the present application further provide a computer-readable storage medium, where at least one instruction, at least one program, a code set, or a set of instructions is stored on the computer-readable storage medium, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the method for interaction between virtual objects provided by the above method embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the interaction method between the virtual objects according to any of the above embodiments.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method of interaction between virtual objects, the method comprising:
displaying a first virtual object and a second virtual object in a virtual scene, wherein the first virtual object is used for carrying out periodic prompt in a target interaction event, and the second virtual object is a virtual object participating in the target interaction event;
receiving a first control operation on the second virtual object at a first stage of the target interaction event, wherein the first control operation is used for controlling the second virtual object to execute an interaction action corresponding to the target interaction event;
displaying a visual angle turning action of the first virtual object at a second stage of the target interaction event, wherein the visual angle turning action is used for adjusting the observation sight range of the first virtual object;
and displaying a periodic event execution result of the second virtual object in the second stage based on the interactive action execution condition of the real-time observation sight range of the first virtual object and the second virtual object in the second stage.
2. The method according to claim 1, wherein the displaying the execution result of the staged event of the second virtual object in the second stage based on the execution situation of the interactive action between the real-time observation view range of the first virtual object and the second virtual object in the second stage comprises:
in response to the second virtual object being within the real-time viewing line of sight of the first virtual object in the second stage, determining an interactive action execution status of the second virtual object;
and displaying a staged event execution result of the second virtual object in the second stage based on the interactive action execution status.
3. The method of claim 2, further comprising:
and in the second stage, receiving a second control operation on the second virtual object, wherein the second control operation is used for controlling the second virtual object to suspend executing the interactive action and keep balance.
4. The method of claim 3, wherein receiving, in the second phase, a second control operation on the second virtual object comprises:
in the second stage, displaying a balance control screen, wherein the balance control screen comprises balance control display elements;
and receiving the second control operation on the balance control screen, wherein the second control operation is used for controlling an element display result corresponding to the balance control display elements, and the element display result corresponds to the balance condition of the second virtual object.
5. The method of claim 4, wherein the displaying the staged event execution result of the second virtual object in the second stage based on the interactive action execution status comprises:
in response to the second virtual object continuing to perform the interactive action within the real-time viewing line of sight of the first virtual object, displaying a staged failure result of the second virtual object at the second stage;
and in response to the second virtual object pausing execution of the interactive action within the real-time viewing line of sight of the first virtual object, displaying a staged success result of the second virtual object at the second stage.
6. The method according to claim 5, wherein the displaying a balance control screen in the second stage comprises:
receiving a pause control operation at the second stage, wherein the pause control operation is used for controlling the second virtual object to pause execution of the interactive action;
in response to the pause control operation not being received while the second virtual object is within the real-time viewing line of sight of the first virtual object, determining a staged failure result of the second virtual object at the second stage;
and in response to the pause control operation being received while the second virtual object is within the real-time viewing line of sight of the first virtual object, displaying the balance control screen based on the pause control operation.
7. The method of claim 4, further comprising:
and when the second virtual object suspends execution of the interactive action, determining the balance control display element in the balance control screen based on the executed stage of the interactive action, wherein the balance control display element corresponds to the executed stage.
8. The method of claim 4, wherein the displaying the balance control screen comprises:
displaying a control distribution screen, wherein the control distribution screen comprises, as a balance control display element, at least one balance control distributed in sequence;
the receiving the second control operation on the balance control screen includes:
and receiving a sequential clicking operation on the at least one balance control as the second control operation.
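A minimal sketch of the sequential-click balance check of claim 8 (illustrative only; the function name, the representation of controls as a list, and the use of Python are assumptions, not the patent's implementation):

```python
# Hypothetical check for claim 8: balance is kept only if the player clicks
# the sequentially distributed balance controls in their layout order.

def sequential_click_balanced(expected_order, clicks):
    """True if the received click sequence matches the control layout order."""
    return list(clicks) == list(expected_order)

print(sequential_click_balanced([1, 2, 3], [1, 2, 3]))  # True
print(sequential_click_balanced([1, 2, 3], [1, 3, 2]))  # False
```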
9. The method of claim 4, wherein the displaying the balance control screen comprises:
displaying a pointer control screen, wherein the pointer control screen comprises, as balance control display elements, a pointer in a moving state and a progress bar corresponding to the moving state, and the progress bar comprises a balance area corresponding to a balance state;
the receiving the second control operation on the balance control screen includes:
receiving a control operation of the pointer as the second control operation.
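The pointer minigame of claim 9 reduces to a range check: the second control operation must stop the moving pointer inside the balance area of the progress bar. A minimal sketch (illustrative only; the numeric ranges and names are assumptions):

```python
# Hypothetical check for claim 9: a pointer sweeps along a progress bar and
# must be stopped inside the balance area for the object to stay balanced.

def pointer_in_balance_area(position: float, area_start: float, area_end: float) -> bool:
    """True if the pointer was stopped inside the balance area."""
    return area_start <= position <= area_end

# Progress bar from 0 to 100 with an assumed balance area of [40, 60].
print(pointer_in_balance_area(52, 40, 60))  # True
print(pointer_in_balance_area(75, 40, 60))  # False
```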
10. The method of claim 4, wherein the displaying the balance control screen comprises:
displaying a movement control screen, wherein the movement control screen comprises, as balance control display elements, a spherical object in a moving state and a balance frame corresponding to the balance state;
the receiving the second control operation on the balance control screen includes:
receiving a movement control operation on the spherical object as the second control operation.
11. The method according to claim 10, wherein the receiving, as the second control operation, a movement control operation for the spherical object includes:
receiving a moving operation on a moving rocker as the second control operation;
or,
receiving, as the second control operation, a motion control operation for the current terminal, the motion control operation being for controlling movement of the spherical object based on a motion sensor of the current terminal.
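The ball-in-frame minigame of claims 10 and 11 can be thought of as integrating drift plus player corrections (from a joystick or the terminal's motion sensor) and testing whether the spherical object stays inside the balance frame. A one-dimensional sketch, with all numbers and names being illustrative assumptions:

```python
# Hypothetical 1-D model for claims 10-11: each tick the ball drifts, the
# player's joystick or motion-sensor input nudges it back, and balance holds
# only while the ball's position stays inside the balance frame.

def keep_ball_in_frame(drift, inputs, frame=(-1.0, 1.0)):
    """Integrate drift plus corrections; True if the ball never leaves the frame."""
    pos = 0.0
    for d, i in zip(drift, inputs):
        pos += d + i
        if not (frame[0] <= pos <= frame[1]):
            return False  # ball left the balance frame: balance lost
    return True

# Drift pushes right; player corrections cancel it, so the ball stays framed.
print(keep_ball_in_frame([0.4, 0.4, 0.4], [-0.4, -0.4, -0.4]))  # True
# No corrections: the ball drifts out of the frame.
print(keep_ball_in_frame([0.4, 0.4, 0.4], [0.0, 0.0, 0.0]))     # False
```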
12. An apparatus for interaction between virtual objects, the apparatus comprising:
the object display module is used for displaying a first virtual object and a second virtual object in a virtual scene, wherein the first virtual object is configured to give staged prompts in a target interaction event, and the second virtual object is a virtual object participating in the target interaction event;
an operation receiving module, configured to receive, at a first stage of the target interaction event, a first control operation on the second virtual object, where the first control operation is used to control the second virtual object to execute an interaction action corresponding to the target interaction event;
the action display module is used for displaying the viewing-angle turning action of the first virtual object at the second stage of the target interaction event, the viewing-angle turning action being used for adjusting the viewing line of sight of the first virtual object;
and the result display module is used for displaying a staged event execution result of the second virtual object in the second stage based on the real-time viewing line of sight of the first virtual object and the interactive action execution status of the second virtual object in the second stage.
13. A computer device comprising a processor and a memory, said memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by said processor to implement a method of interaction between virtual objects according to any one of claims 1 to 11.
14. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement a method of interaction between virtual objects according to any one of claims 1 to 11.
15. A computer program product comprising computer programs or instructions which, when executed by a processor, implement a method of interaction between virtual objects according to any one of claims 1 to 11.
CN202111531757.XA 2021-12-14 2021-12-14 Interaction method, device, equipment, medium and program product between virtual objects Pending CN114210063A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111531757.XA CN114210063A (en) 2021-12-14 2021-12-14 Interaction method, device, equipment, medium and program product between virtual objects

Publications (1)

Publication Number Publication Date
CN114210063A (en) 2022-03-22

Family

ID=80702137

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111531757.XA Pending CN114210063A (en) 2021-12-14 2021-12-14 Interaction method, device, equipment, medium and program product between virtual objects

Country Status (1)

Country Link
CN (1) CN114210063A (en)

Similar Documents

Publication Publication Date Title
US11243605B2 (en) Augmented reality video game systems
US10569176B2 (en) Video game gameplay having nuanced character movements and dynamic movement indicators
Thomas A survey of visual, mixed, and augmented reality gaming
JP2022527662A (en) Virtual object control methods, devices, equipment and computer programs
US11724191B2 (en) Network-based video game editing and modification distribution system
CN110732135B (en) Virtual scene display method and device, electronic equipment and storage medium
CN111399639B (en) Method, device and equipment for controlling motion state in virtual environment and readable medium
JP2024512582A (en) Virtual item display methods, devices, electronic devices and computer programs
JP2022552752A (en) Screen display method and device for virtual environment, and computer device and program
CN112316429A (en) Virtual object control method, device, terminal and storage medium
JP2023164687A (en) Virtual object control method and apparatus, and computer device and storage medium
WO2023071808A1 (en) Virtual scene-based graphic display method and apparatus, device, and medium
Vince Handbook of computer animation
Raffaele Virtual Reality Immersive user interface for first person view games
WO2022233125A1 (en) Information processing method and apparatus in game, electronic device, and storage medium
CN115645923A (en) Game interaction method and device, terminal equipment and computer-readable storage medium
CN114210063A (en) Interaction method, device, equipment, medium and program product between virtual objects
CN109417651B (en) Generating challenges using location-based gaming companion applications
JP2021175436A (en) Game program, game method, and terminal device
Parker Game Development Using Python
KR102649578B1 (en) Fast target selection through priority zones
Rhody Game Fiction: Playing the Interface in Prince of Persia: The Sands of Time and Asheron’s Call
US12019791B2 (en) Augmented reality video game systems
Gardiner GameMaker Cookbook
WO2023160068A1 (en) Virtual subject control method and apparatus, device, and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40072027

Country of ref document: HK