CN114130006B - Virtual prop control method, device, equipment, storage medium and program product - Google Patents

Virtual prop control method, device, equipment, storage medium and program product

Info

Publication number
CN114130006B
CN114130006B (application CN202111626306.4A)
Authority
CN
China
Prior art keywords
virtual
prop
state
shooting prop
animation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111626306.4A
Other languages
Chinese (zh)
Other versions
CN114130006A (en)
Inventor
郭楚沅
赵明程
陈佐琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Publication of CN114130006A
Application granted
Publication of CN114130006B
Legal status: Active


Classifications

    • A63F 13/22 (Video games; input arrangements for video game devices; setup operations): e.g. calibration, key configuration or button assignment
    • A63F 13/219 (Input arrangements characterised by their sensors, purposes or types): for aiming at specific areas on the display, e.g. light-guns
    • A63F 13/537 (Controlling the output signals based on the game progress, involving additional visual information provided to the game scene): using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/837 (Special adaptations for executing a specific game genre or game mode): shooting of targets
    • A63F 2300/1018 (Features characterized by input arrangements for converting player-generated signals into game device control signals): calibration; key and button assignment
    • A63F 2300/303 (Features characterized by output arrangements for receiving control signals generated by the game device): for displaying additional data, e.g. simulating a Head Up Display
    • A63F 2300/8076 (Features specially adapted for executing a specific type of game): shooting
    • A63F 2300/8082 (Features specially adapted for executing a specific type of game): virtual reality
    • Y02P 90/02 (Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation): total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application provides a virtual prop control method, device, equipment, storage medium and computer program product. The method includes: presenting, in a picture of a virtual scene, a virtual shooting prop held by a virtual object and a sight pattern corresponding to the virtual shooting prop; when the motion state of the virtual object in the virtual scene changes, controlling the state of the virtual shooting prop to change synchronously; and, while the state of the virtual shooting prop is changing, controlling the object aimed at by the virtual shooting prop to remain consistent with the object at the position of the sight pattern. In this way, the realism of the virtual scene is improved, the number of operations needed to aim is reduced, and human-computer interaction efficiency is improved.

Description

Virtual prop control method, device, equipment, storage medium and program product
Priority description
This application claims priority to Chinese patent application No. 202111235891.5, filed on October 22, 2021 and entitled "Virtual prop control method, device, equipment, storage medium and program product".
Technical Field
The present disclosure relates to the field of virtualization and man-machine interaction technologies, and in particular, to a method, an apparatus, a device, a storage medium, and a computer program product for controlling a virtual prop.
Background
With the development of computer technology, electronic devices can present increasingly rich and intuitive virtual scenes. A virtual scene is a digital scene constructed by a computer using digital communication technology. In a virtual scene, a user can obtain a fully virtualized experience (such as virtual reality) or a partially virtualized experience (such as augmented reality) in vision, hearing and other senses, and can interact with various objects in the scene, or control interactions between objects, so as to obtain feedback.
In the related art, when a user controls a virtual object to change its motion state in a virtual scene, for example from a non-steering state to a steering state, the virtual shooting prop held by the virtual object changes along with the motion state. As the state of the virtual shooting prop changes, the point the prop appears to aim at on screen diverges from the position of its sight, which undermines the realism of the virtual scene, increases the difficulty of aiming operations, and reduces interaction efficiency.
Disclosure of Invention
The embodiment of the application provides a control method, a device, equipment, a storage medium and a computer program product for virtual props, which can improve the simulation degree of a virtual scene, reduce the operation times of executing aiming actions and improve the human-computer interaction efficiency.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a control method of virtual props, which comprises the following steps:
in a picture of a virtual scene, presenting a virtual shooting prop held by a virtual object and a sight pattern corresponding to the virtual shooting prop;
when the motion state of the virtual object in the virtual scene changes, controlling the state of the virtual shooting prop to synchronously change, and
and in the process of changing the state of the virtual shooting prop, controlling the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
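The aiming-consistency step above can be sketched as re-orienting the prop's muzzle toward the scene point the sight pattern covers. The snippet below is a minimal illustration; the function names and the 3-tuple vector representation are assumptions for exposition, not taken from the patent:

```python
import math

def normalize(v):
    """Return the unit vector of a 3-tuple v."""
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def aim_direction(muzzle_pos, sight_hit_point):
    """Direction the muzzle must face so the prop aims at the same scene
    point that the sight pattern covers (sight_hit_point would come from
    a raycast through the crosshair's screen position)."""
    return normalize(tuple(h - m for m, h in zip(muzzle_pos, sight_hit_point)))
```

Setting the prop's facing to this direction each frame keeps the aimed object and the sight pattern's object consistent while the prop's state changes.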
The embodiment of the application also provides a control device for the virtual prop, which comprises:
the display module is used for displaying the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop in the picture of the virtual scene;
The control module is used for controlling the state of the virtual shooting prop to synchronously change when the motion state of the virtual object in the virtual scene changes, and
and in the process of changing the state of the virtual shooting prop, controlling the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
In the above scheme, the control module is further configured to receive a steering instruction for the virtual object when a motion state of the virtual object in the virtual scene is a non-steering state;
and responding to the steering instruction, controlling the virtual object to execute steering operation along the direction indicated by the steering instruction, so that the motion state of the virtual object in the virtual scene is switched from the non-steering state to the steering state.
In the above scheme, the control module is further configured to obtain a yaw coordinate corresponding to the direction indicated by the steering instruction, and obtain a basic deflection animation corresponding to the yaw coordinate;
determining a prop deflection animation of the virtual shooting prop after the steering operation is completed based on the yaw coordinate and the basic deflection animation;
And controlling the state of the virtual shooting prop to synchronously change based on the prop deflection animation.
In the above scheme, the control module is further configured to obtain an initial animation of the virtual shooting prop before the virtual object executes the steering operation;
determining the displacement length of a shooting port of the virtual shooting prop in the process of executing the steering operation by the virtual object based on the prop deflection animation and the initial animation;
when the displacement length does not exceed a length threshold, controlling the state of the virtual shooting prop to change according to the state indicated by the prop deflection animation;
and when the displacement length exceeds a length threshold value, acquiring a target rotation angle in a target deflection range, and controlling the virtual shooting prop to rotate according to the target rotation angle so as to enable the state of the virtual shooting prop to change synchronously with the motion state of the virtual object.
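The threshold logic above can be sketched as follows; the function name, the two-element deflection range, and the use of degrees are illustrative assumptions:

```python
import math

def resolve_prop_state(initial_muzzle, deflected_muzzle, length_threshold,
                       requested_angle, deflection_range):
    """Follow the deflection animation when the muzzle (shooting port)
    displacement is within the length threshold; otherwise clamp rotation
    to a target angle inside the target deflection range."""
    displacement = math.dist(initial_muzzle, deflected_muzzle)
    if displacement <= length_threshold:
        return ("deflection_animation", requested_angle)
    lo, hi = deflection_range
    return ("clamped_rotation", max(lo, min(hi, requested_angle)))
```

The clamp keeps the prop's state change synchronized with the virtual object's motion state even when a large turn would otherwise swing the muzzle too far.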
In the above scheme, the control module is further configured to present an inertial motion process of the virtual shooting prop, which is caused by inertia and is adapted to the steering state or the jumping state;
and in the inertial movement process, controlling the virtual shooting prop to rebound from a changed state to a stable state, wherein the stable state enables the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
In the above scheme, the control module is further configured to obtain an actual animation of the virtual shooting prop after the state of the virtual shooting prop changes, and a standard animation of the virtual shooting prop in the stable state;
when the virtual shooting prop is determined to be in the deflection state based on the actual animation and the standard animation, acquiring a corresponding inertial animation;
and combining the actual animation, the standard animation and the inertia animation to present an inertial movement process of the virtual shooting prop, which is caused by inertia and is matched with the steering state or the jumping state.
In the above scheme, the control module is further configured to obtain an inertial rebound curve corresponding to the inertia animation;
determining, based on the actual animation and the inertial rebound curve, the position coordinates corresponding to a target time point in the process of the virtual shooting prop rebounding from the actual animation to the standard animation;
and displaying, based on the position coordinates corresponding to the target time point, the inertial movement process of the virtual shooting prop from the state corresponding to the actual animation to the stable state corresponding to the standard animation.
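One plausible reading of the rebound step is to evaluate the pose at sampled time points along an easing curve from the deflected pose back to the stable pose. The damped-oscillation curve below is an assumed stand-in for the patent's inertial rebound curve, not its actual formula:

```python
import math

def rebound_position(actual_pose, standard_pose, t):
    """Pose at normalized time t in [0, 1] while the prop springs back from
    its deflected pose (actual animation) to the stable pose (standard
    animation); k briefly overshoots 1 and settles, mimicking inertia."""
    k = 1.0 - math.exp(-5.0 * t) * math.cos(6.0 * t)
    return tuple(a + (s - a) * k for a, s in zip(actual_pose, standard_pose))
```

Sampling this function at each rendered frame's target time point yields the position coordinates used to display the inertial movement process.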
In the above scheme, the control module is further configured to determine a three-dimensional movement space corresponding to a shooting port of the virtual shooting prop;
and in the process of changing the state of the virtual shooting prop, controlling the offset of the shooting port relative to its current state to lie within the three-dimensional movement space, so as to control the object aimed at by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
In the above scheme, the control module is further configured to obtain the position coordinates, in the virtual scene, of a shooting port of the virtual shooting prop and an offset radius corresponding to the virtual shooting prop;
and determine the corresponding spherical space, with the point indicated by the position coordinates as the sphere center and the offset radius as the sphere radius, as the three-dimensional movement space corresponding to the shooting port.
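The spherical movement space can be sketched as a simple clamp on the shooting port's offset; the function name and tuple representation are hypothetical:

```python
import math

def clamp_to_move_space(center, radius, proposed):
    """Constrain a proposed shooting-port position to the spherical movement
    space whose center is the port's base position coordinate and whose
    radius is the prop's offset radius."""
    offset = tuple(p - c for p, c in zip(proposed, center))
    dist = math.sqrt(sum(o * o for o in offset))
    if dist <= radius:
        return proposed
    # project back onto the sphere's surface along the offset direction
    scale = radius / dist
    return tuple(c + o * scale for c, o in zip(center, offset))
```

Clamping each frame bounds how far the muzzle may drift from its base position, so the aimed object never leaves the region the sight pattern covers.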
In the above scheme, in the process of changing the state of the virtual shooting prop, the posture of the target part of the virtual object holding the virtual shooting prop changes from a first posture to a second posture, and the control module is further configured to obtain a first position of a holding point at which the virtual object holds the virtual shooting prop before the state of the virtual shooting prop changes, and obtain a second position of the holding point after the state of the virtual shooting prop changes;
Determining a bone rotation angle corresponding to the target site based on the first position and the second position;
based on the bone rotation angle, a bone position and a bone direction of the target portion are adjusted to control a posture of the target portion of the virtual object to be changed from the first posture to the second posture.
In the above scheme, the control module is further configured to determine a bone parent node and a bone child node of the target portion;
adjusting the positions of the bone sub-nodes based on the bone rotation angle, and
and driving and adjusting the position of the bone father node by adjusting the position of the bone child node so as to adjust the bone position and the bone direction of the target part.
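A planar simplification of the bone adjustment: compute the rotation angle that carries the holding point from its first position to its second position about a joint, then rotate bone positions by that angle; in an engine, the child node's new position would then drive the parent node up the chain. All names and the 2D restriction are assumptions:

```python
import math

def bone_rotation_angle(joint, old_grip, new_grip):
    """Signed angle (degrees) the bone at `joint` must rotate, in the XY
    plane, to move the holding point from old_grip to new_grip."""
    a0 = math.atan2(old_grip[1] - joint[1], old_grip[0] - joint[0])
    a1 = math.atan2(new_grip[1] - joint[1], new_grip[0] - joint[0])
    return math.degrees(a1 - a0)

def rotate_about(joint, point, angle_deg):
    """Rotate `point` about `joint` by angle_deg in the XY plane; applying
    this to each bone node adjusts both bone position and bone direction."""
    a = math.radians(angle_deg)
    dx, dy = point[0] - joint[0], point[1] - joint[1]
    return (joint[0] + dx * math.cos(a) - dy * math.sin(a),
            joint[1] + dx * math.sin(a) + dy * math.cos(a))
```

Applying `rotate_about` first to the child node and then propagating the result to the parent node mirrors the child-drives-parent adjustment described above.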
In the above scheme, the display module is further configured to present, in the picture of the virtual scene, a prop control including at least one candidate virtual prop, where the at least one candidate virtual prop includes the virtual shooting prop;
controlling the virtual object to assemble the virtual shooting prop in response to the selection operation of the prop control for the virtual shooting prop, and
and presenting the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop.
The embodiment of the application also provides electronic equipment, which comprises:
a memory for storing executable instructions;
and the processor is used for realizing the control method of the virtual prop provided by the embodiment of the application when executing the executable instructions stored in the memory.
The embodiment of the application also provides a computer readable storage medium which stores executable instructions, and when the executable instructions are executed by a processor, the control method of the virtual prop provided by the embodiment of the application is realized.
The embodiment of the application also provides a computer program product, which comprises a computer program or instructions, and when the computer program or instructions are executed by a processor, the control method of the virtual prop provided by the embodiment of the application is realized.
The embodiment of the application has the following beneficial effects:
By applying the embodiments of the application, the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop are presented in the picture of the virtual scene; when the motion state of the virtual object in the virtual scene changes, the state of the virtual shooting prop is controlled to change synchronously, and while the state of the virtual shooting prop is changing, the object aimed at by the virtual shooting prop is controlled to remain consistent with the object at the position of the sight pattern. In this way, when the motion state of the virtual object changes, the prop and its sight pattern remain aimed at the same position on screen. This improves the realism of the virtual scene, reduces the number of operations needed to aim, improves human-computer interaction efficiency, and reduces the consumption of hardware processing resources.
Drawings
Fig. 1 is a schematic architecture diagram of a virtual prop control system 100 according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device 500 implementing a method for controlling a virtual prop according to an embodiment of the present application;
fig. 3 is a schematic diagram of a man-machine interaction engine installed in a control device for a virtual prop according to an embodiment of the present application;
fig. 4 is a flow chart of a method for controlling a virtual prop according to an embodiment of the present application;
FIG. 5 is a schematic representation of a virtual shooting prop held by a virtual object and a corresponding sight pattern according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram showing a sight pattern during a state change of a virtual shooting prop according to an embodiment of the present application;
FIG. 7 is a schematic rotation diagram of a virtual shooting prop provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a virtual shooting prop in the related art;
FIG. 9 is a schematic diagram of actual changes of a virtual shooting prop when a motion state of a virtual object is changed according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a three-dimensional movement space of a virtual shooting prop provided by an embodiment of the present application;
FIG. 11 is a schematic illustration of the rotational axis points of a virtual shooting prop provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of a virtual shooting prop swinging according to an inertial animation provided by an embodiment of the present application;
fig. 13 is a flow chart of a method for controlling a virtual prop according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present application; all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of protection of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In the following description, the terms "first", "second", "third" and the like are merely used to distinguish similar objects and do not denote a particular ordering of the objects. It is to be understood that, where permitted, the objects referred to as "first", "second" and "third" may be interchanged in a specific order or sequence, so that the embodiments of the application described herein can be practiced in orders other than those illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the present application.
Before further describing embodiments of the present application in detail, the terms and expressions that are referred to in the embodiments of the present application are described, and are suitable for the following explanation.
1) Client: an application program running in a terminal to provide various services, such as a game client, an instant messaging client, or a browser client.
2) "In response to": indicates the condition or state on which a performed operation depends; when the condition or state is satisfied, the one or more operations performed may occur in real time or with a set delay. Unless otherwise specified, no limitation is placed on the order in which multiple operations are performed.
3) Virtual scene: the scene that an application program displays (or provides) when running on a terminal. The virtual scene may be a simulated environment of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene; the dimension of the virtual scene is not limited in the embodiments of the present application. For example, a virtual scene may include sky, land and sea; the land may include environmental elements such as deserts and cities; and the user may control a virtual object to move in the virtual scene.
4) Virtual object: the image of any person or thing that can interact in a virtual scene, or a movable object in a virtual scene. The movable object may be a virtual character, a virtual animal, a cartoon character, or the like, such as a character, animal, plant, oil drum, wall or stone displayed in the virtual scene. The virtual object may be a virtual avatar representing the user in the virtual scene. A virtual scene may include a plurality of virtual objects, each of which has its own shape and volume in the virtual scene and occupies part of the space in the virtual scene.
Alternatively, the virtual object may be a user character controlled through operations on the client, an artificial intelligence (AI) model trained for combat in the virtual scene, or a non-player character (NPC) set for interaction in the virtual scene. Alternatively, the virtual object may be a virtual character that performs antagonistic interactions in the virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients joining the interaction.
Taking shooting games as an example, the user can control a virtual object to fall freely, glide or open a parachute in the sky of the virtual scene; to run, jump, crawl, or bend over and move forward on land; and to swim, float or dive in the ocean. Of course, the user may also control the virtual object to move through the virtual scene in a carrier virtual prop, for example a virtual automobile, a virtual aircraft or a virtual yacht; and may control the virtual object to interact antagonistically with other virtual objects through attack-class virtual props, e.g., virtual shooting props (such as virtual firearms or virtual bows), virtual machine guns, virtual tanks or virtual warplanes. The embodiments of the present application describe control of a virtual shooting prop.
5) Scene data: data representing the various features exhibited by objects in a virtual scene during interaction; it may include, for example, the positions of objects in the virtual scene. Different types of features may be included depending on the type of virtual scene. For example, in a game virtual scene, scene data may include the waiting time configured for various functions in the virtual scene (depending on how many times the same function can be used within a specific time), and may also represent attribute values of the various states of a game character, for example a life value (also referred to as the red bar), a magic value (also referred to as the blue bar), a state value, a health value, and the like.
Based on the above explanation of the terms involved in the embodiments of the present application, the virtual prop control system provided in the embodiments of the present application is described below. Referring to fig. 1, fig. 1 is a schematic architecture diagram of a virtual prop control system 100 provided in an embodiment of the present application. To support an exemplary application, terminals (terminal 400-1 and terminal 400-2 are shown as examples) are connected to the server 200 through a network 300, where the network 300 may be a wide area network, a local area network, or a combination of the two, and uses wireless or wired links to implement data transmission.
The terminals (e.g., terminal 400-1 and terminal 400-2) are configured to send an acquisition request for scene data of a virtual scene to the server 200 upon receiving, at the view interface, a trigger operation for entering the virtual scene;
the server 200 is configured to receive an acquisition request of scene data, and return the scene data of the virtual scene to the terminal in response to the acquisition request;
terminals (e.g., terminal 400-1 and terminal 400-2) for receiving scene data of a virtual scene, rendering a picture of the virtual scene based on the scene data, and presenting the picture of the virtual scene at a graphical interface (graphical interface 410-1 and graphical interface 410-2 are exemplarily shown); the virtual scene picture can also present an object interaction environment, an interaction object and the like, and the content presented by the virtual scene picture is rendered based on the returned virtual scene data.
In practical applications, the server 200 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms. The terminals (e.g., terminal 400-1 and terminal 400-2) may be, but are not limited to, smart phones, tablet computers, notebook computers, desktop computers, smart speakers, smart televisions, and smart watches. A terminal (e.g., terminal 400-1 or terminal 400-2) and the server 200 may be directly or indirectly connected through wired or wireless communication; the present application is not limited in this respect.
In practical applications, the terminals (including terminal 400-1 and terminal 400-2) install and run an application program supporting virtual scenes. The application program may be any one of a first-person shooter (FPS) game, a third-person shooter game, a multiplayer online battle arena (MOBA) game, a two-dimensional (2D) game application, a three-dimensional (3D) game application, a virtual reality application, a three-dimensional map program, or a multiplayer warfare survival game. The application may also be a stand-alone application, such as a stand-alone 3D game program.
The virtual scene involved in the embodiments of the present application may be used to simulate a two-dimensional virtual space, a three-dimensional virtual space, or the like. Taking the virtual scene simulating a three-dimensional virtual space as an example, the three-dimensional virtual space may be an open space, and the virtual scene may be used to simulate a real environment; for example, the virtual scene may include sky, land, ocean, etc., and the land may include environmental elements such as deserts and cities. Of course, the virtual scene may also include virtual objects, such as buildings, vehicles, and props such as weapons that virtual objects in the virtual scene need in order to arm themselves or fight with other virtual objects. The virtual scene can also be used to simulate real environments in different weather, such as sunny, rainy, foggy, or night weather. The virtual object may be an avatar representing a user, and may take any form in the virtual scene, such as a simulated character or a simulated animal, which is not limited in the present application. In actual implementation, a user may use a terminal (e.g., terminal 400-1) to control a virtual object to perform activities in the virtual scene, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
Taking an electronic game scene as an exemplary scene, a user may operate on the terminal in advance; after the terminal detects the user's operation, it may download a game configuration file of the electronic game, where the game configuration file may include an application program, interface display data, virtual scene data, and the like of the electronic game, so that the user can call the game configuration file when logging in to the electronic game on the terminal to render and display an electronic game interface. After the terminal detects a touch operation, it may determine game data corresponding to the touch operation, and render and display the game data, where the game data may include virtual scene data, behavior data of virtual objects in the virtual scene, and the like.
In practical applications, a terminal (including the terminal 400-1 and the terminal 400-2) receives, based on a view interface, a trigger operation for entering the virtual scene, and sends a request for acquiring scene data of the virtual scene to the server 200; the server 200 receives the acquisition request for the scene data, and returns the scene data of the virtual scene to the terminal in response to the acquisition request; the terminal receives the scene data of the virtual scene, renders a picture of the virtual scene based on the scene data, and presents the picture of the virtual scene.
Further, the terminal presents the virtual shooting prop (such as a virtual firearm) held by the virtual object (namely, the virtual image corresponding to the user logging in the electronic game) and the sight pattern corresponding to the virtual shooting prop in the picture of the virtual scene; when the motion state of the virtual object in the virtual scene changes, the state of the virtual shooting prop is controlled to synchronously change, and in the process of changing the state of the virtual shooting prop, the object aimed by the virtual shooting prop is controlled to be consistent with the object corresponding to the position of the sight pattern.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device 500 implementing a method for controlling a virtual prop according to an embodiment of the present application. In practical applications, the electronic device 500 may be a server or a terminal shown in fig. 1, and the electronic device 500 is taken as an example of the terminal shown in fig. 1, to describe an electronic device implementing the method for controlling a virtual prop according to the embodiment of the present application, where the electronic device 500 provided in the embodiment of the present application includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in electronic device 500 are coupled together by bus system 540. It is appreciated that the bus system 540 is used to enable connected communications between these components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to the data bus. The various buses are labeled as bus system 540 in fig. 2 for clarity of illustration.
The processor 510 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor (e.g., a microprocessor or any conventional processor), a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like.
The user interface 530 includes one or more output devices 531 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 530 also includes one or more input devices 532, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 550 may optionally include one or more storage devices physically located remote from processor 510.
Memory 550 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a random access Memory (RAM, random Access Memory). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
network communication module 552, used to reach other computing devices via one or more (wired or wireless) network interfaces 520; exemplary network interfaces 520 include: Bluetooth, Wireless Fidelity (WiFi), universal serial bus (USB, Universal Serial Bus), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating a peripheral device and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
the input processing module 554 is configured to detect one or more user inputs or interactions from one of the one or more input devices 532 and translate the detected inputs or interactions.
In some embodiments, the control device for a virtual prop provided in the embodiments of the present application may be implemented in software, and fig. 2 shows a control device 555 for a virtual prop stored in a memory 550, which may be software in the form of a program and a plug-in, and includes the following software modules: a presentation module 5551 and a control module 5552, which are logical, and thus may be arbitrarily combined or further split according to the implemented functions, the functions of each module will be described below.
In other embodiments, the control device for a virtual prop provided in the embodiments of the present application may be implemented by combining software and hardware, and by way of example, the control device for a virtual prop provided in the embodiments of the present application may be a processor in the form of a hardware decoding processor that is programmed to perform the control method for a virtual prop provided in the embodiments of the present application, for example, the processor in the form of a hardware decoding processor may use one or more application specific integrated circuits (ASIC, application Specific Integrated Circuit), DSP, programmable logic device (PLD, programmable Logic Device), complex programmable logic device (CPLD, complex Programmable Logic Device), field programmable gate array (FPGA, field-Programmable Gate Array), or other electronic components.
In some embodiments, the terminal or the server may implement the method for controlling virtual props provided in the embodiments of the present application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; a native application (APP), i.e., a program that needs to be installed in an operating system to run, such as a client supporting virtual scenes, e.g., a game APP; an applet, i.e., a program that only needs to be downloaded into a browser environment to run; or an applet that can be embedded in any APP. In general, the computer program described above may be any form of application, module, or plug-in.
In some embodiments, a man-machine interaction engine for implementing the picture display method of a virtual scene is installed in the control device 555 of a virtual prop, where the man-machine interaction engine includes functional modules, components, or plug-ins for implementing the method for controlling a virtual prop. Fig. 3 is a schematic diagram of the man-machine interaction engine installed in the control device of a virtual prop provided in the embodiment of the present application; referring to fig. 3, the virtual scene is taken as a game scene as an example, and correspondingly, the man-machine interaction engine is a game engine.
The game engine is a set of machine-recognizable codes (instructions) designed for a machine that runs a certain kind of game; like an engine, it controls the running of the game. A game program can be divided into two parts, the game engine and the game resources, where the game resources include images, sounds, animations, and the like; that is, game = engine (program code) + resources (images, sounds, animations, etc.), and the game engine calls the resources in sequence according to the requirements of the game design.
The method for controlling the virtual prop provided in the embodiment of the present application may be implemented by each module in the control device for the virtual prop shown in fig. 2 by calling a related module, component or plug-in of the game engine shown in fig. 3, and an exemplary description is given below of the module, component or plug-in included in the game engine shown in fig. 3.
As shown in fig. 3, the game engine includes: 1) A virtual camera, used for presenting the game scene picture; one game scene corresponds to at least one virtual camera, and there may be two or more according to actual needs. Serving as the rendering window of the game, the virtual camera captures and displays the picture content of the game world for the player, and by setting the parameters of the virtual camera, the viewing angle from which the player views the game world, such as a first-person viewing angle or a third-person viewing angle, can be adjusted.
2) Scene organization, used for game scene management, such as collision detection, visibility culling, etc.; collision detection may be realized by colliders, which, according to actual needs, may be implemented as an axis-aligned bounding box (AABB, Axis-Aligned Bounding Box) or an oriented bounding box (OBB, Oriented Bounding Box); visibility culling may be implemented based on a view frustum, which is a three-dimensional frame generated according to the virtual camera and used to cull objects outside the visible range of the camera: objects inside the view frustum are projected to the view plane, while objects not inside it are discarded and not processed.
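As an illustration of the axis-aligned-bounding-box test mentioned above, the following minimal sketch (the function name is hypothetical, not part of the engine described here) checks whether two AABBs overlap; two boxes intersect only when their extents overlap on every axis:

```python
def aabb_intersect(min_a, max_a, min_b, max_b):
    # Two axis-aligned bounding boxes overlap only when their
    # [min, max] extents overlap on every one of the three axes.
    return all(min_a[i] <= max_b[i] and min_b[i] <= max_a[i] for i in range(3))

# Overlapping boxes
print(aabb_intersect((0, 0, 0), (1, 1, 1), (0.5, 0.5, 0.5), (2, 2, 2)))  # True
# Disjoint along the x axis
print(aabb_intersect((0, 0, 0), (1, 1, 1), (2, 0, 0), (3, 1, 1)))        # False
```

An OBB test works similarly but requires projecting both boxes onto candidate separating axes, which is why AABBs are preferred when cheap checks suffice.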
3) A terrain component, used for creating and editing game terrain, such as creating terrains like mountains, canyons, and caves in game scenes.
4) An editor, an auxiliary tool in a game design, comprising:
the scene editor is used for editing the content of the game scene, such as changing the topography, customizing vegetation distribution, lamplight layout and the like;
a model editor for creating and editing a model in a game (character model in a game scene);
The special effect editor is used for editing special effects in the game picture;
and the action editor is used for defining and editing actions of the characters in the game screen.
5) A special effect component, used for making and editing game special effects in the game picture; in practical applications, particle special effects and texture UV animations may be adopted. A particle special effect combines innumerable single particles so that they take on a fixed form, and controls the movement of the particles as a whole or individually through a controller and a script, simulating effects such as water, fire, fog, and gas in reality; a UV animation is a texture animation achieved by dynamically modifying the UV coordinates of the map.
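The UV animation just described can be sketched as scrolling the texture coordinates over time (a minimal illustration with hypothetical parameter names, not the engine's actual API):

```python
def scrolled_uv(time_s, u_speed, v_speed):
    # Dynamically offset the map's UV coordinates by elapsed time and
    # wrap them into [0, 1) so the texture tiles seamlessly as it scrolls.
    return ((u_speed * time_s) % 1.0, (v_speed * time_s) % 1.0)

print(scrolled_uv(1.0, 0.5, 0.25))  # (0.5, 0.25)
print(scrolled_uv(3.0, 0.5, 0.25))  # (0.5, 0.75)
```

Evaluating this per frame and writing the result into the material's UV offset produces the flowing-texture effect without modifying the mesh.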
6) Skeletal animation, realized by using built-in bones to drive objects to move; it can be understood through the following two concepts:
bone: an abstract concept used to control the skin, such as the human skeleton controlling the skin;
skin: the externally displayed element controlled by the bones, such as human skin, which is affected by the bones.
7) Morph animation: i.e., a morphing animation, an animation achieved by adjusting the vertices of the base model.
8) And the UI control is used for realizing the control of game picture display.
9) The bottom layer algorithm, the algorithm required to be invoked for realizing the functions in the game engine, such as the graphics algorithm required by the scene organization, realizes the matrix transformation and the vector transformation required by the skeleton animation.
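The matrix and vector transformations that skeletal animation requires can be illustrated with linear-blend skinning in 2-D, where a vertex's final position is the weight-blended result of each influencing bone's transform (a simplified sketch under that assumption; real engines use 4x4 matrices and usually skin on the GPU):

```python
def skin_vertex(vertex, bone_matrices, weights):
    # Linear-blend skinning: transform the vertex by each bone's 2x3
    # affine matrix [[a, b, tx], [c, d, ty]] and blend by the weights.
    x = y = 0.0
    for m, w in zip(bone_matrices, weights):
        x += w * (m[0][0] * vertex[0] + m[0][1] * vertex[1] + m[0][2])
        y += w * (m[1][0] * vertex[0] + m[1][1] * vertex[1] + m[1][2])
    return (x, y)

identity = [[1, 0, 0], [0, 1, 0]]
shift_x = [[1, 0, 2], [0, 1, 0]]  # translate 2 units along x
# A vertex influenced equally by a static bone and a moved bone
print(skin_vertex((1.0, 1.0), [identity, shift_x], [0.5, 0.5]))  # (2.0, 1.0)
```

The blended result lands halfway between the two bones' positions, which is exactly how skin near a joint follows partially moved bones.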
10) A rendering component, a component necessary for presenting the game picture effects; rendering converts a scene described by three-dimensional vectors into a scene described by two-dimensional pixels, and includes model rendering and scene rendering.
11) A* pathfinding, an algorithm for finding the shortest path, used for path planning, pathfinding, and graph traversal in game design.
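A minimal grid-based sketch of such a shortest-path search (A* with a Manhattan-distance heuristic; the grid encoding and function name are illustrative assumptions, not the engine's API):

```python
import heapq

def a_star(grid, start, goal):
    # grid: 2D list where 0 = walkable and 1 = blocked; 4-connected moves.
    def h(p):  # Manhattan-distance heuristic, admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start, [start])]  # (f, g, node, path so far)
    seen = set()
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                step = (nr, nc)
                heapq.heappush(open_heap, (g + 1 + h(step), g + 1, step, path + [step]))
    return None  # goal unreachable

path = a_star([[0, 0, 0], [0, 0, 0], [0, 0, 0]], (0, 0), (2, 2))
print(len(path))  # 5 nodes: four moves from corner to corner
```

Because the heuristic never overestimates the remaining cost, the first time the goal is popped the path is guaranteed shortest.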
For example, interaction between a user and the game can be realized by calling the UI control in the game engine shown in fig. 3; a two-dimensional or three-dimensional model is made by calling the Morph animation part of the game engine; after the model is made, texture maps are assigned to the model surface by surface through the skeletal animation part, which is equivalent to covering the skeleton with skin; and finally, all the effects such as the model, animation, light and shadow, and special effects are calculated in real time through the rendering component and displayed on the man-machine interaction interface. Specifically, the presentation module 5551 may render the virtual scene data by invoking the rendering component in the game engine shown in fig. 3 to obtain the picture of the virtual scene, and, in the picture of the virtual scene, present the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop.
The control module 5552 may determine, by invoking the underlying algorithm portion in the game engine shown in fig. 3, that when the motion state of the virtual object in the virtual scene changes, the state of the virtual shooting prop is controlled to change synchronously, and in the process of changing the state of the virtual shooting prop, the object aimed by the virtual shooting prop is controlled to be consistent with the object corresponding to the position of the sight pattern.
Based on the above description of the control system and the electronic device for the virtual prop provided in the embodiments of the present application, the control method for the virtual prop provided in the embodiments of the present application is described below. In some embodiments, the method for controlling a virtual prop provided in the embodiments of the present application may be implemented by a server or a terminal alone or in conjunction with the server and the terminal, and the method for controlling a virtual prop provided in the embodiments of the present application is described below by taking the terminal implementation as an example. Referring to fig. 4, fig. 4 is a flow chart of a method for controlling a virtual prop provided in an embodiment of the present application, where the method for controlling a virtual prop provided in the embodiment of the present application includes:
step 101: and the terminal presents the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop in the picture of the virtual scene.
When a user opens an application client on the terminal and the terminal runs the application client, the terminal presents a picture of a virtual scene (such as a shooting game scene), wherein the picture of the virtual scene is obtained by observing the virtual scene from a visual angle of a virtual object, and the virtual object is a virtual object in the virtual scene corresponding to the current user account. In the virtual scene, a user may control a virtual object to perform an action through a picture of the virtual scene. Here, the virtual object may hold a virtual shooting prop, which may be any prop used when the virtual object interacts with other virtual objects, for example, a virtual gun, a virtual bow, or the like.
In practical applications, the virtual object may possess at least one virtual shooting prop, and the user may control the virtual object to select the virtual shooting prop from the at least one virtual prop, so as to hold or assemble the virtual shooting prop, such that the virtual shooting prop held by the virtual object and the sight pattern of the virtual shooting prop are presented in the picture of the virtual scene. The aiming direction of the sight pattern is the shooting direction of the virtual camera of the virtual scene (corresponding to the user's eyes, the virtual camera captures the virtual scene to obtain a scene picture, all or part of which is displayed on the object interaction interface), so as to assist the user in performing aiming operations on shooting targets through the virtual shooting prop.
In some embodiments, in the picture of the virtual scene, the terminal may present the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop in the following manner: in a picture of a virtual scene, presenting a prop control comprising at least one candidate virtual prop, the at least one candidate virtual prop comprising a virtual shooting prop; and responding to the selection operation of the prop control for the virtual shooting prop, controlling the virtual object to assemble the virtual shooting prop, and presenting the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop.
Here, before or during the presentation of the virtual scene, the terminal may present prop controls for at least one candidate virtual prop for the user to select, the at least one candidate virtual prop including the virtual shooting prop. In practical applications, a prop control is an icon corresponding to a virtual prop that can be used in the virtual scene. When the user triggers a selection operation on the prop control of the presented virtual shooting prop, the terminal receives and responds to the selection operation, controls the virtual object to assemble the virtual shooting prop, and presents the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop.
In practical applications, when a user triggers a selection operation of a prop control for a virtual shooting prop, the prop control of the selected virtual shooting prop may be displayed in a target display style so that the display style of the prop control of the selected virtual shooting prop is different from the display style of the prop control of the unselected virtual prop, for example, the prop control of the selected virtual shooting prop is highlighted, and the prop controls of other unselected virtual props are not highlighted.
For example, referring to fig. 5, fig. 5 is a schematic representation of a virtual shooting prop held by a virtual object and the corresponding sight pattern according to an embodiment of the present application. Here, as shown in diagram A in fig. 5, the terminal presents the prop controls 51-54 of 4 virtual props in the picture of the virtual scene; as shown in diagram B in fig. 5, when a selection operation on the prop control 52 of the virtual shooting prop is received, the virtual object is controlled to hold the virtual shooting prop, and the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop are presented. Meanwhile, the prop control 52 of the selected virtual shooting prop is presented by highlighting.
Step 102: when the motion state of the virtual object in the virtual scene changes, the state of the virtual shooting prop is controlled to synchronously change, and in the process of changing the state of the virtual shooting prop, the object aimed by the virtual shooting prop is controlled to be consistent with the object corresponding to the position of the sight pattern.
Here, after the terminal displays the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop in the picture of the virtual scene, the user can control the motion state of the virtual object to realize the operations of controlling the virtual shooting prop to select and interact shooting targets based on the sight pattern. In practical applications, the motion state of the virtual object in the virtual scene may include at least one of a stationary state, a moving state (such as walking, running, etc.), a jumping state, and a steering state. The user can trigger a motion state change instruction, such as a steering instruction, a jump instruction, a movement instruction and the like, for the virtual object through a trigger operation for the man-machine interaction interface. In practical applications, the motion state change instruction may be triggered by a function item such as an operation button or an operation key in the man-machine interaction interface, for example, a jump function item may be used for a user to trigger a jump instruction for a virtual object.
When the terminal receives a motion state change instruction aiming at the virtual object, the motion state of the virtual object is controlled to change in response to the motion state change instruction, for example, when the current motion state of the virtual object is in a static state, if a jump instruction is received, the virtual object is controlled to execute jump in response to the jump instruction, so that the virtual object is switched from the static state to the jump state.
When the motion state of the virtual object in the virtual scene changes, the state of the virtual shooting prop is controlled to synchronously change because the virtual object holds the virtual shooting prop. Here, synchronization is only temporal synchronization, i.e., the state of the control virtual shooting prop changes while the state of the motion of the control virtual object changes. The state of the virtual shooting prop may also be understood as a motion state of the virtual shooting prop including at least one of a stationary state, a moving state (e.g., walking, running, etc.), a jumping state, and a steering state.
In the embodiment of the application, the terminal also controls the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern in the process of changing the state of the virtual shooting prop. Therefore, in the state change process of the virtual object and the virtual shooting prop, the position indicated by the direction of the virtual shooting prop and the position of the sight pattern can be kept consistent, the condition that the sight position is inconsistent with the shooting port pointing position of the virtual shooting prop in vision is avoided, the simulation degree of the virtual scene is improved, and the experience of a user in the virtual scene is improved.
As an example, referring to fig. 6, fig. 6 is a schematic view showing a sight pattern during a state change of a virtual shooting prop according to an embodiment of the present application. Here, the object aimed by the virtual shooting prop is M, and the object corresponding to the position of the sight pattern is also M, so that the effect of controlling the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern is achieved.
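One common way to keep the object the prop aims at consistent with the object at the sight pattern, as in the example of fig. 6, is to find the point the camera ray through the crosshair hits and then orient the muzzle toward that same point. This is an illustrative sketch under that assumption (fixed `aim_distance` standing in for a real raycast hit), not necessarily how the embodiments implement it:

```python
import math

def muzzle_direction(camera_pos, camera_dir, muzzle_pos, aim_distance=100.0):
    # Point on the camera ray that the sight pattern "hits" (aim_distance
    # stands in for the raycast hit distance), then the unit direction the
    # muzzle must face so prop and crosshair aim at the same object M.
    hit = tuple(c + aim_distance * d for c, d in zip(camera_pos, camera_dir))
    v = tuple(h - m for h, m in zip(hit, muzzle_pos))
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

# With the muzzle at the camera position, the directions coincide exactly.
print(muzzle_direction((0, 0, 0), (0, 0, 1), (0, 0, 0)))  # (0.0, 0.0, 1.0)
```

Recomputing this direction every frame while the prop's state changes keeps the shooting-port direction visually consistent with the sight position.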
In some embodiments, the terminal may control the motion state of the virtual object to change by: when the motion state of the virtual object in the virtual scene is a non-steering state, a steering instruction aiming at the virtual object is received; in response to the steering instruction, the virtual object is controlled to perform a steering operation in a direction indicated by the steering instruction, so that a motion state of the virtual object in the virtual scene is switched from a non-steering state to a steering state.
Here, when the motion state of the virtual object in the virtual scene is a non-steering state, the user may control the virtual object to steer. Specifically, the user may trigger a steering instruction for the virtual object, for example, may be triggered by operating a function control, operating a function item, or may also be triggered by sliding a man-machine interaction interface. After the terminal receives the steering instruction, the terminal responds to the steering instruction and controls the virtual object to execute steering operation along the direction indicated by the steering instruction, so that the motion state of the virtual object in the virtual scene is switched from a non-steering state to a steering state. For example, if the direction indicated by the steering instruction is the left direction, the virtual object is controlled to perform the steering operation in the left direction.
In the process of executing the steering operation by the virtual object, the virtual shooting prop can be controlled to synchronously steer according to the rotation direction and the rotation angular velocity of the virtual object, so that the state of the virtual shooting prop and the motion state of the virtual object can be synchronously changed.
In some embodiments, when the motion state of the virtual object in the virtual scene changes, the terminal may control the state of the virtual shooting prop to change synchronously in the following manner: obtaining a deflection coordinate corresponding to the direction indicated by the steering instruction, and obtaining a basic deflection animation corresponding to the deflection coordinate; determining a prop deflection animation of the virtual shooting prop after the steering operation is completed, based on the deflection coordinate and the basic deflection animation; and controlling, based on the prop deflection animation, the state of the virtual shooting prop to change synchronously.
Here, after the virtual object is controlled to perform the steering operation in the direction indicated by the steering instruction, the motion state of the virtual object in the virtual scene is switched from the non-steering state to the steering state. At this time, the state of the virtual shooting prop needs to be controlled to change synchronously. First, the deflection coordinate corresponding to the direction indicated by the steering instruction is obtained. In actual implementation, the steering instruction may be triggered by a sliding operation in the man-machine interaction interface; the displacement coordinates corresponding to the sliding operation, namely the coordinate of the start position when the sliding operation is triggered and the coordinate of the end position when the sliding operation ends, are obtained; the displacement corresponding to the sliding operation is determined from the coordinates of the start and end positions, and is normalized to obtain the deflection coordinate corresponding to the direction indicated by the steering instruction.
Meanwhile, a basic deflection animation corresponding to the deflection coordinate is acquired. The basic deflection animations may be preset; for example, limit deflection animations for "up, down, left, right, left-up, left-down, right-up, and right-down" may be preset as the basic deflection animations, with corresponding coordinate sections such as [0,1], [1,0], [1,-1], [0,-1], [-1,0], and [-1,1] set respectively. For example, when the deflection coordinate represents that the virtual shooting prop deflects to the left, the animation deflected to the left limit may be obtained as the basic deflection animation corresponding to the deflection coordinate.
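The normalization and coordinate-section lookup described above can be sketched as follows. The direction-to-animation mapping and function names are assumptions for illustration; the embodiment's exact coordinate sections for the eight preset limit deflection animations may differ:

```python
def deflection_coordinate(start, end):
    # Normalize the sliding operation's displacement into [-1, 1] per axis.
    dx, dy = end[0] - start[0], end[1] - start[1]
    n = max(abs(dx), abs(dy)) or 1  # avoid dividing by zero for a tap
    return (dx / n, dy / n)

def base_deflection_animation(coord, eps=1e-6):
    # Pick one of the eight preset limit deflection animations from the
    # sign of each normalized axis (hypothetical animation names).
    sign = lambda v: 0 if abs(v) < eps else (1 if v > 0 else -1)
    names = {(0, 1): "up", (0, -1): "down", (-1, 0): "left", (1, 0): "right",
             (-1, 1): "left-up", (-1, -1): "left-down",
             (1, 1): "right-up", (1, -1): "right-down"}
    return names.get((sign(coord[0]), sign(coord[1])))  # None when no movement

coord = deflection_coordinate((200, 300), (120, 300))  # swipe to the left
print(coord, base_deflection_animation(coord))  # (-1.0, 0.0) left
```

The magnitude of the normalized coordinate can then be used to blend between the initial animation and the chosen limit animation.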
Then, the prop deflection animation of the virtual shooting prop after the steering operation is finished is determined based on the deflection coordinate and the basic deflection animation, where the prop deflection animation represents the state of the virtual shooting prop after the steering operation is finished. In this way, based on the prop deflection animation, the state of the virtual shooting prop is controlled to change synchronously.
In some embodiments, based on the prop deflection animation, the terminal may control the state of the virtual shooting prop to change synchronously in the following manner: acquiring an initial animation of the virtual shooting prop before the virtual object performs the steering operation; determining, based on the prop deflection animation and the initial animation, the displacement length of the shooting port of the virtual shooting prop during the virtual object's performance of the steering operation; when the displacement length does not exceed a length threshold, controlling the state of the virtual shooting prop to change according to the state indicated by the prop deflection animation; and when the displacement length exceeds the length threshold, obtaining a target rotation angle within the target deflection range, and controlling the virtual shooting prop to rotate according to the target rotation angle, so that the state of the virtual shooting prop and the motion state of the virtual object change synchronously.
Here, based on the prop deflection animation, the terminal can control the state of the virtual shooting prop to change synchronously in the following manner: first, the initial animation of the virtual shooting prop before the virtual object performs the steering operation is obtained; the displacement length of the shooting port (such as the muzzle) of the virtual shooting prop during the steering operation is then determined based on the prop deflection animation and the initial animation. In practical applications, a target deflection range of the virtual shooting prop is preset, so it is necessary to judge whether the displacement length exceeds the length threshold corresponding to the target deflection range. When the displacement length does not exceed the length threshold, it is characterized that the prop deflection animation keeps the virtual shooting prop within the preset target deflection range, and the state of the virtual shooting prop is controlled to change according to the state indicated by the prop deflection animation.
When the displacement length exceeds the length threshold, the prop deflection animation is characterized to enable the virtual shooting prop to exceed the preset target deflection range. At this time, a target rotation angle in the target deflection range is obtained, and the virtual shooting prop is controlled to rotate according to the target rotation angle, so that the state of the virtual shooting prop and the motion state of the virtual object are synchronously changed.
In some embodiments, the terminal may obtain the target rotation angle within the target deflection range in the following manner: determining the route through which the shooting port of the virtual shooting prop passes while the virtual object performs the steering operation, and determining, on the route, the position at the length threshold away from the initial position of the shooting port as the target position; and determining the rotation angle by which the shooting port rotates from the initial position to the target position as the target rotation angle.

Specifically, the terminal first determines the route through which the shooting port of the virtual shooting prop passes while the virtual object performs the steering operation, and then determines, on the route, the position at the length threshold away from the initial position of the shooting port as the target position; the holding point at which the virtual object holds the virtual shooting prop is then taken as the rotation axis point, the rotation angle by which the shooting port rotates from the initial position to the target position about the rotation axis point is determined, and this rotation angle is determined as the target rotation angle.
As an example, referring to fig. 7, fig. 7 is a rotation schematic diagram of a virtual shooting prop provided in an embodiment of the present application. As shown in diagram A of fig. 7, a stable point P1 is selected to indicate the shooting port of the virtual shooting prop, the grip point P0 at which the virtual object holds the virtual shooting prop is selected as the rotation axis point, and the length threshold of the target deflection range is d. As shown in diagram B of fig. 7, the actual deflection (i.e., the prop deflection animation) moves the shooting port from P1 to Pt, and the displacement length from P1 to Pt exceeds the length threshold d. At this time, a circle is drawn with P0 as the center and d as the radius (i.e., the boundary of the target deflection range); the intersection point Pt' between this circle and the route through which the shooting port passes is the target position at the length threshold away from the initial position of the shooting port. With P0 as the rotation axis point, the rotation angle θ by which the shooting port rotates from the initial position P1 to the target position Pt' is the target rotation angle.
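The clamping logic of fig. 7 can be sketched as follows. This is a minimal 2-D illustration, not the patent's implementation: the function and variable names are assumptions, and the muzzle route is approximated by the straight segment from P1 to Pt.

```python
import math

def clamp_muzzle_rotation(p0, p1, pt, d):
    """Clamp muzzle travel within the target deflection range.
    p0: grip point (rotation axis), p1: initial muzzle position,
    pt: deflected muzzle position, d: length threshold."""
    disp = math.dist(p1, pt)
    if disp <= d:
        # within the target deflection range: use the prop deflection animation as-is
        return pt, None
    # take the point Pt' at distance d from p1 along the route toward pt
    t = d / disp
    pt_prime = (p1[0] + t * (pt[0] - p1[0]), p1[1] + t * (pt[1] - p1[1]))
    # target rotation angle about p0, from p1 to Pt'
    a1 = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
    a2 = math.atan2(pt_prime[1] - p0[1], pt_prime[0] - p0[0])
    return pt_prime, a2 - a1
```

For example, with the grip point at the origin, an initial muzzle at (1, 0) and a deflection to (1, 2) under threshold d = 1, the clamped position is (1, 1) and the target rotation angle is π/4.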
In some embodiments, when the motion state of the virtual object after the change is the steering state or the jump state, after the state of the virtual shooting prop is controlled to be changed synchronously, the terminal may present the process of deflection rebound of the virtual shooting prop in the following manner: presenting an inertial movement process of the virtual shooting prop, which is caused by inertia and is adapted to a steering state or a jumping state; in the inertial movement process, the virtual shooting prop is controlled to rebound from a changed state to a stable state, and the stable state enables the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
Here, when the changed motion state of the virtual object is the steering state or the jump state, after the state of the virtual shooting prop is controlled to change synchronously, in practical application the virtual shooting prop may, due to inertia, deflect away from the standard animation of its stable state. At this moment, the terminal can present the process of the virtual shooting prop rebounding under inertia, so as to simulate the deflection-rebound process caused by real inertia.

In order to improve the realism of the virtual scene, the process of deflection rebound of the virtual shooting prop can be presented in the following manner: presenting the inertial movement process of the virtual shooting prop that is caused by inertia and adapted to the steering state or the jump state; during the inertial movement process, controlling the virtual shooting prop to rebound from the changed state to the stable state, so that the object aimed at by the virtual shooting prop is consistent with the object corresponding to the position of the sight pattern, thereby simulating the inertial rebound process of the virtual shooting prop.
In some embodiments, the terminal may present the inertial motion process of the virtual shooting prop due to inertia, adapted to the steering state or jump state, by: acquiring an actual animation of the virtual shooting prop after the state change and a standard animation of the virtual shooting prop in a stable state; when the virtual shooting prop is determined to be in a deflection state based on the actual animation and the standard animation, acquiring a corresponding inertial animation; and combining the actual animation, the standard animation and the inertia animation to present the inertia motion process of the virtual shooting prop, which is caused by inertia and is matched with the steering state or the jumping state.
When presenting the inertial motion process of the virtual shooting prop that is caused by inertia and adapted to the steering state or the jump state, the terminal can acquire the actual animation of the virtual shooting prop after the state change, namely the actual animation at the time point at which the state change finishes, and acquire the standard animation of the virtual shooting prop in the stable state. The standard animation can be preset, namely the animation of the virtual shooting prop in the standard posture with which the virtual object holds it. Then, based on the actual animation and the standard animation, whether the virtual shooting prop is in a deflection state is determined: when the actual animation is consistent with the standard animation, the virtual shooting prop is determined not to be in the deflection state; when they are inconsistent, the virtual shooting prop is determined to be in the deflection state. When the virtual shooting prop is determined to be in the deflection state, the terminal can acquire the corresponding inertial animation, and thus present, by combining the actual animation, the standard animation and the inertial animation, the inertial motion process of the virtual shooting prop that is caused by inertia and adapted to the steering state or the jump state.
In some embodiments, the terminal may combine the actual animation, the standard animation, and the inertial animation to present the inertial motion process of the virtual shooting prop due to inertia, which is adapted to the steering state or the jump state, in the following manner: acquiring an inertial rebound curve corresponding to the inertial animation; determining position coordinates corresponding to a target time point in the process of rebounding the virtual shooting prop from the actual animation to the standard animation based on the actual animation and the inertia rebounding curve; and displaying the inertial movement process of the virtual shooting prop from the state corresponding to the actual animation to the stable state corresponding to the standard animation based on the position coordinates corresponding to the target time point.
Here, the inertial animation is provided with a corresponding inertial rebound curve, which may be preset for the virtual shooting prop and describes the relationship between the position coordinates the virtual shooting prop passes through during the rebound and time within the inertial rebound period. After the inertial rebound curve corresponding to the inertial animation is obtained, the position coordinates corresponding to the target time points in the process of the virtual shooting prop rebounding from the actual animation to the standard animation are determined based on the actual animation and the inertial rebound curve. In actual implementation, the actual animation can be described by coordinates; the coordinates corresponding to the actual animation are multiplied by the inertial rebound curve, and the multiplied results are taken as the position coordinates corresponding to each target time point in the rebound process. Based on the position coordinates corresponding to the target time points, the inertial movement process of the virtual shooting prop from the state corresponding to the actual animation to the stable state corresponding to the standard animation is displayed; that is, the virtual shooting prop is controlled to pass through the position coordinates corresponding to each target time point in sequence, thereby displaying the inertial movement process from the state corresponding to the actual animation to the stable state corresponding to the standard animation.
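A minimal sketch of this "multiply the actual-animation coordinates by the rebound curve" sampling is given below. The curve shape and all names are illustrative assumptions (the patent only requires a preset curve decaying toward the stable state), here a damped oscillation standing in for the configurable inertial rebound curve.

```python
import math

def rebound_positions(actual_xy, steps, curve=None):
    """Sample the position coordinates passed through while the prop rebounds
    from the actual animation (actual_xy) back to the standard animation at (0, 0)."""
    if curve is None:
        # illustrative inertial rebound curve: value 1 at t=0, decaying toward 0
        curve = lambda t: math.exp(-5.0 * t) * math.cos(2.0 * math.pi * t)
    x, y = actual_xy
    # multiply the actual-animation coordinates by the curve at each target time point
    return [(x * curve(i / steps), y * curve(i / steps)) for i in range(steps + 1)]
```

The first sampled position coincides with the actual animation, and by the end of the rebound period the prop has effectively returned to the standard (stable) animation.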
In some embodiments, during the state change of the virtual shooting prop, the terminal may control the object aimed by the virtual shooting prop to coincide with the object corresponding to the position of the sight pattern in the following manner: determining a three-dimensional moving space corresponding to a shooting port of the virtual shooting prop; in the process of changing the state of the virtual shooting prop, the deviation of the shooting port relative to the current state is controlled to be in a three-dimensional moving space so as to control the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
In some embodiments, the terminal may determine the three-dimensional movement space corresponding to the firing port of the virtual firing prop by: acquiring position coordinates of a shooting port of the virtual shooting prop in a virtual scene and an offset radius corresponding to the virtual shooting prop; and determining the corresponding ball space as a three-dimensional moving space corresponding to the shooting port by taking the point indicated by the position coordinates as the center of the ball and the offset radius as the radius of the ball.
Here, a corresponding three-dimensional movement space may be set in advance for the shooting port (such as the muzzle) of the virtual shooting prop. Specifically, the shooting port (i.e., muzzle) hanging point of the virtual shooting prop (namely, the point indicated by the position coordinates of the shooting port in the virtual scene) is taken as the sphere center, the offset radius corresponding to the virtual shooting prop is taken as the sphere radius, and the corresponding spherical space is determined as the three-dimensional movement space of the virtual shooting prop. In this way, during the state change of the virtual shooting prop, the three-dimensional movement space corresponding to the shooting port can be determined, and the offset of the shooting port relative to the current state is constrained within the three-dimensional movement space. Thus, when the motion state of the virtual object changes, the object aimed at by the virtual shooting prop remains consistent with the object corresponding to the position of the sight pattern, errors are reduced, and animation consistency when switching between different motion states is ensured.
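The spherical constraint can be sketched as below; a minimal illustration with assumed names, clamping a proposed muzzle position back onto the sphere of the preset offset radius around the hang point.

```python
import math

def clamp_to_sphere(center, point, radius):
    """Constrain a proposed muzzle position inside the spherical three-dimensional
    movement space centered at the muzzle hanging point."""
    offset = [p - c for p, c in zip(point, center)]
    dist = math.sqrt(sum(v * v for v in offset))
    if dist <= radius:
        return tuple(point)  # already inside the movement space
    scale = radius / dist    # project back onto the sphere surface
    return tuple(c + v * scale for c, v in zip(center, offset))
```

Positions inside the sphere pass through unchanged; anything outside is pulled back to the boundary along the line from the hang point.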
In some embodiments, during the state change of the virtual shooting prop, the posture of the target portion with which the virtual object holds the virtual shooting prop transitions from a first posture to a second posture. At this time, the terminal may control the posture switching in the following manner: acquiring a first position of the holding point at which the virtual object holds the virtual shooting prop before the state change, and acquiring a second position of the holding point after the state change; determining a bone rotation angle corresponding to the target portion based on the first position and the second position; and adjusting the bone position and bone direction of the target portion based on the bone rotation angle, so as to control the posture of the target portion of the virtual object to change from the first posture to the second posture.
Here, during the state change of the virtual shooting prop, the posture of the target portion where the virtual object holds the virtual shooting prop may change, for example, change from the first posture to the second posture. At this time, in order to ensure that the displayed picture of the virtual object holding the virtual shooting prop is more simulated, the posture of the virtual object holding the virtual shooting prop can be adjusted.
In practical application, a first position of the holding point at which the virtual object holds the virtual shooting prop before the state change is first obtained, and a second position of the holding point after the state change is obtained. Then, based on the first position and the second position, the bone rotation angle corresponding to the target portion (such as an arm) with which the virtual object holds the virtual shooting prop before and after the state change is determined; the bone position and bone direction of the target portion are thereby adjusted based on the bone rotation angle, so as to control the posture of the target portion of the virtual object to change from the first posture to the second posture.
In some embodiments, the terminal may adjust the bone position and bone orientation of the target site based on the bone rotation angle by: determining a bone father node and a bone son node of the target part; based on the bone rotation angle, the positions of bone child nodes are adjusted, and the positions of bone father nodes are adjusted by adjusting the positions of the bone child nodes, so that the bone positions and the bone directions of the target parts are adjusted.
Here, the bone position and bone direction of the target portion may be adjusted based on the bone rotation angle by means of inverse kinematics. In actual implementation, it is necessary to first determine the bone parent node and the bone child node of the target portion; for example, the bone child node may correspond to the hand with which the virtual object holds the virtual shooting prop, and the bone parent node may correspond to the arm of the virtual object. Then, based on the bone rotation angle, the position of the bone child node is adjusted by means of inverse kinematics, so that when the position of the bone child node is adjusted, the position of the bone parent node is adjusted along with it, thereby adjusting the bone position and bone direction of the target portion. Finally, the posture of the target portion of the virtual object is controlled to change from the first posture to the second posture, so that after the state change of the virtual shooting prop is completed, the virtual object holds the virtual shooting prop in the standard posture.
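As an illustration of solving the parent bones back from the child-bone target, the following is a minimal planar two-bone IK sketch using the law of cosines. The function name, bone lengths and 2-D setup are assumptions for illustration, not the patent's implementation.

```python
import math

def two_bone_ik(shoulder, target, l1, l2):
    """Place the hand (bone child node) at the grip-point target, then derive
    the shoulder and elbow (bone parent chain) rotations by inverse kinematics.
    l1, l2: upper-arm and forearm bone lengths."""
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    dist = min(math.hypot(dx, dy), l1 + l2)  # clamp unreachable targets
    # interior elbow angle from the law of cosines
    cos_elbow = (l1 * l1 + l2 * l2 - dist * dist) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # shoulder rotation: aim at the target, then open by the triangle angle
    cos_a = (l1 * l1 + dist * dist - l2 * l2) / (2 * l1 * dist)
    shoulder_rot = math.atan2(dy, dx) - math.acos(max(-1.0, min(1.0, cos_a)))
    return shoulder_rot, elbow
```

With unit bones and a target at full reach along the x axis, the shoulder rotation is 0 and the elbow interior angle is π (the arm is fully extended).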
In the above embodiment, the above-mentioned animations (such as the initial animation, actual animation, inertial animation, standard animation, basic deflection animation, prop deflection animation, etc.) may be described by coordinates. As an example, the basic deflection animation may be 8 preset limit deflection animations for "up, down, left, right, up-left, down-left, up-right and down-right", which may be described by the corresponding coordinate intervals [0,1], [0,-1], [-1,0], [1,0], [-1,1], [-1,-1], [1,1] and [1,-1], respectively.
By applying the embodiment of the application, the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop are presented in the picture of the virtual scene; when the motion state of the virtual object in the virtual scene changes, the state of the virtual shooting prop is controlled to synchronously change, and in the process of changing the state of the virtual shooting prop, the object aimed by the virtual shooting prop is controlled to be consistent with the object corresponding to the position of the sight pattern. Therefore, when the motion state of the virtual object changes, as the object aimed by the virtual shooting prop is consistent with the object corresponding to the position of the sight pattern, the consistency of the aiming positions of the virtual shooting prop and the sight pattern on the picture display is ensured, the degree of reality of the virtual scene is improved, the operation times of aiming action can be reduced, the man-machine interaction efficiency is improved, and the occupation of hardware processing resources is reduced.
An exemplary application of the embodiments of the present application in an actual application scenario will be described below taking a virtual scenario as an example of an electronic game scenario.
Next, first, terms related to embodiments of the present application will be explained, including:
1) First person shooter game (FPS): shooting games are played with a subjective view of the player so that the player no longer plays by manipulating virtual characters alone, but experiences the visual impact of the game on his own.
2) ADS (aim down sights): aiming through the sight of the weapon, where the camera angle moves behind the weapon's sight. The sight itself is an observation device usually made of metal. When no scope is assembled, it is used to line up with the target in a straight line, assisting in aiming at a specific target; this makes the virtual shooting prop more accurate and may provide a certain zoom for higher usability at longer range. When a scope is assembled, there is usually a graduated scale or a specially designed sight line, mainly for magnifying the image of the target onto the retina so that aiming becomes easier and more accurate. The magnification is proportional to the diameter of the objective lens; a larger objective aperture makes the image clearer and brighter, though a high magnification may reduce the field of view.
3) Hip fire (waist-shot): shooting by directly clicking the shooting button without aiming down sights, in the state of normally holding the virtual shooting prop; the prop is held with the support of the waist or chest, and waist strength is used to resist recoil.
4) Front sight: a part of the aiming device of the virtual shooting prop, typically located at the upper end of the shooting port (i.e., muzzle). When the sight pattern is moved onto the target, this is detected by the target's hit collision body, and the bullet is launched automatically without requiring any other operation by the player.
5) FOV: field of view, the field of view of a camera, in degrees; the angular range over which the camera can receive images in a typical environment.
6) Breathing animation: the animation state simulating a player holding the virtual shooting prop without performing any other action, visually showing a slight hand sway.
7) Firing animation: the animation state simulating the player firing while holding the virtual shooting prop. The firing animation shakes due to the instantaneous recoil generated when the virtual shooting prop fires a bullet, which causes the sight of the virtual shooting prop to drift.
8) Inverse kinematics (IK): a method in which the positions of bone child nodes are determined first, and the positions of the n levels of bone parent nodes on the bone chain where the child nodes are located are then deduced by inverse calculation, thereby determining the entire bone chain; in particular, the shape of the whole skeleton is computed back from the final positions and angles of certain bone child nodes.
In the related art, in an FPS game that has various animation expressions and allows switching between different virtual shooting props (such as virtual firearms), the corresponding animation expressions are generally played according to the virtual shooting prop held under the different motion states of the virtual object (such as turning, moving or jumping). The animation expressions of different virtual shooting props usually need separate art animation resource support. Meanwhile, the animation expressions of the virtual shooting prop are not differentiated across different motion states, such as turning animation, jumping animation and moving animation, nor are they differentiated when the turning amplitude differs.
Referring to fig. 8, fig. 8 is a schematic diagram of a virtual shooting prop in the related art. Diagram A in fig. 8 shows the idle animation pose of normal gun holding; diagram B shows the idle animation pose of gun holding during a jump, where the muzzle is biased upward relative to the sight; diagram C shows the idle animation pose of gun holding during rightward rotation, where the muzzle is biased to the right relative to the sight; diagram D shows the idle animation pose of gun holding during leftward rotation, where the muzzle is biased to the left relative to the sight.
Referring to fig. 9, fig. 9 is a schematic diagram of the actual change of a virtual shooting prop when the motion state of the virtual object changes, according to an embodiment of the present application. As shown in diagrams A and B of fig. 9, when the player turns the lens slowly or rapidly (i.e., controls the virtual object to turn), the virtual shooting prop turns synchronously, and the muzzle animation amplitude when the lens turns slowly appears smaller than when it turns rapidly; as shown in diagrams C and D of fig. 9, the muzzle animation amplitude when the player moves is smaller than when the player jumps.
From the above, the animation amplitude cannot be differentiated for different steering amplitudes of the virtual shooting prop. That is, when the player rotates the lens slowly, the gun-holding animation sways slightly to simulate inertia, and when the rotation stops, the firearm stops swaying and enters the normal breathing-state animation. When the player rotates the lens rapidly, the steering speed of the lens increases, but the sway animation of the firearm cannot simulate the change in inertia and remains consistent with the animation of slow rotation; likewise, when the rotation stops, the firearm stops swaying and enters the normal breathing-state animation. It is therefore difficult to differentiate the influence of steering amplitude and speed on the animation, and the mismatch between the animation and the lens steering harms the realism. Moreover, when the player is in different motion states, such as moving, jumping and steering, the direction of the muzzle changes with the state; the aiming positions of the sight and the muzzle become inconsistent, so the pre-aim position of the muzzle before firing is inaccurate. In addition, a firing animation is played when firing, so the firing animation does not join up continuously before and after the motion state changes, causing visual inconsistency and poor presentation of the firing animation.
When the virtual object holds different virtual shooting props, animation resources need to be customized separately for the muzzle performance during movement and jumping. When the numerical values of a virtual shooting prop change or a new virtual shooting prop is made, the original animation resources no longer meet the requirements; the virtual shooting prop then does not match the animation, which affects the hit rate.
If a player uses different virtual shooting props, different animation expressions need to be displayed according to the prop, and corresponding art animation resources must support each of them. Animation resources for each state must then be made for every virtual shooting prop involved, so the demand on resource production capacity becomes high, the dependence on resources is large, and adaptation and modification are difficult.
Based on the above, the embodiment of the application provides a control method for a virtual prop. Referring to fig. 10, fig. 10 is a schematic diagram of the three-dimensional movement space of a virtual shooting prop provided in the embodiment of the present application. The shooting port (i.e., muzzle) hanging point of the virtual shooting prop is taken as the sphere center, and the offset radius corresponding to the virtual shooting prop is taken as the sphere radius, to determine the spherical movement space corresponding to the virtual shooting prop. The virtual shooting prop is then controlled to move within this spherical movement space, thereby ensuring that when the motion state of the virtual object changes, the orientation of the shooting port remains consistent with the pre-aim position (i.e., the position corresponding to the sight pattern), reducing errors and ensuring animation consistency when switching between different motion states.
Meanwhile, the turning radius and rotation axis point of the shooting port can be configured independently for different virtual shooting props, which guarantees the animation expression of different weapons without custom animation resources for each type of virtual shooting prop, thereby improving productivity and expressiveness. When the player controls the virtual shooting prop to turn, the performance of the turning animation is determined by the turning amplitude and speed, including the turning inertia performance; that is, the turning amplitude, frequency and round-trip period conform to inertial behavior. Different virtual shooting props can be configured independently to simulate physical inertia without depending on art resources, so both realism and productivity are ensured.
Therefore, under the switching of different motion states of the virtual object, the animation of the virtual shooting prop in the embodiment of the application is adapted to the animation expression of different virtual shooting props, and the problems of accurate directivity and animation simulation expression of the shooting port of the virtual shooting prop are solved.
In the embodiment of the application, different three-dimensional movement spaces can be set for different states of the shooting port (i.e., muzzle) of the virtual shooting prop. When the player is in the hip-fire state, the three-dimensional movement space of the shooting port can be determined with the muzzle hanging point as the sphere center and R1 as the sphere radius; at this moment, the rotation radius of the muzzle is fixed when the player turns. When the player is in the jump state, the three-dimensional movement space of the shooting port can be determined with the muzzle hanging point as the sphere center and R2 as the sphere radius. By constraining movement within the radius R1 or R2, an inaccurate muzzle orientation caused by excessive steering cannot occur, and different firearms and different states can be configured independently.
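This per-weapon, per-state radius configuration might be represented as a simple lookup table; the weapon names, state names and radius values below are purely illustrative assumptions, not values from the patent.

```python
# illustrative per-weapon, per-state offset radii for the spherical movement space
MUZZLE_SPACE_RADIUS = {
    ("assault_rifle", "hip_fire"): 0.05,  # R1
    ("assault_rifle", "jump"): 0.09,      # R2
    ("sniper_rifle", "hip_fire"): 0.03,
}

def movement_radius(weapon, state, default=0.05):
    """Look up the sphere radius for a weapon/state pair, with a fallback."""
    return MUZZLE_SPACE_RADIUS.get((weapon, state), default)
```

A table like this keeps the per-firearm, per-state tuning in data rather than in art animation resources, matching the configurability described above.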
In the embodiment of the present application, referring to fig. 11, fig. 11 is a schematic diagram of a rotation axis point of a virtual shooting prop provided in the embodiment of the present application. Here, for the same firearm animation, the rotation amplitude of the whole firearm body can be controlled when the rotation axis point position of the firearm steering is changed. As shown in a graph a in fig. 11, when the rotation axis point is close to the muzzle, the muzzle rotation amplitude is smaller, and the butt part rotation amplitude is larger; as shown in the diagram B in fig. 11, when the rotation axis point is positioned at the center point of the gun, the rotation amplitude of the gun muzzle is increased, and the rotation amplitude of the gun stock is reduced.
In the embodiment of the application, when the player turns, i.e., rotates the lens, the sway amplitude of the firearm animation increases with larger sliding amplitude and speed, so as to match the player input. When the player input stops, i.e., the steering stops, the inertial behavior of the firearm animation is simulated according to the rotation speed and amplitude. At this time, an animation inertia curve takes over, and the amplitude and speed of the sway decay gradually over time; see fig. 12, which is a schematic diagram of the virtual shooting prop swaying according to the inertial animation provided in the embodiment of the present application. The animation then returns to the breathing-animation state after a configurable time. This scheme requires no art resource support and supports independent adaptation of each type of virtual shooting prop.
In practical application, the single adaptation of the same virtual shooting prop in different states, such as a jumping state and a moving state, can be supported; meanwhile, different virtual shooting props are supported to be arranged in a distinguishing mode, such as a sniper gun and an assault rifle, and at the moment, the whole performance is controlled through the set animation deflection range to replace the support of art animation resources.
Next, a method for controlling the virtual prop provided in the embodiment of the present application will be described in detail. Referring to fig. 13, fig. 13 is a flow chart of a method for controlling a virtual prop provided in an embodiment of the present application, including:
step 201: entering a game scene;
step 202: acquiring user input;
step 203: converting the user input into a normalized screen xy coordinate value as a deflection coordinate of the current input;
step 204: if the yaw coordinate is equal to 0, executing step 205, otherwise executing step 210;
step 205: judging whether the virtual shooting prop is in a deflection state currently, if so, executing step 206, and if not, keeping the breathing animation;
step 206: executing the muzzle rebound logic of the virtual shooting prop (i.e., the above-described inertial animation);
step 207: resetting the rebound curve time and recording the xy coordinate value of the current deflection state;
step 208: reading the rebound curve (namely the inertia rebound curve), and calculating the xy coordinates of each position passed through in the rebound process according to the recorded deflection state and the rebound curve;
step 209: setting a deflection animation until the rebound curve time is over;
step 210: executing starting deflection muzzle logic;
step 211: setting a deflection animation and correcting rotation.
In practical applications, the specific implementation process of the step 211 may be as shown in the step-211 detail at the lower left corner of fig. 13, including:
step 301: taking out the corresponding action from the deflection swing animation;
step 302: mixing corresponding actions according to screen xy coordinates;
step 303: setting a virtual shooting prop deflection;
step 304: calculating the displacement of the muzzle of the current virtual shooting prop and the muzzle of the initial virtual shooting prop;
step 305: judging whether the displacement is greater than the set distance d; if yes, executing step 306, and if not, executing step 308;
step 306: taking the point Pt' at the distance d from the initial muzzle along the direction toward the current position;
step 307: rotating the muzzle to point at point Pt';
Here, Pt' can be seen in fig. 7.
step 308: rendering the skeletal data.
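The control flow of steps 201 through 210 can be condensed into a small dispatcher. The sketch below is illustrative Python, not part of the patent disclosure; the function names and the clamping of the normalized coordinates are assumptions:

```python
def normalize_input(dx, dy, screen_w, screen_h):
    """Step 203: convert a raw pointer delta into normalized screen xy
    coordinates in [-1, 1], used as the deflection coordinate."""
    x = max(-1.0, min(1.0, 2.0 * dx / screen_w))
    y = max(-1.0, min(1.0, 2.0 * dy / screen_h))
    return (x, y)

def dispatch(deflection, in_deflect_state):
    """Steps 204-210: zero input either keeps the breathing animation or
    triggers the muzzle-rebound logic (when the prop is still deflected);
    non-zero input starts the deflect-muzzle logic."""
    if deflection == (0.0, 0.0):
        return "muzzle_rebound" if in_deflect_state else "breathing"
    return "deflect_muzzle"
```

For example, a zero deflection coordinate while the prop is still deflected selects the muzzle-rebound branch of step 206.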
The following is a detailed description. First, in order to realize the swing of the virtual shooting prop when the virtual object looks left and right, single-frame animations of the limit swing in the up, down, left, right, upper-left, lower-left, upper-right and lower-right directions are provided, for defining the rotation and the offset of the virtual shooting prop in 8 direction intervals (the corresponding numerical intervals being [0,1], [1,1], [1,0], [1,-1], [0,-1], [-1,-1], [-1,0], [-1,1]) under the limit swing (alpha=1).
Secondly, when the player rotates the lens, the swing offset is interpolated toward a specified value in the specified two-dimensional direction to obtain an interpolated vector value alpha, where alpha is initially [0,0] and is limited to the range [-1,-1] to [1,1]; the single-frame swing animations given by the x axis and the y axis are superimposed on the virtual shooting prop according to the swing offset, with superposition amplitude alpha.
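The superposition of the axis single-frame swing animations by amplitude alpha can be sketched as a per-axis blend. The code below is an illustrative simplification in which each limit pose is a single scalar offset rather than a full set of bone rotations and offsets:

```python
def blend_swing(alpha, poses):
    """Blend the limit-swing poses by alpha = (ax, ay) in [-1, 1]^2.

    'poses' maps the four axis directions to limit offsets (alpha = 1
    reaches the limit swing); the sign of each component selects the
    left/right and up/down pose, and its magnitude sets the amplitude.
    """
    ax, ay = alpha
    x_pose = poses["right"] if ax >= 0 else poses["left"]
    y_pose = poses["up"] if ay >= 0 else poses["down"]
    return abs(ax) * x_pose + abs(ay) * y_pose
```

With alpha = (1, 1), for example, the blend reaches the upper-right limit swing.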
Thirdly, when the player stops rotating the lens, the alpha value starts an EaseInOut fade, so that the deflection of the virtual shooting prop recovers to [0,0] over time; starting the EaseInOut rebounds the muzzle through the inertia rebound curve, so as to display the rebound process of the muzzle from the deflection state to the stable state through the inertial motion animation. At this time, the virtual shooting prop also points back to the basic direction (i.e. the stable state), and the picture shows a rebound effect.
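The EaseInOut fade of alpha back to [0,0] might look as follows; the cubic smoothstep curve is an illustrative choice, as the text does not specify the exact easing function used:

```python
def ease_in_out(t):
    """Cubic smoothstep on t in [0, 1]: slow start, slow end."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def fade_alpha(alpha0, t):
    """Fade the deflection vector alpha from its value at the moment the
    lens stopped rotating back toward (0, 0) as t goes from 0 to 1."""
    k = 1.0 - ease_in_out(t)
    return (alpha0[0] * k, alpha0[1] * k)
```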
Fourth, referring to fig. 7, as shown in part A of fig. 7, a stable point P1 indicating the shooting port of the virtual shooting prop is selected, the holding point P0 at which the virtual object holds the virtual shooting prop is selected as the rotation axis point, and the length threshold of the target deflection range is d; as shown in part B of fig. 7, the actual deflection (i.e. the prop deflection animation) is from P1 to Pt, and the displacement length from P1 to Pt exceeds the length threshold d. At this time, a circle (i.e. the boundary line of the target deflection range) is drawn with P0 as the center and d as the radius; the intersection point Pt' between this circle and the route through which the shooting port of the virtual shooting prop passes is the target position at the length threshold from the initial position of the shooting port. The rotation angle θ corresponding to the rotation of the shooting port from the initial position P1 to the target position Pt', with P0 as the rotation axis point, is the target rotation angle.
Fifth, during system operation, the muzzle point Pt calculated by the animation system and the deflection system is obtained; when the distance between Pt and P1 at runtime is greater than d, Pt' = P1 + normalize(Pt - P1) × d is taken. The rotation θ is then calculated from the two vectors P0 to Pt' and P0 to P1. Finally, the rotation θ is superimposed on the holding point of the virtual shooting prop, so that the muzzle orientation of the virtual shooting prop is stabilized within the distance d from the original position.
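The clamping rule Pt' = P1 + normalize(Pt - P1) × d and the corrective rotation θ can be sketched in 2D as follows; the patent operates on 3D vectors, so the 2D reduction and the function names here are illustrative assumptions:

```python
import math

def clamp_muzzle(p0, p1, pt, d):
    """Return (Pt', theta): the clamped muzzle point at distance d from
    the stable point P1, and the rotation about the holding point P0
    that turns the P0->P1 direction onto the P0->Pt' direction."""
    vx, vy = pt[0] - p1[0], pt[1] - p1[1]
    dist = math.hypot(vx, vy)
    if dist <= d:
        return pt, 0.0  # within the target deflection range: no correction
    # Pt' = P1 + normalize(Pt - P1) * d
    ptp = (p1[0] + vx / dist * d, p1[1] + vy / dist * d)
    theta = (math.atan2(ptp[1] - p0[1], ptp[0] - p0[0])
             - math.atan2(p1[1] - p0[1], p1[0] - p0[0]))
    return ptp, theta
```

Superimposing theta on the holding point keeps the muzzle within distance d of its original position.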
Sixth, the player's hand finally needs to be fitted to the new rotation of the virtual shooting prop. Here, the original position offset and the original rotation of the hand relative to the firearm are recorded. Then the position and rotation of the hand are deduced reversely according to the position of the new virtual shooting prop, and the elbow position is deduced from the upper arm, the hand position and the elbow pole vector using an IK algorithm, so that the hand is again held on the virtual shooting prop in the correct manner.
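A standard two-bone IK solution of the kind referenced above can be sketched in 2D. The text only states that an IK algorithm deduces the elbow from the upper arm, the hand position and the elbow pole vector, so the law-of-cosines formulation, the 2D simplification and all names below are assumptions:

```python
import math

def solve_elbow(shoulder, hand, upper_len, fore_len, pole):
    """Two-bone IK in 2D: place the elbow given the shoulder and hand
    positions, the upper-arm and forearm lengths, and a pole direction
    choosing between the two mirror solutions."""
    dx, dy = hand[0] - shoulder[0], hand[1] - shoulder[1]
    raw = math.hypot(dx, dy)
    ux, uy = dx / raw, dy / raw                  # unit shoulder->hand
    dist = min(raw, upper_len + fore_len)        # clamp unreachable targets
    # law of cosines: projection of the upper arm onto shoulder->hand
    a = (upper_len ** 2 - fore_len ** 2 + dist ** 2) / (2.0 * dist)
    h = math.sqrt(max(0.0, upper_len ** 2 - a ** 2))
    px, py = -uy, ux                             # perpendicular direction
    if px * pole[0] + py * pole[1] < 0:          # flip toward the pole vector
        px, py = -px, -py
    return (shoulder[0] + ux * a + px * h, shoulder[1] + uy * a + py * h)
```

The pole vector plays the role of the elbow pole vector in the text: it selects which of the two mirror-image elbow positions is used.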
In summary, first, the above embodiments of the present application add animation coefficient support to the animation of virtual shooting props in different motion states, where the coefficient affects the animation amplitude in different states, such as the moving state, turning state, firing state and jumping state. Secondly, on the premise of not changing animation resources, one set of animation resources is guaranteed to adapt to the shooting ports and holding-posture expressions of different virtual shooting props. Thirdly, the problem of representing the inertial swing frequency and swing amplitude of different virtual shooting props is solved without changing the animation resources of the virtual shooting props and the character animation resources.
Continuing with the description of an exemplary structure of the virtual prop control device 555 provided in the embodiments of the present application implemented as software modules, in some embodiments, as shown in fig. 2, the software modules stored in the virtual prop control device 555 of the memory 550 may include:
a presenting module 5551, configured to present, in a picture of a virtual scene, a virtual shooting prop held by a virtual object, and a sight pattern corresponding to the virtual shooting prop;
a control module 5552 for controlling the state of the virtual shooting prop to synchronously change when the motion state of the virtual object in the virtual scene changes, and
and in the process of changing the state of the virtual shooting prop, controlling the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
In some embodiments, the control module 5552 is further configured to receive a steering instruction for the virtual object when the motion state of the virtual object in the virtual scene is a non-steering state;
and responding to the steering instruction, controlling the virtual object to execute steering operation along the direction indicated by the steering instruction, so that the motion state of the virtual object in the virtual scene is switched from the non-steering state to the steering state.
In some embodiments, the control module 5552 is further configured to obtain a deflection coordinate corresponding to the direction indicated by the steering instruction, and obtain a basic deflection animation corresponding to the deflection coordinate;
determining a prop deflection animation of the virtual shooting prop after the steering operation is completed based on the deflection coordinate and the basic deflection animation;
and controlling the state of the virtual shooting prop to synchronously change based on the prop deflection animation.
In some embodiments, the control module 5552 is further configured to obtain an initial animation of the virtual shooting prop before the virtual object performs the steering operation;
determining the displacement length of a shooting port of the virtual shooting prop in the process of executing the steering operation by the virtual object based on the prop deflection animation and the initial animation;
when the displacement length does not exceed a length threshold, controlling the state of the virtual shooting prop to change according to the state indicated by the prop deflection animation;
and when the displacement length exceeds a length threshold value, acquiring a target rotation angle in a target deflection range, and controlling the virtual shooting prop to rotate according to the target rotation angle so as to enable the state of the virtual shooting prop to change synchronously with the motion state of the virtual object.
In some embodiments, the control module 5552 is further configured to present an inertial movement process of the virtual shooting prop due to inertia, which is adapted to the steering state or jump state;
and in the inertial movement process, controlling the virtual shooting prop to rebound from a changed state to a stable state, wherein the stable state enables the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
In some embodiments, the control module 5552 is further configured to obtain an actual animation of the virtual shooting prop after the state change, and a standard animation of the virtual shooting prop when in the stable state;
when the virtual shooting prop is determined to be in the deflection state based on the actual animation and the standard animation, acquiring a corresponding inertial animation;
and combining the actual animation, the standard animation and the inertia animation to present an inertial movement process of the virtual shooting prop, which is caused by inertia and is matched with the steering state or the jumping state.
In some embodiments, the control module 5552 is further configured to obtain an inertial resilience curve corresponding to the inertial animation;
determining position coordinates corresponding to a target time point in the process of rebounding the virtual shooting prop from the actual animation to the standard animation based on the actual animation and the inertia rebounding curve;
and displaying the inertial movement process of the virtual shooting prop from the state corresponding to the actual animation to the stable state corresponding to the standard animation based on the position coordinates corresponding to the target time point.
In some embodiments, the control module 5552 is further configured to determine a three-dimensional movement space corresponding to a shooting port of the virtual shooting prop;
and in the process of changing the state of the virtual shooting prop, controlling the deviation of the shooting port relative to the current state to be in the three-dimensional moving space so as to control the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
In some embodiments, the control module 5552 is further configured to obtain a position coordinate of a shooting port of the virtual shooting prop in the virtual scene and an offset radius corresponding to the virtual shooting prop;
and determining the corresponding ball space as a three-dimensional moving space corresponding to the shooting opening by taking the point indicated by the position coordinates as a ball center and the offset radius as a ball radius.
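The ball-space constraint described above can be sketched as a point clamp; the helper below is illustrative Python under the assumption that positions are 3D tuples:

```python
import math

def clamp_to_sphere(center, point, radius):
    """Constrain 'point' to the ball whose center is the shooting port's
    position coordinate and whose radius is the prop's offset radius;
    points inside the ball are returned unchanged."""
    v = [p - c for p, c in zip(point, center)]
    dist = math.sqrt(sum(x * x for x in v))
    if dist <= radius:
        return tuple(point)
    scale = radius / dist
    return tuple(c + x * scale for c, x in zip(center, v))
```

Keeping the shooting port's deviation inside this ball is what keeps the aimed object consistent with the object at the sight pattern's position.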
In some embodiments, during the state change of the virtual shooting prop, the pose of the target portion of the virtual object holding the virtual shooting prop is changed from the first pose to the second pose, and the control module 5552 is further configured to obtain a first position of a holding point of the virtual object holding the virtual shooting prop before the state change of the virtual shooting prop, and obtain a second position of the holding point corresponding to the virtual shooting prop after the state change of the virtual shooting prop;
determining a bone rotation angle corresponding to the target site based on the first position and the second position;
based on the bone rotation angle, a bone position and a bone direction of the target portion are adjusted to control a posture of the target portion of the virtual object to be changed from the first posture to the second posture.
In some embodiments, the control module 5552 is further configured to determine a skeletal parent node and a skeletal child node of the target site;
adjusting the positions of the bone sub-nodes based on the bone rotation angle, and
and driving and adjusting the position of the bone father node by adjusting the position of the bone child node so as to adjust the bone position and the bone direction of the target part.
In some embodiments, the presenting module 5551 is further configured to present, in a screen of a virtual scene, a prop control including at least one candidate virtual prop, where the at least one candidate virtual prop includes the virtual shooting prop;
controlling the virtual object to assemble the virtual shooting prop in response to the selection operation of the prop control for the virtual shooting prop, and
and presenting the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop.
By applying the embodiment of the application, the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop are presented in the picture of the virtual scene; when the motion state of the virtual object in the virtual scene changes, the state of the virtual shooting prop is controlled to synchronously change, and in the process of changing the state of the virtual shooting prop, the object aimed by the virtual shooting prop is controlled to be consistent with the object corresponding to the position of the sight pattern. Therefore, when the motion state of the virtual object changes, as the object aimed by the virtual shooting prop is consistent with the object corresponding to the position of the sight pattern, the consistency of the aiming positions of the virtual shooting prop and the sight pattern on the picture display is ensured, the degree of reality of the virtual scene is improved, the operation times of aiming action can be reduced, the man-machine interaction efficiency is improved, and the occupation of hardware processing resources is reduced.
The embodiment of the application also provides electronic equipment, which comprises:
a memory for storing executable instructions;
and the processor is used for realizing the control method of the virtual prop provided by the embodiment of the application when executing the executable instructions stored in the memory.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the computer device executes the control method of the virtual prop provided by the embodiment of the application.
The embodiment of the application also provides a computer readable storage medium which stores executable instructions, and when the executable instructions are executed by a processor, the control method of the virtual prop provided by the embodiment of the application is realized.
In some embodiments, the computer readable storage medium may be an FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM, or may be any of various devices including one of the above memories or any combination thereof.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or, alternatively, distributed across multiple sites and interconnected by a communication network.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, etc. that are within the spirit and scope of the present application are intended to be included within the scope of the present application.

Claims (11)

1. A method for controlling a virtual prop, the method comprising:
in a picture of a virtual scene, presenting a virtual shooting prop held by a virtual object and a sight pattern corresponding to the virtual shooting prop;
when the motion state of the virtual object in the virtual scene changes, controlling the state of the virtual shooting prop to synchronously change, and determining a three-dimensional moving space corresponding to a shooting port of the virtual shooting prop;
in the process of changing the state of the virtual shooting prop, controlling the deviation of the shooting port relative to the current state to be in the three-dimensional moving space so as to control the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern;
when the motion state of the virtual object after the change is a steering state or a jumping state, acquiring an actual animation of the virtual shooting prop after the state change and a standard animation of the virtual shooting prop in a stable state;
when the virtual shooting prop is determined to be in a deflection state based on the actual animation and the standard animation, determining a position coordinate corresponding to a target time point in the process of rebounding the virtual shooting prop from the actual animation to the standard animation based on the actual animation and an inertial rebounding curve corresponding to the deflection state;
and based on the position coordinates corresponding to the target time points, presenting the inertial movement process of the virtual shooting prop from the state after the state change to rebound to the stable state, wherein the stable state enables the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
2. The method of claim 1, wherein prior to said controlling the state of the virtual shooting prop to synchronously change, the method further comprises:
when the motion state of the virtual object in the virtual scene is a non-steering state, receiving a steering instruction aiming at the virtual object;
and responding to the steering instruction, controlling the virtual object to execute steering operation along the direction indicated by the steering instruction, so that the motion state of the virtual object in the virtual scene is switched from the non-steering state to the steering state.
3. The method of claim 2, wherein controlling the state synchronization of the virtual shooting prop to change when the motion state of the virtual object in the virtual scene changes comprises:
obtaining a deflection coordinate corresponding to the direction indicated by the steering instruction, and obtaining a basic deflection animation corresponding to the deflection coordinate;
determining a prop deflection animation of the virtual shooting prop after the steering operation is completed based on the deflection coordinate and the basic deflection animation;
and controlling the state of the virtual shooting prop to synchronously change based on the prop deflection animation.
4. The method of claim 3, wherein controlling the state synchronization of the virtual shooting prop to change based on the prop yaw animation comprises:
acquiring an initial animation of the virtual shooting prop before the virtual object executes the steering operation;
determining the displacement length of a shooting port of the virtual shooting prop in the process of executing the steering operation by the virtual object based on the prop deflection animation and the initial animation;
when the displacement length does not exceed a length threshold, controlling the state of the virtual shooting prop to change according to the state indicated by the prop deflection animation;
and when the displacement length exceeds a length threshold value, acquiring a target rotation angle in a target deflection range, and controlling the virtual shooting prop to rotate according to the target rotation angle so as to enable the state of the virtual shooting prop to change synchronously with the motion state of the virtual object.
5. The method of claim 1, wherein the determining a three-dimensional movement space corresponding to a firing port of the virtual firing prop comprises:
acquiring position coordinates of a shooting port of the virtual shooting prop in the virtual scene and an offset radius corresponding to the virtual shooting prop;
and determining the corresponding ball space as a three-dimensional moving space corresponding to the shooting opening by taking the point indicated by the position coordinates as a ball center and the offset radius as a ball radius.
6. The method of claim 1, wherein, during the change in state of the virtual shooting prop, the pose of a target site of the virtual object holding the virtual shooting prop is transformed from a first pose to a second pose, the method further comprising:
acquiring a first position of a holding point of the virtual shooting prop held by the virtual object before the state change of the virtual shooting prop, and acquiring a second position of the holding point corresponding to the virtual shooting prop after the state change of the virtual shooting prop;
determining a bone rotation angle corresponding to the target site based on the first position and the second position;
based on the bone rotation angle, a bone position and a bone direction of the target portion are adjusted to control a posture of the target portion of the virtual object to be changed from the first posture to the second posture.
7. The method of claim 6, wherein adjusting the bone position and bone orientation of the target site based on the bone rotation angle comprises:
determining a bone parent node and a bone child node of the target site;
adjusting the positions of the bone sub-nodes based on the bone rotation angle, and
and driving and adjusting the position of the bone father node by adjusting the position of the bone child node so as to adjust the bone position and the bone direction of the target part.
8. The method of claim 1, wherein presenting, in the frame of the virtual scene, a virtual shooting prop held by the virtual object and a sight pattern corresponding to the virtual shooting prop, comprises:
in a picture of a virtual scene, presenting a prop control comprising at least one candidate virtual prop, the at least one candidate virtual prop comprising the virtual shooting prop;
controlling the virtual object to assemble the virtual shooting prop in response to the selection operation of the prop control for the virtual shooting prop, and
and presenting the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop.
9. A control device for a virtual prop, the device comprising:
the display module is used for displaying the virtual shooting prop held by the virtual object and the sight pattern corresponding to the virtual shooting prop in the picture of the virtual scene;
the control module is used for controlling the state of the virtual shooting prop to synchronously change when the motion state of the virtual object in the virtual scene changes, and determining a three-dimensional moving space corresponding to a shooting port of the virtual shooting prop; in the process of changing the state of the virtual shooting prop, controlling the deviation of the shooting port relative to the current state to be in the three-dimensional moving space so as to control the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern;
the control module is further used for acquiring an actual animation of the virtual shooting prop after the state change and a standard animation of the virtual shooting prop in a stable state when the motion state of the virtual object after the change is a steering state or a jumping state;
when the virtual shooting prop is determined to be in a deflection state based on the actual animation and the standard animation, determining a position coordinate corresponding to a target time point in the process of rebounding the virtual shooting prop from the actual animation to the standard animation based on the actual animation and an inertial rebounding curve corresponding to the deflection state;
and based on the position coordinates corresponding to the target time points, presenting the inertial movement process of the virtual shooting prop from the state after the state change to rebound to the stable state, wherein the stable state enables the object aimed by the virtual shooting prop to be consistent with the object corresponding to the position of the sight pattern.
10. An electronic device, the electronic device comprising:
a memory for storing executable instructions;
a processor for implementing the method of controlling a virtual prop of any one of claims 1 to 8 when executing executable instructions stored in the memory.
11. A computer readable storage medium storing executable instructions which, when executed by a processor, implement the method of controlling a virtual prop of any one of claims 1 to 8.
CN202111626306.4A 2021-10-22 2021-12-28 Virtual prop control method, device, equipment, storage medium and program product Active CN114130006B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111235891 2021-10-22
CN2021112358915 2021-10-22

Publications (2)

Publication Number Publication Date
CN114130006A CN114130006A (en) 2022-03-04
CN114130006B true CN114130006B (en) 2023-07-25

Family

ID=80383582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111626306.4A Active CN114130006B (en) 2021-10-22 2021-12-28 Virtual prop control method, device, equipment, storage medium and program product

Country Status (1)

Country Link
CN (1) CN114130006B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117563230B (en) * 2024-01-17 2024-04-05 腾讯科技(深圳)有限公司 Data processing method and related equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140023641A (en) * 2012-08-16 2014-02-27 (주)네오위즈게임즈 Online shooting gamemethod andonline shooting game server performing the same
CN111035928A (en) * 2019-11-19 2020-04-21 腾讯科技(深圳)有限公司 Method and device for displaying travel route, storage medium and electronic device
CN111097166A (en) * 2019-12-12 2020-05-05 腾讯科技(深圳)有限公司 Control method and device of virtual operation object, storage medium and electronic device
CN111408132A (en) * 2020-02-17 2020-07-14 网易(杭州)网络有限公司 Game picture display method, device, equipment and storage medium
CN112076473A (en) * 2020-09-11 2020-12-15 腾讯科技(深圳)有限公司 Control method and device of virtual prop, electronic equipment and storage medium
CN113499590A (en) * 2021-08-04 2021-10-15 网易(杭州)网络有限公司 Method and device for controlling shooting in game, electronic equipment and readable medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant