CN110992453B - Scene object display method and device, electronic equipment and storage medium - Google Patents

Scene object display method and device, electronic equipment and storage medium

Info

Publication number
CN110992453B
CN110992453B (application number CN201911304420.8A)
Authority
CN
China
Prior art keywords
crushing, animation, hit, information, matched
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911304420.8A
Other languages
Chinese (zh)
Other versions
CN110992453A (en)
Inventor
汪皓浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd filed Critical Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN201911304420.8A priority Critical patent/CN110992453B/en
Publication of CN110992453A publication Critical patent/CN110992453A/en
Application granted granted Critical
Publication of CN110992453B publication Critical patent/CN110992453B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 - Animation
    • G06T 13/20 - 3D [Three Dimensional] animation
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 - Controlling the output signals based on the game progress
    • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 - Controlling game characters or game objects based on the game progress
    • A63F 13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 - Methods for processing data by generating or executing the game program
    • A63F 2300/66 - Methods for processing data by generating or executing the game program for rendering three dimensional images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 - Indexing scheme for image generation or computer graphics
    • G06T 2210/61 - Scene description

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses a scene object display method and device, electronic equipment, and a storage medium. The method comprises the following steps: after a hit operation performed by a user on a target scene object is detected, acquiring operation information matched with the hit operation; acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object; and generating and displaying a crushing process animation matched with the target scene object according to the object crushing information. According to the embodiment of the invention, a realistic crushing process of the target scene object can be simulated after the object is hit, so that the user sees the crushing process of the target scene object, which improves the realism of interaction between the user and the game scene and improves the user experience.

Description

Scene object display method and device, electronic equipment and storage medium
Technical Field
The embodiments of the invention relate to the field of computer technology, and in particular to a scene object display method and device, electronic equipment, and a storage medium.
Background
With the development of network technology, demands for realistic interaction with game scenes keep increasing. In a combat game, a user may attack an animated character whose character type is enemy, and while doing so a scene object may be hit. To improve the user experience, existing games directly display a preset static object-breaking animation after a scene object is hit, simulating the object being struck.
In the prior art, only a static object-breaking animation can be displayed when a scene object is hit; the real breaking process of the scene object cannot be simulated, so the user does not feel genuinely involved in interacting with the game scene, and the user experience is poor.
Disclosure of Invention
The embodiments of the invention provide a scene object display method and device, electronic equipment, and a storage medium, which optimize the existing scene object display scheme: a realistic crushing process of a scene object is simulated after the scene object is hit, which improves the realism of interaction between the user and the game scene and improves the user experience.
In a first aspect, an embodiment of the present invention provides a scene object display method, comprising:
after detecting a hit operation performed by a user on a target scene object, acquiring operation information matched with the hit operation;
acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object;
and generating and displaying a crushing process animation matched with the target scene object according to the object crushing information.
In a second aspect, an embodiment of the present invention further provides a scene object display apparatus, comprising:
an operation detection module, used for acquiring operation information matched with a hit operation after detecting the hit operation performed by a user on a target scene object;
an information acquisition module, used for acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object;
and an animation display module, used for generating and displaying a crushing process animation matched with the target scene object according to the object crushing information.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement a method for displaying scene objects according to an embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the scene object display method according to the embodiments of the present invention.
According to the technical solution of the embodiments of the invention, after a hit operation performed by the user on a target scene object is detected, operation information matched with the hit operation is acquired; object crushing information matched with the hit operation is then acquired, according to the operation information, from the object crushing information preset for the target scene object; and a crushing process animation matched with the target scene object is generated and displayed according to the object crushing information. Because the object crushing information is selected according to the user's hit operation and the crushing process animation is generated from it, a realistic crushing process of the target scene object is simulated after the object is hit, the user sees that crushing process, the realism of interaction between the user and the game scene is improved, and the user experience is improved.
Drawings
FIG. 1 is a flowchart of a method for displaying scene objects according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for displaying scene objects according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a display device for a scene object according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof.
It should be further noted that, for convenience of description, only some, but not all of the matters related to the present invention are shown in the accompanying drawings. Before discussing exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Example 1
Fig. 1 is a flowchart of a scene object display method according to the first embodiment of the invention. This embodiment is applicable to controlling the display effect of a scene object. The method can be executed by the scene object display apparatus provided by the embodiments of the invention; the apparatus can be implemented in software and/or hardware and is generally integrated in an electronic device. As shown in Fig. 1, the method in this embodiment specifically includes:
step 101, after detecting the hit operation of the user on the target scene object, acquiring operation information matched with the hit operation.
Optionally, a scene object may be hit when the user performs an attack operation on an animated character whose character type is enemy. The target scene object is the scene object hit by the user. After the hit operation performed by the user on the target scene object is detected, operation information matched with the hit operation is acquired.
In one specific example, a "monster" is an animated character in which the character type in the game is an enemy. In the current game scenario, the scenario items include: boxes, tables, and chairs. When the user performs an attack operation on the "monster" character type, the box is hit. The box hit by the user is the target scene object. And after detecting the hit operation of the user on the target scene object, acquiring operation information matched with the hit operation.
Optionally, the operation information includes: hit location and hit direction.
The hit location is the location at which the target scene object was hit by the user, and the hit direction is the direction of the hit operation.
In one specific example, the user strikes the box from the front with a long sword, one of the virtual weapons in the game. The hit location is the location at which the box was struck by the sword, and the hit direction is the direction of the strike operation: forward.
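For illustration only, the operation information could be represented as a small record like the one below; the type and field names are assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]  # (x, y, z) in world coordinates

@dataclass(frozen=True)
class OperationInfo:
    """Hypothetical record of one hit operation on the target scene object."""
    hit_location: Vec3   # where the object was struck
    hit_direction: Vec3  # unit vector pointing along the strike

# Example: a forward strike against the front face of a box
forward_hit = OperationInfo(hit_location=(0.0, 0.5, -0.5),
                            hit_direction=(0.0, 0.0, 1.0))
```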
Step 102, acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object.
Optionally, the object crushing information includes: a dynamic crushing animation and a static crushing animation.
The dynamic crushing animation is the dynamic animation of the crushing process after the scene object is hit. The static crushing animation is a static animation in which the parts of the scene object lie scattered at fixed positions after the crushing process of the scene object has finished.
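A minimal sketch of how the two kinds of crushing information might be stored, assuming the dynamic crushing animation is a sequence of per-frame fragment poses and the static crushing animation is the set of final resting poses; all names below are assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]
Quat = Tuple[float, float, float, float]  # (x, y, z, w) rotation

@dataclass
class FragmentPose:
    position: Vec3
    rotation: Quat

@dataclass
class DynamicCrushAnimation:
    """Per-frame poses of every fragment while the object is breaking apart."""
    frame_rate: float
    frames: List[Dict[str, FragmentPose]]  # one dict (fragment name -> pose) per frame

@dataclass
class StaticCrushAnimation:
    """Final resting poses of the fragments once the crushing process has ended."""
    resting_poses: Dict[str, FragmentPose]

@dataclass
class ObjectCrushInfo:
    dynamic: DynamicCrushAnimation
    static: StaticCrushAnimation
```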
Optionally, before the hit operation performed by the user on the target scene object is detected, the method further includes: generating at least one piece of preset operation information according to the three-dimensional model of the target scene object; and establishing, through the rigid-body part of a physics engine, a dynamic crushing animation and a static crushing animation of the target scene object matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information.
Optionally, the operation information includes: hit location and hit direction.
Optionally, generating at least one piece of preset operation information according to the three-dimensional model of the target scene object includes: setting at least one preset hit location for the target scene object according to its three-dimensional model; setting at least one preset hit direction for the target scene object according to its three-dimensional model; and combining each preset hit location with each preset hit direction to obtain at least one piece of preset operation information, where each piece of preset operation information includes one preset hit location and one preset hit direction.
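The combination step can be read as a Cartesian product of the preset hit locations and preset hit directions. A minimal sketch of that reading:

```python
from itertools import product
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def generate_preset_operations(preset_locations: List[Vec3],
                               preset_directions: List[Vec3]) -> List[Tuple[Vec3, Vec3]]:
    """Combine every preset hit location with every preset hit direction.

    Each returned pair (location, direction) is one piece of preset operation
    information for which crushing animations will be built in advance.
    """
    return [(loc, direction) for loc, direction in product(preset_locations, preset_directions)]

# Example: two locations on a box model x three hit directions = six presets
locations = [(0.0, 0.5, -0.5), (0.5, 0.5, 0.0)]
directions = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0), (-1.0, 0.0, 0.0)]
presets = generate_preset_operations(locations, directions)
assert len(presets) == 6
```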
Optionally, establishing, through the rigid-body part of the physics engine, the dynamic crushing animation and the static crushing animation of the target scene object matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information includes: simulating, through the rigid-body part of the physics engine, the physical crushing process of the target scene object in the hit scenario matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information; and establishing the dynamic crushing animation and static crushing animation of the target scene object matched with each piece of preset operation information according to the data recorded during the simulated physical crushing process.
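The patent does not name a particular physics engine, so the sketch below assumes two stand-in callables, `fracture_model` and `simulate_step`, for the rigid-body part of whatever engine is used. It shows the offline baking idea: simulate the crushing process once per piece of preset operation information, record the per-step fragment poses as the dynamic crushing animation, and keep the final poses as the static crushing animation.

```python
from typing import Callable, Dict, List, Tuple

Vec3 = Tuple[float, float, float]
Pose = Tuple[Vec3, Vec3]  # (position, euler rotation) of one fragment

def bake_crush_animations(
    fracture_model: Callable[[Vec3, Vec3], Dict[str, Pose]],
    simulate_step: Callable[[Dict[str, Pose], float], Dict[str, Pose]],
    presets: List[Tuple[Vec3, Vec3]],
    steps: int = 120,
    dt: float = 1.0 / 60.0,
) -> Dict[Tuple[Vec3, Vec3], Tuple[List[Dict[str, Pose]], Dict[str, Pose]]]:
    """For each preset (hit location, hit direction), run the rigid-body
    simulation once and store (dynamic frames, static final poses).

    `fracture_model` and `simulate_step` are assumptions standing in for the
    rigid-body part of a physics engine, not a real engine API.
    """
    baked = {}
    for hit_location, hit_direction in presets:
        fragments = fracture_model(hit_location, hit_direction)  # initial fragment poses after the impulse
        frames: List[Dict[str, Pose]] = []
        for _ in range(steps):
            fragments = simulate_step(fragments, dt)  # advance the rigid-body simulation by one step
            frames.append(dict(fragments))            # record this frame of the dynamic crushing animation
        baked[(hit_location, hit_direction)] = (frames, dict(fragments))  # last frame = static crushing animation
    return baked
```

Baking the animations offline in this way keeps the run-time cost to a lookup plus animation playback, which is consistent with the patent's choice to preset the crushing information rather than simulate it at hit time.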
Optionally, acquiring, according to the operation information, the object crushing information matched with the hit operation from the object crushing information preset for the target scene object may include: acquiring, from the preset operation information for the target scene object, the target preset operation information matched with the operation information; and acquiring the object crushing information of the target scene object matched with the target preset operation information as the dynamic crushing animation and static crushing animation matched with the hit operation.
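The patent leaves the matching rule open; one plausible reading, shown here purely as an assumption, is a nearest-preset lookup over hit location and hit direction.

```python
import math
from typing import List, Tuple

Vec3 = Tuple[float, float, float]
Preset = Tuple[Vec3, Vec3]  # (preset hit location, preset hit direction)

def match_preset(hit_location: Vec3, hit_direction: Vec3,
                 presets: List[Preset]) -> Preset:
    """Pick the preset operation information closest to the actual hit.

    Weighting position and direction equally is an illustrative choice,
    not something specified by the patent.
    """
    return min(
        presets,
        key=lambda p: math.dist(hit_location, p[0]) + math.dist(hit_direction, p[1]),
    )
```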
Step 103, generating and displaying the crushing process animation matched with the target scene object according to the object crushing information.
Optionally, generating and displaying, according to the object crushing information, the crushing process animation matched with the target scene object includes: displaying, in sequence and according to preset animation display logic, the dynamic crushing animation and the static crushing animation matched with the hit operation.
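The preset animation display logic amounts to playing the dynamic crushing animation frame by frame and then holding the static crushing animation. A minimal sketch, with `render_fragments` standing in for whatever rendering call the game provides:

```python
import time
from typing import Callable, Dict, List, Tuple

Vec3 = Tuple[float, float, float]
Pose = Tuple[Vec3, Vec3]

def play_crush_animation(
    dynamic_frames: List[Dict[str, Pose]],
    static_poses: Dict[str, Pose],
    render_fragments: Callable[[Dict[str, Pose]], None],
    frame_rate: float = 60.0,
) -> None:
    """Display the dynamic crushing animation, then leave the static one on screen."""
    frame_time = 1.0 / frame_rate
    for frame in dynamic_frames:        # dynamic crushing animation, frame by frame
        render_fragments(frame)
        time.sleep(frame_time)          # a real game loop would schedule this instead of sleeping
    render_fragments(static_poses)      # fragments stay scattered at their final positions
```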
Therefore, after the user hits the target scene object, the dynamic crushing animation and the static crushing animation matched with the hit operation are displayed in sequence according to that hit operation, that is, the complete crushing process animation is displayed. The target scene object begins to break apart dynamically once it is hit, and when the dynamic crushing process ends its parts lie scattered at fixed positions. In this way a realistic crushing process of the target scene object is simulated after it is hit, the user sees that crushing process, the realism of interaction between the user and the game scene is improved, and the user experience is improved.
Optionally, the method for displaying scene objects according to the embodiment of the present invention is performed by a preset scene object display plug-in.
The scene object display plug-in is used to display the complete crushing process animation, according to the user's hit operation on the target scene object, after the user hits a scene object.
A scene object display plug-in is preset. After detecting the hit operation of a user on a target scene object, the scene object display plug-in acquires operation information matched with the hit operation; the scene object display plug-in acquires object crushing information matched with the hit operation from object crushing information preset for the target scene object according to the operation information; and the scene object display plug-in generates and displays a crushing process animation matched with the target scene object according to the object crushing information.
Optionally, after detecting a hit operation performed by the user on the target scene object, the scene object display plug-in acquires operation information matched with the hit operation, the operation information including a hit location and a hit direction; the plug-in acquires, from the preset operation information for the target scene object, the target preset operation information matched with the operation information; the plug-in acquires the dynamic crushing animation and static crushing animation of the target scene object matched with the target preset operation information as the dynamic crushing animation and static crushing animation matched with the hit operation; and the plug-in displays, in sequence and according to preset animation display logic, the dynamic crushing animation and the static crushing animation matched with the hit operation.
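Putting the plug-in's steps together, a hedged end-to-end sketch; every class, parameter, and callback name here is an assumption rather than part of the patent:

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]

class SceneObjectDisplayPlugin:
    """Hypothetical scene object display plug-in: reacts to a hit on the target
    scene object by looking up prebuilt crushing animations and playing them."""

    def __init__(self, baked_animations, match_preset, play_animation):
        self._baked = baked_animations   # (preset location, preset direction) -> (dynamic frames, static poses)
        self._match = match_preset       # e.g. the nearest-preset lookup sketched earlier
        self._play = play_animation      # e.g. play the dynamic frames, then hold the static poses

    def on_hit(self, hit_location: Vec3, hit_direction: Vec3) -> None:
        # Step 1: operation information matched with the hit operation.
        operation = (hit_location, hit_direction)
        # Step 2: object crushing information matched with the hit operation,
        # taken from the information preset for the target scene object.
        preset = self._match(*operation, list(self._baked.keys()))
        dynamic_frames, static_poses = self._baked[preset]
        # Step 3: generate and display the crushing process animation.
        self._play(dynamic_frames, static_poses)
```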
Therefore, after the user hits the target scene object, the scene object display plug-in displays, in sequence and according to that hit operation, the dynamic crushing animation and the static crushing animation matched with the hit operation, that is, the complete crushing process animation: the target scene object begins to break apart dynamically once it is hit, and when the dynamic crushing process ends its parts lie scattered at fixed positions. In this way a realistic crushing process of the target scene object is simulated after it is hit, and the user sees that crushing process.
The embodiment of the invention provides a scene object display method: after a hit operation performed by a user on a target scene object is detected, operation information matched with the hit operation is acquired; object crushing information matched with the hit operation is then acquired, according to the operation information, from the object crushing information preset for the target scene object; and a crushing process animation matched with the target scene object is generated and displayed according to the object crushing information. Because the object crushing information is selected according to the user's hit operation and the crushing process animation is generated from it, the user sees the crushing process of the target scene object after it is hit, which improves the realism of interaction between the user and the game scene and improves the user experience.
Example two
Fig. 2 is a flowchart of a scene object display method according to a second embodiment of the present invention. This embodiment may be combined with the alternatives of one or more of the embodiments described above, where the object crushing information includes a dynamic crushing animation and a static crushing animation.
Before the hit operation performed by the user on the target scene object is detected, the method further includes: generating at least one piece of preset operation information according to the three-dimensional model of the target scene object; and establishing, through the rigid-body part of a physics engine, the dynamic crushing animation and static crushing animation of the target scene object matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information.
Acquiring, according to the operation information, the object crushing information matched with the hit operation from the object crushing information preset for the target scene object may include: acquiring, from the preset operation information for the target scene object, the target preset operation information matched with the operation information; and acquiring the dynamic crushing animation and static crushing animation of the target scene object matched with the target preset operation information as the dynamic crushing animation and static crushing animation matched with the hit operation.
Generating and displaying the crushing process animation matched with the target scene object according to the object crushing information may include: displaying, in sequence and according to preset animation display logic, the dynamic crushing animation and the static crushing animation matched with the hit operation.
As shown in fig. 2, the method in this embodiment specifically includes:
step 201, generating at least one piece of preset operation information according to a three-dimensional model of the object of the target scene.
Optionally, the operation information includes: hit location and hit direction.
Optionally, generating at least one piece of preset operation information according to the three-dimensional model of the target scene object includes: setting at least one preset hit location for the target scene object according to its three-dimensional model; setting at least one preset hit direction for the target scene object according to its three-dimensional model; and combining each preset hit location with each preset hit direction to obtain at least one piece of preset operation information, where each piece of preset operation information includes one preset hit location and one preset hit direction.
Step 202, establishing, through the rigid-body part of the physics engine, the dynamic crushing animation and static crushing animation of the target scene object matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information.
Optionally, the object crushing information includes: a dynamic crushing animation and a static crushing animation.
The dynamic crushing animation is the dynamic animation of the crushing process after the scene object is hit. The static crushing animation is a static animation in which the parts of the scene object lie scattered at fixed positions after the crushing process of the scene object has finished.
Optionally, establishing, through the rigid-body part of the physics engine, the dynamic crushing animation and the static crushing animation of the target scene object matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information includes: simulating, through the rigid-body part of the physics engine, the physical crushing process of the target scene object in the hit scenario matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information; and establishing the dynamic crushing animation and static crushing animation of the target scene object matched with each piece of preset operation information according to the data recorded during the simulated physical crushing process.
Step 203, after detecting the hit operation performed by the user on the target scene object, acquiring operation information matched with the hit operation.
Optionally, the operation information includes: hit location and hit direction.
The hit location is the location at which the target scene object was hit by the user, and the hit direction is the direction of the hit operation.
In one specific example, the user strikes the box from the front with a long sword, one of the virtual weapons in the game. The hit location is the location at which the box was struck by the sword, and the hit direction is the direction of the strike operation: forward.
Step 204, acquiring, from the preset operation information for the target scene object, the target preset operation information matched with the operation information.
Optionally, among the preset operation information for the target scene object, each piece of preset operation information is checked in turn against the operation information, and the target preset operation information matched with the operation information is obtained.
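One way to read this sequential judgment, shown purely as an assumption, is to walk the presets in order and accept the first one whose preset hit location and preset hit direction fall within a tolerance of the actual operation information:

```python
import math
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]
Preset = Tuple[Vec3, Vec3]  # (preset hit location, preset hit direction)

def find_matching_preset(hit_location: Vec3, hit_direction: Vec3,
                         presets: List[Preset],
                         location_tolerance: float = 0.25,
                         direction_tolerance: float = 0.5) -> Optional[Preset]:
    """Judge each piece of preset operation information in turn and return the
    first one that matches the actual hit; the tolerance values are illustrative."""
    for preset_location, preset_direction in presets:
        if (math.dist(hit_location, preset_location) <= location_tolerance
                and math.dist(hit_direction, preset_direction) <= direction_tolerance):
            return preset_location, preset_direction
    return None  # no preset matched; the caller decides how to handle this case
```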
Step 205, acquiring the dynamic crushing animation and static crushing animation of the target scene object matched with the target preset operation information as the dynamic crushing animation and static crushing animation matched with the hit operation.
Step 206, displaying, in sequence and according to the preset animation display logic, the dynamic crushing animation and the static crushing animation matched with the hit operation.
Therefore, after the user hits the target scene object, the dynamic crushing animation and the static crushing animation matched with the hit operation are displayed in sequence according to that hit operation, that is, the complete crushing process animation is displayed. The target scene object begins to break apart dynamically once it is hit, and when the dynamic crushing process ends its parts lie scattered at fixed positions. In this way a realistic crushing process of the target scene object is simulated after it is hit, the user sees that crushing process, the realism of interaction between the user and the game scene is improved, and the user experience is improved.
The embodiment of the invention provides a scene object display method in which the dynamic crushing animation and static crushing animation of the target scene object matched with each piece of preset operation information are established in advance, through the rigid-body part of a physics engine, according to the three-dimensional model of the target scene object and that piece of preset operation information. At run time, the target preset operation information matched with the actual operation information is acquired from the preset operation information for the target scene object; the dynamic crushing animation and static crushing animation of the target scene object matched with the target preset operation information are acquired as the dynamic crushing animation and static crushing animation matched with the hit operation; and these are displayed in sequence according to preset animation display logic. Because the animations are prebuilt for each piece of preset operation information and selected according to the user's hit operation, the user sees a realistic crushing process of the target scene object after it is hit, which improves the realism of interaction between the user and the game scene and improves the user experience.
Example III
Fig. 3 is a schematic structural diagram of a display device for a scene object according to a third embodiment of the present invention. As shown in fig. 3, the apparatus may be configured in an electronic device, including: an operation detection module 301, an information acquisition module 302, and an animation display module 303.
The operation detection module 301 is configured to acquire operation information matched with a hit operation after detecting the hit operation performed by the user on the target scene object; the information acquisition module 302 is configured to acquire, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object; and the animation display module 303 is configured to generate and display the crushing process animation matched with the target scene object according to the object crushing information.
The embodiment of the invention provides a scene object display apparatus. After a hit operation performed by the user on a target scene object is detected, operation information matched with the hit operation is acquired; object crushing information matched with the hit operation is then acquired, according to the operation information, from the object crushing information preset for the target scene object; and a crushing process animation matched with the target scene object is generated and displayed according to the object crushing information. Because the object crushing information is selected according to the user's hit operation and the crushing process animation is generated from it, a realistic crushing process of the target scene object is simulated after it is hit, the user sees that crushing process, the realism of interaction between the user and the game scene is improved, and the user experience is improved.
On the basis of the above embodiments, the operation information may include: hit location and hit direction.
On the basis of the above embodiments, the object crushing information may include: dynamic crushing animation and static crushing animation.
On the basis of the above embodiments, the scene object display apparatus may further include: an operation information generation module, used for generating at least one piece of preset operation information according to the three-dimensional model of the target scene object; and an animation establishing module, used for establishing, through the rigid-body part of a physics engine, the dynamic crushing animation and static crushing animation of the target scene object matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information.
On the basis of the above embodiments, the information acquisition module 302 may include: an information matching unit, used for acquiring, from the preset operation information for the target scene object, the target preset operation information matched with the operation information; and an animation acquisition unit, used for acquiring the dynamic crushing animation and static crushing animation of the target scene object matched with the target preset operation information as the dynamic crushing animation and static crushing animation matched with the hit operation.
On the basis of the above embodiments, the animation display module 303 may include: and the animation display unit is used for sequentially displaying the dynamic crushing animation and the static crushing animation matched with the hit operation according to preset animation display logic.
On the basis of the above embodiments, the method for displaying scene objects is performed by a preset scene object display plug-in.
The display device for the scene objects can execute the display method for the scene objects provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the display method for the scene objects.
Example IV
Fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. Fig. 4 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 4 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 4, the electronic device 12 is in the form of a general-purpose electronic device. Components of the electronic device 12 may include, but are not limited to: one or more processors 16, a memory 28, a bus 18 that connects the various system components, including the memory 28 and the processor 16. The processor 16 includes, but is not limited to, an AI processor.
Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard disk drive"). Although not shown in fig. 4, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
The electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), one or more devices that enable a user to interact with the electronic device 12, and/or any devices (e.g., network card, modem, etc.) that enable the electronic device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through a network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 over the bus 18. It should be appreciated that although not shown in fig. 4, other hardware and/or software modules may be used in connection with electronic device 12, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processor 16 of the electronic device 12 executes programs stored in the memory 28 to perform various functional applications and data processing, such as implementing the scene object display method provided by embodiments of the present invention. The method specifically comprises the following steps: after detecting a hit operation performed by a user on a target scene object, acquiring operation information matched with the hit operation; acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object; and generating and displaying a crushing process animation matched with the target scene object according to the object crushing information.
Example five
The fifth embodiment of the present invention further provides a computer readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the scene object display method provided by the embodiments of the present invention. The method specifically comprises the following steps: after detecting a hit operation performed by a user on a target scene object, acquiring operation information matched with the hit operation; acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object; and generating and displaying a crushing process animation matched with the target scene object according to the object crushing information.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
The computer program code for carrying out operations of the present invention may be written in one or more programming languages, including object oriented programming languages such as Java, Smalltalk, C++, Ruby, and Go, conventional procedural programming languages such as the "C" programming language or similar languages, and computer languages used for AI algorithms. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (7)

1. A method for displaying scene objects, comprising:
after detecting the hit operation of a user on a target scene object, acquiring operation information matched with the hit operation;
acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object;
generating and displaying a crushing process animation matched with the target scene object according to the object crushing information;
wherein the operation information includes: a hit location and a hit direction; and the object crushing information includes: a dynamic crushing animation and a static crushing animation;
and wherein generating and displaying the crushing process animation matched with the target scene object according to the object crushing information comprises:
sequentially displaying the dynamic crushing animation and the static crushing animation matched with the hit operation according to preset animation display logic.
2. The method of claim 1, further comprising, prior to detecting the hit operation performed by the user on the target scene object:
generating at least one piece of preset operation information according to the three-dimensional model of the target scene object;
and establishing, through a rigid-body part of a physics engine, a dynamic crushing animation and a static crushing animation of the target scene object matched with each piece of preset operation information according to the three-dimensional model of the target scene object and that piece of preset operation information.
3. The method according to claim 2, wherein acquiring, according to the operation information, the object crushing information matched with the hit operation from the object crushing information preset for the target scene object comprises:
acquiring, from the preset operation information for the target scene object, target preset operation information matched with the operation information;
and acquiring dynamic crushing animation and static crushing animation of the target scene object matched with the target preset operation information as dynamic crushing animation and static crushing animation matched with the hit operation.
4. The method according to any one of claims 1-3, wherein the method is performed by a preset scene object display plug-in.
5. A display device for scene objects, comprising:
the operation detection module is used for acquiring operation information matched with the hit operation after detecting the hit operation of a user on the target scene object;
the information acquisition module is used for acquiring, according to the operation information, object crushing information matched with the hit operation from object crushing information preset for the target scene object;
the animation display module is used for generating and displaying the crushing process animation matched with the target scene object according to the object crushing information;
the operation information includes: hit location and hit direction; the object crushing information includes: dynamic crushing animation and static crushing animation;
the animation display module comprises: and the animation display unit is used for sequentially displaying the dynamic crushing animation and the static crushing animation matched with the hit operation according to preset animation display logic.
6. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the scene object display method as recited in any one of claims 1-4.
7. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the scene object display method according to any one of claims 1-4.
CN201911304420.8A 2019-12-17 2019-12-17 Scene object display method and device, electronic equipment and storage medium Active CN110992453B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911304420.8A CN110992453B (en) 2019-12-17 2019-12-17 Scene object display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911304420.8A CN110992453B (en) 2019-12-17 2019-12-17 Scene object display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110992453A CN110992453A (en) 2020-04-10
CN110992453B true CN110992453B (en) 2024-01-23

Family

ID=70094864

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911304420.8A Active CN110992453B (en) 2019-12-17 2019-12-17 Scene object display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110992453B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111603774B (en) * 2020-06-29 2023-09-26 网易(杭州)网络有限公司 Method and device for crushing virtual resources in game
CN111760273B (en) * 2020-07-06 2024-01-30 网易(杭州)网络有限公司 Game fragment processing method, device and equipment


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8556720B2 (en) * 2008-01-14 2013-10-15 Disney Enterprises, Inc. System and method for touchscreen video game combat

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1254609A (en) * 1999-12-03 2000-05-31 陈少元 Simulative laser shooting system
CN1348829A (en) * 2001-11-07 2002-05-15 陈少元 Multi-target multi-shooter simulated laser shooting system
TW200843822A (en) * 2007-05-04 2008-11-16 Int Games System Co Ltd Equipment for electronic game
JP2015054017A (en) * 2013-09-11 2015-03-23 株式会社平和 Game machine
US9827496B1 (en) * 2015-03-27 2017-11-28 Electronics Arts, Inc. System for example-based motion synthesis
CN106714918A (en) * 2015-08-18 2017-05-24 洪国际有限公司 Method, apparatus, and computer program stored in computer-readable medium for providing dart game match-up mode with virtual player
CN109414617A (en) * 2016-08-31 2019-03-01 株式会社万代 Game device, game article and program
CN107019909A (en) * 2017-04-13 2017-08-08 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and computer-readable recording medium
CN107908281A (en) * 2017-11-06 2018-04-13 北京小米移动软件有限公司 Virtual reality exchange method, device and computer-readable recording medium
CN208065755U (en) * 2017-11-21 2018-11-09 深圳市至高乐商业管理有限公司 A kind of sound amusement facility
CN108416830A (en) * 2018-02-26 2018-08-17 网易(杭州)网络有限公司 Animation display control method, device, equipment and storage medium
WO2019205793A1 (en) * 2018-04-24 2019-10-31 网易(杭州)网络有限公司 Information processing method and device in virtual reality game, and processor
CN109200582A (en) * 2018-08-02 2019-01-15 腾讯科技(深圳)有限公司 The method, apparatus and storage medium that control virtual objects are interacted with ammunition
CN109107154A (en) * 2018-08-02 2019-01-01 腾讯科技(深圳)有限公司 Virtual item control method for movement, device, electronic device and storage medium
CN109107160A (en) * 2018-08-27 2019-01-01 广州要玩娱乐网络技术股份有限公司 Animation exchange method, device, computer storage medium and terminal
CN110062271A (en) * 2019-04-28 2019-07-26 腾讯科技(成都)有限公司 Method for changing scenes, device, terminal and storage medium

Also Published As

Publication number Publication date
CN110992453A (en) 2020-04-10

Similar Documents

Publication Publication Date Title
US9766707B2 (en) Method for using the GPU to create haptic friction maps
CN110969687B (en) Collision detection method, device, equipment and medium
US20130155106A1 (en) Method and system for coordinating collisions between augmented reality and real reality
US20080231631A1 (en) Image processing apparatus and method of controlling operation of same
CN110473293B (en) Virtual object processing method and device, storage medium and electronic equipment
CN110992453B (en) Scene object display method and device, electronic equipment and storage medium
CN109685884A (en) A kind of three-dimensional modeling method and system based on virtual reality
CN111481923B (en) Rocker display method and device, computer storage medium and electronic equipment
CN114387400A (en) Three-dimensional scene display method, display device, electronic equipment and server
CN106984044B (en) Method and equipment for starting preset process
CN110215686B (en) Display control method and device in game scene, storage medium and electronic equipment
EP3956752B1 (en) Semantic-augmented artificial-reality experience
CN113178017A (en) AR data display method and device, electronic equipment and storage medium
CN110227255B (en) Interactive control method and device for virtual container in VR game and electronic device
CN111080756B (en) Interactive animation generation method, device, equipment and medium
CN112156467A (en) Control method and system of virtual camera, storage medium and terminal equipment
EP3373250B1 (en) Method and portable electronic device for changing graphics processing resolution based on scenario
CN109739648B (en) Animation playing control method, device, equipment and storage medium
CN109461203B (en) Gesture three-dimensional image generation method and device, computer equipment and storage medium
CN113694522B (en) Method and device for processing crushing effect, storage medium and electronic equipment
CN113694518B (en) Freezing effect processing method and device, storage medium and electronic equipment
CN111063012A (en) Animation character display method and device, electronic equipment and storage medium
CN114092608A (en) Expression processing method and device, computer readable storage medium and electronic equipment
CN112230766A (en) Shopping method and system based on combination of AR and AI
CN113112613B (en) Model display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant