CN115713586A - Method and device for generating fragmentation animation and storage medium - Google Patents


Info

Publication number
CN115713586A
Authority
CN
China
Prior art keywords
color, animation, fragmentation, texture, map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211427993.1A
Other languages
Chinese (zh)
Inventor
曾灏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211427993.1A
Publication of CN115713586A
Legal status: Pending

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method, an apparatus, and a storage medium for generating a fragmentation animation. The method includes: acquiring a crack map and a target model; cutting the target model based on the crack map to obtain a fragment model; respectively performing texture sampling on the target model and the fragment model to obtain a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model; and generating a three-dimensional fragmentation animation based on the crack map, the first texture coordinate, and the second texture coordinate. The invention solves the technical problem in the prior art that the poor display quality of fragmentation effects leads to a poor user experience.

Description

Method and device for generating fragmentation animation and storage medium
Technical Field
The invention relates to the field of three-dimensional animation production, in particular to a method and a device for generating a fragmentation animation and a storage medium.
Background
With advances in technology and the popularization of intelligent terminals, users have increasingly high expectations for the display effects shown on terminal screens. In the prior art, however, fragmentation (shattering) effects are displayed poorly on terminals, which results in a poor user experience.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
At least some embodiments of the present invention provide a method, an apparatus, and a storage medium for generating a fragmentation animation, so as to at least solve the technical problem in the prior art that the poor display quality of fragmentation effects leads to a poor user experience.
According to an embodiment of the present invention, a method for generating a fragmentation animation is provided, including: acquiring a crack map and a target model; cutting the target model based on the crack map to obtain a fragment model; respectively performing texture sampling on the target model and the fragment model to obtain a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model; and generating a three-dimensional fragmentation animation based on the crack map, the first texture coordinate, and the second texture coordinate.
According to an embodiment of the present invention, an apparatus for generating a fragmentation animation is provided, including: an acquisition module, configured to acquire a crack map and a target model; a cutting module, configured to cut the target model based on the crack map to obtain a fragment model; a sampling module, configured to respectively perform texture sampling on the target model and the fragment model to obtain a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model; and a generation module, configured to generate a three-dimensional fragmentation animation based on the crack map, the first texture coordinate, and the second texture coordinate.
According to an embodiment of the present invention, there is provided a non-volatile storage medium having a computer program stored therein, wherein the computer program is configured to be executed by a processor to perform the fragmentation animation generation method in the embodiment of the present invention.
According to an embodiment of the present invention, there is provided an electronic apparatus including a memory and a processor, the memory storing a computer program therein, the processor being configured to execute the computer program to perform the fragmentation animation generation method in the embodiment of the present invention.
In at least some embodiments of the invention, a crack map and a target model are acquired; the target model is cut based on the crack map to obtain a fragment model; texture sampling is respectively performed on the target model and the fragment model to obtain a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model; and a three-dimensional fragmentation animation is generated based on the crack map, the first texture coordinate, and the second texture coordinate. This achieves the purpose of generating a three-dimensional fragmentation animation, thereby improving the artistic quality of the fragmentation effect and solving the technical problem in the prior art that the poor display quality of fragmentation effects leads to a poor user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a block diagram of a hardware configuration of a mobile terminal for a fragmentation animation generation method according to an embodiment of the present invention;
FIG. 2 is a flow diagram of a method of generating a fragmentation animation according to one embodiment of the invention;
FIG. 3 is a diagram illustrating a target model in a fragmentation animation generation method according to an embodiment of the present invention;
FIG. 4A is an initial map in a method for generating a fragmentation animation according to an embodiment of the present invention;
FIG. 4B is an initial map with partial lighting in a method for generating a fragmentation animation according to an embodiment of the present invention;
FIG. 4C is a crack map in a method for generating a fragmentation animation according to an embodiment of the invention;
FIG. 5 is a texture map of a target model in a method for generating a fragmentation animation according to an embodiment of the present invention;
FIG. 6 is a texture diagram of a fragment model in a method for generating a fragmentation animation according to an embodiment of the present invention;
FIG. 7 is a three-dimensional fragmentation animation of a fragment model in a method of generating a fragmentation animation according to an embodiment of the invention;
FIG. 8 is a block diagram of a fragmentation animation generation apparatus according to one embodiment of the present invention;
fig. 9 is a schematic diagram of an electronic device according to an embodiment of the invention.
Detailed Description
In order to make those skilled in the art better understand the technical solutions of the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with one embodiment of the present invention, an embodiment of a method for generating a fragmentation animation is provided. It should be noted that the steps illustrated in the flowcharts of the figures may be executed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one described here.
The method embodiments may be executed on a mobile terminal, a computer terminal, or a similar computing device. Taking execution on a mobile terminal as an example, the mobile terminal may be a terminal device such as a smartphone (e.g., an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile internet device (MID), a PAD, or a game console. Fig. 1 is a block diagram of a hardware configuration of a mobile terminal for a fragmentation animation generation method according to an embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), a field-programmable gate array (FPGA), a neural network processing unit (NPU), a tensor processing unit (TPU), an artificial intelligence (AI) processor, and the like) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106 for communication functions, an input/output device 108, and a display device 110. It will be understood by those of ordinary skill in the art that the structure shown in fig. 1 is merely illustrative and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration from that shown in fig. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the fragmentation animation generation method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, that is, implements the fragmentation animation generation method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the above network may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the internet. In another example, the transmission device 106 may be a radio frequency (RF) module, which is used to communicate with the internet wirelessly.
The input of the input/output device 108 may come from one or more human interface devices (HIDs), for example, a keyboard and mouse, a gamepad, or other dedicated game controllers (such as a steering wheel, a fishing rod, a dance mat, or a remote controller). Some human interface devices may also provide output functions in addition to input functions, such as force feedback and vibration of a gamepad or audio output of a controller.
The display device 110 may be, for example, a head-up display (HUD), a touch-screen liquid crystal display (LCD), or a touch display (also referred to as a "touch screen" or "touch display screen"). The liquid crystal display may enable a user to interact with a user interface of the mobile terminal. In some embodiments, the mobile terminal has a graphical user interface (GUI), and the user may interact with the GUI through finger contacts and/or gestures on a touch-sensitive surface. The human-machine interaction functions here optionally include interactions such as creating web pages, drawing, word processing, making electronic documents, gaming, video conferencing, instant messaging, e-mailing, call interfacing, playing digital video, playing digital music, and/or web browsing. The executable instructions for performing these human-machine interaction functions are configured/stored in one or more processor-executable computer program products or readable storage media.
The method for generating the fragmentation animation in one embodiment of the disclosure can run on a local terminal device or on a server. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, where the cloud interaction system includes the server and a client device.
In a possible implementation, an embodiment of the present invention provides a method for generating a fragmentation animation. Fig. 2 is a flowchart of a method for generating a three-dimensional fragmentation animation according to an embodiment of the present invention. As shown in fig. 2, the method includes the following steps:
and step S202, acquiring a crack map and a target model.
Specifically, the crack map may depict the crack pattern of a fragile article, for example, the crack pattern of broken glass or broken crystal. The target model is a three-dimensional model corresponding to the crack map. For example, if the crack map is a glass crack map, the target model may be a glass model, such as a model of a pane of glass or a model of a glass product, e.g., a wine glass or a table top.
As an alternative embodiment, in the case where the target model is a model of a pane of glass, the target model may be as shown in fig. 3.
Optionally, generating the crack map corresponding to the three-dimensional fragmentation animation includes: constructing cracks in different directions to generate an initial map; and performing lighting processing on the initial map to obtain the crack map. For example, once the three-dimensional fragmentation animation to be achieved is determined, crack textures in different directions are first constructed according to that animation, yielding an initial map as shown in fig. 4A, in which the white lines represent the drawn cracks.
Optionally, performing lighting processing on the initial map to obtain the crack map includes: determining a target fragment with a preset shape in the initial map; and performing lighting processing on part of the region where the target fragment is located to obtain the crack map. The fragments in the initial map are the regions bounded by the cracks running in different directions, and the preset shape may be a shape predefined by an artist.
As an alternative embodiment, after the initial map is obtained, lighting processing is applied to it so that a lighting effect is added on top of the crack lines, which enhances the stereoscopic impression of the crack texture. As shown in fig. 4B, the lower part of fig. 4B is a portion to which the lighting effect has already been added, while the upper part is the initial map without the lighting effect. After the lighting effect is added to the entire initial map shown in fig. 4A, the crack map shown in fig. 4C is obtained. A small sketch of this kind of lighting pass follows.
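The sketch below is a minimal NumPy illustration of adding a directional lighting effect to a grayscale crack mask so the crack lines read as embossed. The file name crack_mask.png, the light direction, and the gradient-based shading are assumptions made for the example; the patent does not specify how the lighting is authored.

```python
import numpy as np
from PIL import Image

def add_crack_lighting(mask, light_dir=(1.0, -1.0), strength=0.5):
    """mask: HxW float array in [0, 1], bright lines = cracks."""
    d = np.asarray(light_dir, dtype=float)
    d /= np.linalg.norm(d)                      # normalized 2D light direction
    gy, gx = np.gradient(mask)                  # slope of the crack lines
    shading = gx * d[0] + gy * d[1]             # positive = lit side, negative = shadowed side
    return np.clip(mask + strength * shading, 0.0, 1.0)

if __name__ == "__main__":
    mask = np.asarray(Image.open("crack_mask.png").convert("L"), dtype=np.float32) / 255.0
    Image.fromarray((add_crack_lighting(mask) * 255).astype(np.uint8)).save("crack_map.png")
```

Raising or lowering the strength parameter would correspond to a stronger or weaker embossed look on the crack texture.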
And step S204, cutting the target model based on the crack map to obtain a fragment model.
In an alternative embodiment, the target model may be cut along the crack paths in the crack map, so as to obtain a plurality of fragment models; one possible way of doing this is sketched below.
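The following sketch shows one way, under stated assumptions, to bucket the triangles of the target mesh into fragments using the crack map: the regions between crack lines are labeled in UV space, and each triangle is assigned to the region containing its UV centroid. The array layout (uvs, faces), the 0.5 crack threshold, and the use of scipy.ndimage.label are illustrative choices, not the patent's exact cutting procedure.

```python
import numpy as np
from scipy import ndimage

def split_by_crack_map(crack_mask, uvs, faces):
    """crack_mask: HxW array in [0, 1], bright = crack line.
    uvs: (V, 2) per-vertex UVs in [0, 1]. faces: (F, 3) vertex indices.
    Returns a fragment id per triangle and the number of fragments."""
    regions, n_fragments = ndimage.label(crack_mask < 0.5)       # areas enclosed by the cracks
    h, w = crack_mask.shape
    centroids = uvs[faces].mean(axis=1)                          # (F, 2) UV centroid per triangle
    col = np.clip((centroids[:, 0] * (w - 1)).astype(int), 0, w - 1)
    row = np.clip(((1.0 - centroids[:, 1]) * (h - 1)).astype(int), 0, h - 1)  # flip V to row index
    return regions[row, col], n_fragments
```

Triangles sharing a fragment id can then be emitted as one fragment mesh; triangles straddling a crack line would in practice need to be split geometrically, which this sketch leaves out.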
Step S206, texture sampling is respectively carried out on the target model and the fragment model, and a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model are obtained.
In an optional embodiment, the first texture coordinates corresponding to the target model represent the UV coordinates of the target model, and the second texture coordinates corresponding to the fragment model represent the UV coordinates of the fragment model.
As an alternative embodiment, after the target model and the fragment model are obtained, glass texture sampling (Texture Sample) may be performed on the fragment model to obtain a texture map of the glass fragments as shown in fig. 5, in which the small irregular fragments represent the texture of the fragment model. Further, texture sampling may be performed on the target model to obtain a texture map of the target model as shown in fig. 6.
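The texture sampling used in this step and the following ones can be pictured as a bilinear lookup at the UV coordinates. The helper below is a minimal NumPy stand-in for the engine's Texture Sample node; the HxWxC array layout and the V-flip convention are assumptions made for the example.

```python
import numpy as np

def sample_texture(tex, uv):
    """tex: HxWxC float array. uv: (N, 2) coordinates in [0, 1]. Returns (N, C) colors."""
    h, w = tex.shape[:2]
    x = np.clip(uv[:, 0] * (w - 1), 0, w - 1)
    y = np.clip((1.0 - uv[:, 1]) * (h - 1), 0, h - 1)            # flip V so v = 0 is the bottom row
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = np.minimum(x0 + 1, w - 1), np.minimum(y0 + 1, h - 1)
    fx, fy = (x - x0)[:, None], (y - y0)[:, None]
    top = tex[y0, x0] * (1 - fx) + tex[y0, x1] * fx              # blend along U on the upper row
    bot = tex[y1, x0] * (1 - fx) + tex[y1, x1] * fx              # blend along U on the lower row
    return top * (1 - fy) + bot * fy                             # blend along V
```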
And step S208, generating a three-dimensional fragmentation animation based on the crack map, the first texture coordinate and the second texture coordinate.
As an alternative implementation, texture sampling may be performed on the crack map using the first texture coordinate to obtain the color of the R channel of the crack map, that is, a first channel color, which may be used as the self-luminous (emissive) color of the fragment model. A preset ablation map may be texture-sampled using the second texture coordinate to obtain the color of its R channel, that is, a second channel color. A preset refraction map (the noise map Noise) may be texture-sampled using the first texture coordinate to obtain the color of its R channel, that is, a third channel color. After the second channel color is obtained, it may be processed with the ablation function MF_Dissolve to obtain an ablation result; the first channel color is then multiplied by the ablation result, and the product is used as the opacity of the fragment model, that is, a first target color.
Then, the ablation result is multiplied by the third channel color to obtain a second target color, which may be used as the refraction of the fragment model. A sketch of this channel math is given below.
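The sketch below condenses this per-pixel channel math. It assumes the sample_texture helper from the previous sketch is in scope, and the smoothstep-style dissolve driven by an animation parameter t is only an assumed stand-in for the MF_Dissolve material function, whose internals the text does not describe.

```python
import numpy as np
# assumes sample_texture(tex, uv) from the earlier sketch is defined in the same module

def dissolve(channel, t, edge=0.1):
    """Soft 0..1 ablation mask: values above a moving threshold survive."""
    x = np.clip((channel - (t - edge)) / (2 * edge), 0.0, 1.0)
    return x * x * (3 - 2 * x)                                   # smoothstep falloff

def shade_fragment(crack_map, ablation_map, refraction_map, uv1, uv2, t):
    crack_r = sample_texture(crack_map, uv1)[:, 0]               # first channel color (R)
    ablate_r = sample_texture(ablation_map, uv2)[:, 0]           # second channel color (R)
    noise_r = sample_texture(refraction_map, uv1)[:, 0]          # third channel color (R)
    result = dissolve(ablate_r, t)                               # ablation result
    emissive = crack_r                                           # map color / self-luminous color
    opacity = crack_r * result                                   # first target color
    refraction = result * noise_r                                # second target color
    return emissive, opacity, refraction
```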
Finally, vertex animation processing is performed on the fragment model based on the map color, the first target color, the second texture coordinate, and an offset degree, so as to obtain the three-dimensional fragmentation animation.
As an optional implementation, the three-dimensional fragmentation animation generated by the method provided by the invention achieves an improved three-dimensional artistic effect, as shown in fig. 7.
In the embodiment of the invention, after the crack map and the target model are obtained, the target model is cut based on the crack map to obtain a three-dimensional fragment model, which makes the subsequently generated three-dimensional fragmentation animation more vivid. The target model and the fragment model are then sampled to obtain the first texture coordinate and the second texture coordinate, and finally a three-dimensional fragmentation animation with a stronger stereoscopic impression is rendered based on the crack map, the first texture coordinate, and the second texture coordinate. This improves the user experience and solves the technical problem in the prior art that the poor display quality of fragmentation effects leads to a poor user experience.
Optionally, generating the three-dimensional fragmentation animation based on the crack map, the first texture coordinate, and the second texture coordinate includes: performing texture sampling on the crack map using the first texture coordinate to obtain a map color of the crack map; and performing vertex animation processing on the fragment model based on the map color and the second texture coordinate to obtain the three-dimensional fragmentation animation.
As an alternative embodiment, the map color may be the color of the R channel obtained by texture-sampling the crack map with the first texture coordinate, and it may serve as the self-luminous color of the fragment model.
In this optional embodiment, texture sampling is performed based on the first texture coordinate and the crack map to obtain the map color, which facilitates the subsequent vertex animation processing based on the map color and the second texture coordinate and improves the efficiency of generating the three-dimensional fragmentation animation.
Optionally, performing vertex animation processing on the fragment model based on the map color and the second texture coordinate to obtain the three-dimensional fragmentation animation includes: acquiring a control offset degree, where the control offset degree is used to control the degree of offset of the fragment model; and performing vertex animation processing on the fragment model based on the map color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
As an alternative embodiment, the above control offset degree may be a WPO (World Position Offset) parameter, that is, a world-space position offset applied to the fragment model. A sketch of this vertex animation step follows.
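The sketch below illustrates, under stated assumptions, a World Position Offset style vertex animation in which every fragment is pushed rigidly away from the model's center by the control offset degree. The outward-from-center direction and the per-fragment fallback randomness are illustrative choices; the patent only states that the parameter controls the fragments' position offset.

```python
import numpy as np

def fragment_wpo(vertices, fragment_ids, offset_degree, seed=0):
    """vertices: (V, 3) positions. fragment_ids: (V,) fragment id per vertex.
    Returns the offset vertex positions for one animation frame."""
    rng = np.random.default_rng(seed)
    center = vertices.mean(axis=0)
    out = vertices.copy()
    for fid in np.unique(fragment_ids):
        sel = fragment_ids == fid
        direction = vertices[sel].mean(axis=0) - center          # push away from the model center
        norm = np.linalg.norm(direction)
        direction = direction / norm if norm > 1e-6 else rng.normal(size=3)
        out[sel] += offset_degree * direction                    # the whole fragment moves rigidly
    return out

# Stepping offset_degree from 0 upward over successive frames yields the
# three-dimensional fragmentation motion of the fragment model.
```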
Optionally, performing vertex animation processing on the fragment model based on the map color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation includes: performing texture sampling on the crack map using the first texture coordinate to obtain a first channel color, where the first channel color is the color of a first preset channel of the crack map; performing texture sampling on a preset ablation map using the second texture coordinate to obtain a second channel color, where the second channel color is the color of a second preset channel of the preset ablation map; processing the second channel color with an ablation function to generate an ablation result; obtaining the product of the ablation result and the first channel color to obtain a first target color; and performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
As an alternative, the first preset channel and the second preset channel may both be R channels. The ablation map may be a map used to control the ablation effect, and the first target color may be used as the opacity of the fragment model.
Optionally, performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation includes: performing texture sampling on a preset refraction map based on the first texture coordinate to obtain a third channel color, where the third channel color is the color of a third preset channel of the preset refraction map; obtaining the product of the ablation result and the third channel color to obtain a second target color; and performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
As an optional embodiment, the third preset channel is also an R channel. The preset refraction map is used to retain the grayscale (black, white, and gray) information of the model and to compute the refraction of the fragment model. The second target color may be used as the refraction of the fragment model.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but the former is generally the preferred implementation. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
In this embodiment, an apparatus for generating a fragmentation animation is further provided. The apparatus is used to implement the foregoing embodiments and preferred implementations, and what has already been described is not repeated. As used below, the terms "unit" and "module" may refer to a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 8 is a block diagram of a fragmentation animation generation apparatus according to an embodiment of the present invention, and as shown in fig. 8, the apparatus includes:
and an obtaining module 82, configured to obtain the crack map and the target model.
And a cutting module 84, configured to cut the target model based on the crack map to obtain a fragment model.
And the sampling module 86 is configured to perform texture sampling on the target model and the fragment model respectively to obtain a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model.
A generating module 88 for generating a three-dimensional fracture animation based on the crack map, the first texture coordinates, and the second texture coordinates.
In at least some embodiments of the invention, after the crack map and the target model are obtained, the target model is cut based on the crack map to obtain a three-dimensional fragment model, which makes the three-dimensional fragmentation animation more vivid. The target model and the fragment model are then sampled to obtain the first texture coordinate and the second texture coordinate, and finally rendering is performed based on the crack map, the first texture coordinate, and the second texture coordinate to obtain a three-dimensional fragmentation animation with a stronger stereoscopic impression. This improves the user experience and solves the technical problem in the prior art that the poor display quality of fragmentation effects leads to a poor user experience.
Optionally, the generation module includes: a sampling unit, configured to perform texture sampling on the crack map using the first texture coordinate to obtain a map color of the crack map; and an animation processing unit, configured to perform vertex animation processing on the fragment model based on the map color and the second texture coordinate to obtain the three-dimensional fragmentation animation.
Optionally, the animation processing unit includes: an acquisition subunit, configured to acquire a control offset degree, where the control offset degree is used to control the degree of offset of the fragment model; and an animation processing subunit, configured to perform vertex animation processing on the fragment model based on the map color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, the animation processing subunit is further configured to: perform texture sampling on the crack map using the first texture coordinate to obtain a first channel color, where the first channel color is the color of a first preset channel of the crack map; perform texture sampling on a preset ablation map using the second texture coordinate to obtain a second channel color, where the second channel color is the color of a second preset channel of the preset ablation map; process the second channel color with an ablation function to generate an ablation result; obtain the product of the ablation result and the first channel color to obtain a first target color; and perform vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, the animation processing subunit is further configured to: perform texture sampling on a preset refraction map based on the first texture coordinate to obtain a third channel color, where the third channel color is the color of a third preset channel of the preset refraction map; obtain the product of the ablation result and the third channel color to obtain a second target color; and perform vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, the acquisition module includes: a construction unit, configured to construct cracks in different directions to generate an initial map; and an illumination unit, configured to perform lighting processing on the initial map to obtain the crack map.
Optionally, the illumination unit includes: a determining subunit, configured to determine a target fragment with a preset shape in the initial map; and a processing subunit, configured to perform lighting processing on part of the region where the target fragment is located to obtain the crack map.
Embodiments of the present invention also provide a non-volatile storage medium having a computer program stored therein, wherein the computer program is configured to perform the steps of any of the above method embodiments when executed.
Optionally, in this embodiment, the nonvolatile storage medium may include, but is not limited to: various media capable of storing computer programs, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Optionally, in this embodiment, the nonvolatile storage medium may be located in any one of computer terminals in a computer terminal group in a computer network, or in any one of mobile terminals in a mobile terminal group.
Optionally, in this embodiment, the above nonvolatile storage medium may be configured to store a computer program for executing the following steps: performing texture sampling on the crack map using the first texture coordinate to obtain a map color of the crack map; and performing vertex animation processing on the fragment model based on the map color and the second texture coordinate to obtain the three-dimensional fragmentation animation.
Optionally, in this embodiment, the above nonvolatile storage medium may be configured to store a computer program for executing the following steps: acquiring a control offset degree, where the control offset degree is used to control the degree of offset of the fragment model; and performing vertex animation processing on the fragment model based on the map color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, in this embodiment, the above nonvolatile storage medium may be configured to store a computer program for executing the following steps: performing texture sampling on the crack map using the first texture coordinate to obtain a first channel color, where the first channel color is the color of a first preset channel of the crack map; performing texture sampling on a preset ablation map using the second texture coordinate to obtain a second channel color, where the second channel color is the color of a second preset channel of the preset ablation map; processing the second channel color with an ablation function to generate an ablation result; obtaining the product of the ablation result and the first channel color to obtain a first target color; and performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, in this embodiment, the above nonvolatile storage medium may be configured to store a computer program for executing the following steps: performing texture sampling on a preset refraction map based on the first texture coordinate to obtain a third channel color, where the third channel color is the color of a third preset channel of the preset refraction map; obtaining the product of the ablation result and the third channel color to obtain a second target color; and performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, in this embodiment, the above nonvolatile storage medium may be configured to store a computer program for executing the following steps: constructing cracks in different directions to generate an initial map; and performing lighting processing on the initial map to obtain the crack map.
Optionally, in this embodiment, the above nonvolatile storage medium may be configured to store a computer program for executing the following steps: determining a target fragment with a preset shape in the initial map; and performing lighting processing on part of the region where the target fragment is located to obtain the crack map.
With the nonvolatile storage medium of this embodiment, after the crack map and the target model are obtained, the target model is cut based on the crack map to obtain a three-dimensional fragment model, which makes the three-dimensional fragmentation animation more vivid. The target model and the fragment model are then sampled to obtain a first texture coordinate and a second texture coordinate, and finally a three-dimensional fragmentation animation with a stronger stereoscopic impression is rendered based on the crack map, the first texture coordinate, and the second texture coordinate. This improves the user experience and solves the technical problem in the prior art that the poor display quality of fragmentation effects leads to a poor user experience.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a computer-readable storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) execute the method according to the embodiment of the present invention.
In an exemplary embodiment of the present invention, a program product capable of implementing the above-described method of the present embodiment is stored on a computer-readable storage medium. In some possible implementations, various aspects of the embodiments of the present invention may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary implementations of the present invention described in the above section "exemplary method" of this embodiment, when the program product is run on the terminal device.
The program product for implementing the above method may employ a portable compact disc read-only memory (CD-ROM) containing program code and may run on a terminal device, such as a personal computer. However, the program product of the embodiments of the invention is not limited thereto; in the embodiments of the invention, the computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product described above may employ any combination of one or more computer-readable media. The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the program code embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, obtaining a crack map and a target model;
s2, cutting the target model based on the crack mapping to obtain a fragment model;
s3, texture sampling is respectively carried out on the target model and the fragment model, and a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model are obtained;
and S4, generating a three-dimensional fragmentation animation based on the crack map, the first texture coordinate and the second texture coordinate.
Optionally, the processor may be further configured to execute the following steps by means of the computer program: performing texture sampling on the crack map using the first texture coordinate to obtain a map color of the crack map; and performing vertex animation processing on the fragment model based on the map color and the second texture coordinate to obtain the three-dimensional fragmentation animation.
Optionally, the processor may be further configured to execute the following steps by means of the computer program: acquiring a control offset degree, where the control offset degree is used to control the degree of offset of the fragment model; and performing vertex animation processing on the fragment model based on the map color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, the processor may be further configured to execute the following steps by means of the computer program: performing texture sampling on the crack map using the first texture coordinate to obtain a first channel color, where the first channel color is the color of a first preset channel of the crack map; performing texture sampling on a preset ablation map using the second texture coordinate to obtain a second channel color, where the second channel color is the color of a second preset channel of the preset ablation map; processing the second channel color with an ablation function to generate an ablation result; obtaining the product of the ablation result and the first channel color to obtain a first target color; and performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, the processor may be further configured to execute the following steps by means of the computer program: performing texture sampling on a preset refraction map based on the first texture coordinate to obtain a third channel color, where the third channel color is the color of a third preset channel of the preset refraction map; obtaining the product of the ablation result and the third channel color to obtain a second target color; and performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
Optionally, the processor may be further configured to execute the following steps by means of the computer program: constructing cracks in different directions to generate an initial map; and performing lighting processing on the initial map to obtain the crack map.
Optionally, the processor may be further configured to execute the following steps by means of the computer program: determining a target fragment with a preset shape in the initial map; and performing lighting processing on part of the region where the target fragment is located to obtain the crack map.
With the electronic device of this embodiment, after the crack map and the target model are obtained, the target model is cut based on the crack map to obtain a three-dimensional fragment model, which makes the three-dimensional fragmentation animation more vivid. The target model and the fragment model are then sampled to obtain a first texture coordinate and a second texture coordinate, and finally a three-dimensional fragmentation animation with a stronger stereoscopic impression is rendered based on the crack map, the first texture coordinate, and the second texture coordinate. This improves the user experience and solves the technical problem in the prior art that the poor display quality of fragmentation effects leads to a poor user experience.
Fig. 9 is a schematic diagram of an electronic device according to an embodiment of the invention. As shown in fig. 9, the electronic device 900 is only an example and should not bring any limitation to the function and the scope of the application of the embodiment of the present invention.
As shown in fig. 9, the electronic apparatus 900 is embodied in the form of a general purpose computing device. The components of electronic device 900 may include, but are not limited to: the at least one processor 910, the at least one memory 920, the bus 930 connecting the various system components (including the memory 920 and the processor 910), and the display 940.
The memory 920 stores program code that can be executed by the processor 910 to cause the processor 910 to perform the steps according to various exemplary embodiments of the present invention described in the method section above.
The memory 920 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 9201 and/or a cache memory unit 9202, may further include a read only memory unit (ROM) 9203, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
In some examples, memory 920 can also include program/utility 9204 having a set (at least one) of program modules 9205, such program modules 9205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment. The memory 920 may further include memory located remotely from the processor 910 and such remote memory may be coupled to the electronic device 900 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The bus 930 may be one or more of several types of bus structures, including a memory-unit bus or memory-unit controller, a peripheral bus, an accelerated graphics port, a processor bus, or a local bus using any of a variety of bus architectures.
Display 940 may, for example, be a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of electronic device 900.
Optionally, the electronic apparatus 900 may also communicate with one or more external devices 1000 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic apparatus 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic apparatus 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 960. As shown in fig. 9, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown in FIG. 9, other hardware and/or software modules may be used in conjunction with electronic device 900, which may include but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The electronic device 900 may further include: a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power source, and/or a camera.
It will be understood by those skilled in the art that the structure shown in fig. 9 is merely illustrative and does not limit the structure of the electronic device. For example, the electronic device 900 may also include more or fewer components than shown in fig. 9, or have a different configuration from that shown in fig. 9. The memory 920 may be used to store a computer program and corresponding data, such as the computer program and corresponding data for the method for generating a three-dimensional fragmentation animation in an embodiment of the present invention. The processor 910 executes various functional applications and data processing by running the computer program stored in the memory 920, that is, implements the above-described method for generating a three-dimensional fragmentation animation.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present invention, it should be understood that the disclosed technical contents can be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
The foregoing is only a preferred embodiment of the present invention. It should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A method for generating a fragmentation animation, comprising:
acquiring a crack map and a target model;
cutting the target model based on the crack map to obtain a fragment model;
respectively performing texture sampling on the target model and the fragment model to obtain a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model; and
generating a three-dimensional fragmentation animation based on the crack map, the first texture coordinate, and the second texture coordinate.
2. The method of claim 1, wherein generating the three-dimensional fragmentation animation based on the crack map, the first texture coordinate, and the second texture coordinate comprises:
performing texture sampling on the crack map by using the first texture coordinate to obtain a map color of the crack map; and
performing vertex animation processing on the fragment model based on the map color and the second texture coordinate to obtain the three-dimensional fragmentation animation.
3. The method of claim 2, wherein performing vertex animation processing on the fragment model based on the map color and the second texture coordinate to obtain the three-dimensional fragmentation animation comprises:
acquiring a control offset degree, wherein the control offset degree is used for controlling the degree of offset of the fragment model; and
performing vertex animation processing on the fragment model based on the map color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
4. The method of claim 3, wherein performing vertex animation processing on the fragment model based on the map color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation comprises:
performing texture sampling on the crack map by using the first texture coordinate to obtain a first channel color, wherein the first channel color is the color of a first preset channel of the crack map;
performing texture sampling on a preset ablation map by using the second texture coordinate to obtain a second channel color, wherein the second channel color is the color of a second preset channel of the preset ablation map;
processing the second channel color by using an ablation function to generate an ablation result;
obtaining the product of the ablation result and the first channel color to obtain a first target color; and
performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
5. The method of claim 4, wherein performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation comprises:
performing texture sampling on a preset refraction map based on the first texture coordinate to obtain a third channel color, wherein the third channel color is the color of a third preset channel of the preset refraction map;
obtaining the product of the ablation result and the third channel color to obtain a second target color; and
performing vertex animation processing on the fragment model based on the map color, the first target color, the second texture coordinate, and the control offset degree to obtain the three-dimensional fragmentation animation.
6. The method of claim 1, wherein generating the crack map for the three-dimensional fragmentation animation comprises:
constructing cracks in different directions to generate an initial map; and
performing lighting processing on the initial map to obtain the crack map.
7. The method of claim 6, wherein performing lighting processing on the initial map to obtain the crack map comprises:
determining a target fragment with a preset shape in the initial map; and
performing lighting processing on part of the region where the target fragment is located to obtain the crack map.
8. A fragmentation animation generation device, comprising:
the acquisition module is used for acquiring a crack map and a target model;
the cutting module is used for cutting the target model based on the crack map to obtain a fragment model;
the sampling module is used for respectively carrying out texture sampling on the target model and the fragment model to obtain a first texture coordinate corresponding to the target model and a second texture coordinate corresponding to the fragment model;
and the generation module is used for generating a three-dimensional fragmentation animation based on the crack map, the first texture coordinate, and the second texture coordinate.
9. A non-volatile storage medium, in which a computer program is stored, wherein the computer program is arranged to be executed by a processor to perform the method of generating a fragmentation animation as claimed in any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the fragmentation animation generation method of any of claims 1 to 7.
CN202211427993.1A 2022-11-15 2022-11-15 Method and device for generating fragmentation animation and storage medium Pending CN115713586A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211427993.1A CN115713586A (en) 2022-11-15 2022-11-15 Method and device for generating fragmentation animation and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211427993.1A CN115713586A (en) 2022-11-15 2022-11-15 Method and device for generating fragmentation animation and storage medium

Publications (1)

Publication Number Publication Date
CN115713586A true CN115713586A (en) 2023-02-24

Family

ID=85233382

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211427993.1A Pending CN115713586A (en) 2022-11-15 2022-11-15 Method and device for generating fragmentation animation and storage medium

Country Status (1)

Country Link
CN (1) CN115713586A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116091659A (en) * 2023-04-07 2023-05-09 深圳市趣推科技有限公司 Method, device, equipment and medium for simulating image fragmentation caused by stress


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination