CN113111035B - Special effect video generation method and equipment

Special effect video generation method and equipment

Info

Publication number
CN113111035B
Authority
CN
China
Prior art keywords
target file
size
pixel point
effect video
pixel
Prior art date
Legal status
Active
Application number
CN202110382066.1A
Other languages
Chinese (zh)
Other versions
CN113111035A (en)
Inventor
杨华
Current Assignee
Shanghai Zhangmen Science and Technology Co Ltd
Original Assignee
Shanghai Zhangmen Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Zhangmen Science and Technology Co Ltd
Priority to CN202110382066.1A
Publication of CN113111035A
Application granted
Publication of CN113111035B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10 File systems; File servers
    • G06F 16/16 File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F 16/162 Delete operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10 File systems; File servers
    • G06F 16/16 File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F 16/168 Details of user interfaces specifically adapted to file systems, e.g. browsing and visualisation, 2d or 3d GUIs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The embodiment of the application discloses a special effect video generation method and equipment. One embodiment of the method comprises: acquiring the current display content of a target file and generating a bitmap of the current display content; storing the pixel values of the pixel points in the bitmap into a collision data set and setting the size of the pixel points to change with time within a preset time period; generating rendered images based on the collision data set; and generating a shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive. In this embodiment, the shredding special effect is generated from the current display content of the target file, so that the shredding special effect video can be played when the target file is subsequently shredded. This helps the user follow the shredding state of the file, makes the file shredding process more interactive, and improves the user experience.

Description

Special effect video generation method and equipment
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a special effect video generation method and equipment.
Background
When using a terminal device such as a computer or a mobile phone, users often delete content such as applications and files stored on the device in order to make better use of its storage space.
In the prior art, when a large application or file that takes some time to remove is deleted, only a deletion progress bar is presented to the user; the displayed content is sparse and monotonous.
Disclosure of Invention
The embodiment of the application provides a method and equipment for generating a special effect video.
In a first aspect, an embodiment of the present application provides a special effect video generation method, including: acquiring the current display content of a target file, and generating a bitmap of the current display content; storing the pixel values of the pixel points in the bitmap into a collision data set, and setting the size of the pixel points to change with time within a preset time period, wherein a pixel value comprises at least size information; and generating rendered images based on the collision data set, and generating a shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive.
In some embodiments, storing the values of the pixels in the bitmap into a collision dataset comprises: splitting the pixel points in the bitmap into a plurality of sub-pixel points with preset number; and respectively storing the pixel values of the sub-pixels into the collision data set.
In some embodiments, setting the size of the pixel points to change with time within a preset time period includes: in response to the pixel value of at least one historical pixel point already existing in the collision data set when a pixel value is stored, taking the size change ending time of the historical pixel point as the size change starting time of the stored pixel value.
In some embodiments, taking the size change ending time of the historical pixel point as the size change starting time of the pixel value includes: taking, as the size change starting time of the pixel value, the change ending time of the historical pixel point whose size change ending time is closest to the time at which the pixel value is stored.
In some embodiments, the method further comprises: and updating the preset time period in response to the fact that the number of the pixel points which change in size in the same preset time period in the collision data set meets the preset requirement.
In some embodiments, storing a value of a pixel point in the bitmap into a collision data set, and setting a size of the pixel point to change with time within a preset time period, includes: storing the pixel value of the pixel point in the bitmap into a collision data set, and acquiring the size of the pixel point; and determining the change speed according to the size, and setting that the size of the pixel point changes along with time within a preset time period according to the change speed.
In a second aspect, an embodiment of the present application provides a special effect video playing method, including: in response to receiving a shredding instruction sent for a target file, performing a file shredding operation on the target file and playing a shredding special effect video corresponding to the target file; wherein the shredding special effect video is obtained by the special effect video generation method provided in any one of the first aspect.
In some embodiments, playing the shredding special effect video corresponding to the target file includes: jumping to the display interface where the start identifier of the target file is located and reducing the transparency of the start identifier; and playing the shredding special effect video corresponding to the target file on a lower-layer interface of the display interface where the start identifier of the target file is located.
In some embodiments, the method further comprises: updating the current display content according to the deletion progress of the target file; and obtaining an updated shredding special effect video based on the updated current display content; and playing the shredding special effect video corresponding to the target file comprises: playing the updated shredding special effect video.
In a third aspect, an embodiment of the present application provides a special effect video generation apparatus, including: a bitmap generation unit configured to acquire the current display content of a target file and generate a bitmap of the current display content; a data storage unit configured to store the pixel values of the pixel points in the bitmap into a collision data set and to set the size of the pixel points to change with time within a preset time period; and a special effect video generation unit configured to generate rendered images based on the collision data set and to generate a shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive.
In some embodiments, the data storage unit comprises: a pixel point splitting subunit configured to split the pixel points in the bitmap into a preset number of sub-pixel points; and a data storage subunit configured to store the pixel values of the respective sub-pixel points into the collision data set.
In some embodiments, the data storage unit comprises: a data storage subunit configured to store the pixel values of the pixel points in the bitmap into a collision data set; and a pixel point change setting subunit configured to, in response to the pixel value of at least one historical pixel point already existing in the collision data set when a pixel value is stored, take the size change ending time of the historical pixel point as the size change starting time of the stored pixel value.
In some embodiments, the pixel change setting subunit is further configured to, in response to a pixel value of at least one historical pixel already existing in the collision data set when the pixel value is stored, take a change ending time of a pixel value of a historical pixel whose size change ending time is closest to the time at which the pixel value is stored as a size change starting time of the pixel value.
In some embodiments, the apparatus further comprises: and the change time updating unit is configured to respond to the condition that the number of the pixel points which are subjected to size change in the same preset time period in the collision data set meets a preset requirement, and update the preset time period.
In some embodiments, the data storage unit is further configured to store a pixel value of a pixel in the bitmap into the collision data set, and obtain a size of the pixel; and determining the change speed according to the size, and setting the size of the pixel point in a preset time period to change along with time according to the change speed.
In a fourth aspect, an embodiment of the present application provides a special effect video playing apparatus, including: a special effect video playing unit configured to, in response to receiving a shredding instruction sent for a target file, perform a file shredding operation on the target file and play a shredding special effect video corresponding to the target file; wherein the shredding special effect video is generated by the special effect video generation apparatus provided in any one of the third aspect.
In some embodiments, the special effect video playback unit includes: the target file adjusting subunit is configured to jump to a display interface where the starting identifier of the target file is located, and reduce the transparency of the starting identifier; and the special effect video playing subunit is configured to play the crushed special effect video corresponding to the target file on a lower interface of the display interface where the starting identifier of the target file is located.
In some embodiments, the apparatus further comprises: a display content updating unit configured to update the current display content according to the deletion progress of the target file; obtaining an updated and smashed special effect video based on the updated current display content; and the special effect video playing unit is further configured to play the updated shredded special effect video.
In a fifth aspect, an embodiment of the present application provides a computer device, including: one or more processors; and a storage device having one or more programs stored thereon; when executed by the one or more processors, the one or more programs cause the one or more processors to implement the special effect video generation method described in any implementation manner of the first aspect and/or the special effect video playing method described in any implementation manner of the second aspect.
In a sixth aspect, an embodiment of the present application provides a computer-readable medium on which a computer program is stored, where the computer program, when executed by a processor, implements the special effect video generation method described in any implementation manner of the first aspect and/or the special effect video playing method described in any implementation manner of the second aspect.
The special effect video generation method and device provided by the embodiments of the present application acquire the current display content of a target file, generate a bitmap of the current display content, store the pixel values of the pixel points in the bitmap into a collision data set, set the size of the pixel points to change with time within a preset time period, generate rendered images based on the collision data set, and generate a shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive. In this way, the shredding special effect is generated from the current display content of the target file, so that the shredding special effect video can be played when the target file is subsequently shredded, which helps the user follow the shredding state of the file, makes the file shredding process more interactive, and improves the user experience.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture to which some embodiments of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a special effects video generation method according to the present application;
FIG. 3 is a flow diagram of one embodiment of a special effects video playback method according to the present application;
FIGS. 4-1 and 4-2 are diagrams illustrating effects of application scenarios of the special effect video generation method according to the present application;
FIG. 5 is a schematic block diagram of a computer system suitable for use with the computer device of some embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that, in the present application, the embodiments and features of the embodiments may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the special effects video generation methods of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include devices 101, 102, 103, 104 and a network 105. The network 105 is the medium used to provide communication links between the devices 101, 102, 103, 104. The network 105 may include various connection types, such as wired or wireless communication links, or fiber optic cables.
The devices 101, 102, 103, 104 may be hardware devices or software that support network connections to provide various network services. When the device is hardware, it can be a variety of electronic devices including, but not limited to, smart phones, tablets, laptop portable computers, desktop computers, servers, and the like. In this case, the hardware device may be implemented as a distributed device group including a plurality of devices, or may be implemented as a single device. When the device is software, it can be installed in the electronic devices listed above. At this time, as software, it may be implemented as a plurality of software or software modules for providing a distributed service, for example, or as a single software or software module. And is not particularly limited herein.
In practice, a device may provide a corresponding network service by installing a corresponding client application or server application. After the device has installed the client application, it may be embodied as a client in network communications. Accordingly, after the server application is installed, it may be embodied as a server in network communications.
As an example, in fig. 1, the devices 101, 102, 103 are embodied as terminals and the device 104 is embodied as a server. Specifically, the devices 101, 102, 103 may be clients on which the target file is actually installed, and the device 104 may be a server that provides the shredding special effect video for the target file.
In practice, because generating the special effect video from the current display content of the target file occupies few computing resources and places modest demands on computing capability, and in order to make it convenient for the user to play the shredding special effect video when the target file is subsequently shredded, the special effect video generation method provided in the following embodiments of the present application is generally executed by the devices 101, 102, and 103 on which the target file is installed, and the special effect video generation apparatus is accordingly also generally arranged in the devices 101, 102, and 103. In this case, the exemplary system architecture 100 may omit the device 104 embodied as a server and the network 105.
However, it should be noted that in some cases, for example when the computing capability of the devices 101, 102, and 103 is insufficient, or when it is preferable to execute the special effect video generation at the device 104 embodied as a server (for example, when a service provider offers a file shredding special effect service to users and generating the shredding special effect video requires substantial computing resources), the device 104 may perform the operations described above for the devices 101, 102, and 103 through a corresponding application installed on it and output the same results. In particular, when multiple terminal devices with different computing capabilities exist at the same time and the corresponding application determines that the device 104 has stronger computing capability and more spare computing resources, so that the special effect video can be generated more efficiently there, the device 104 may execute the above operations and thereby relieve the computing pressure on the devices 101, 102, and 103; accordingly, the special effect video generation apparatus may also be arranged in the device 104.
It should be understood that the number of networks and devices in fig. 1 is merely illustrative. There may be any number of networks and devices, as desired for an implementation.
Referring to fig. 2, fig. 2 is a flowchart of one embodiment of a special effect video generation method according to the present application, where the process 200 includes the following steps:
step 201, obtaining the current display content of the target file, and generating a bitmap of the current display content.
In this embodiment, the execution subject of the special effect video generation method (for example, the devices 101, 102, 103 embodied as terminals shown in fig. 1) acquires the current display content of a target file and generates a corresponding bitmap (also called a raster map or dot-matrix image) based on the current display content; a bitmap is an image represented by a pixel array.
When the target file is closed, the currently displayed content may be a display icon and a corresponding file name of the target file, or may be the displayed content of a specific page in the target file, for example, a first page after the target file is opened, a tab page of the target file specified by the user, or the like.
It should be noted that the current display content of the target file may be obtained by the execution subject directly from a local storage device; the local storage device may be a data storage module arranged in the execution subject, such as a hard disk, in which case the current display content of the target file can be read locally and quickly.
In addition, when the execution subject performs special effect video generation for another device, the current display content of the target file may be acquired from a non-local storage device (for example, the device 104 shown in fig. 1), and the non-local storage device may also be any other electronic device configured to store data, for example, some user terminals, in which case the execution subject may acquire the current display content of the required target file by sending an acquisition command to the electronic device.
In practice, in order to better reflect the current display content of the target file, the definition and size of the bitmap are preferably the same as those of the current display content, so that the subsequently generated special effect video corresponds to the current display content and presents a better effect.
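For illustration only (this is not part of the patent disclosure), the following Python sketch shows one way the current display content could be represented as a bitmap, i.e. a two-dimensional array of RGB pixel values. The snapshot_display_content function and the synthetic icon-like content it produces are assumptions made for the example.

```python
# Illustrative sketch: represent the current display content of a target file
# as a bitmap (a 2D array of RGB pixel values). The "content" here is a
# hypothetical stand-in: a white background with a dark square acting as an icon.
from typing import List, Tuple

Pixel = Tuple[int, int, int]       # (R, G, B)
Bitmap = List[List[Pixel]]         # rows of pixels

def snapshot_display_content(width: int, height: int) -> Bitmap:
    """Stand-in for capturing the target file's current display content."""
    bitmap: Bitmap = [[(255, 255, 255)] * width for _ in range(height)]
    for y in range(height // 4, 3 * height // 4):
        for x in range(width // 4, 3 * width // 4):
            bitmap[y][x] = (40, 40, 40)
    return bitmap

bitmap = snapshot_display_content(width=64, height=64)
print(len(bitmap), "rows x", len(bitmap[0]), "columns")
```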
Step 202, storing the values of the pixel points of the bitmap into a collision data set, and setting the size of the pixel points to change with time within a preset time period.
In this embodiment, after the bitmap is obtained in step 201, the pixel value of each pixel point in the bitmap is stored into a collision data set (crash data), where a pixel value comprises at least size information; each element of the collision data set records the size information and color information of one pixel point. After the pixel values are stored into the collision data set, the size information in each pixel value is correspondingly set to change with time within a preset time period.
The length of the preset time period generally corresponds to the deletion process of the target file; that is, the preset time period begins at the time when the pixel point is intended to start changing and ends at the time when the deletion of the target file is completed.
In some embodiments, the color information in the pixel value can be synchronously set to change along with the size in a preset time period, so as to further enrich the diversity of the special effect video.
Illustratively, the collision data set may be expressed as a matrix: each element of the matrix corresponds to the value unit of one pixel point, the value unit holds the color value of that pixel point, and the value units are arranged according to the x and y coordinates of the pixel points in the bitmap, thereby obtaining the collision data set.
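Continuing the sketch above, the collision data set can be illustrated as a flat collection of per-pixel entries recording coordinates, color, current size, and the time window of the size change. The CollisionEntry structure and its field names are illustrative assumptions, not terminology taken from the patent.

```python
# Illustrative sketch: build a collision data set from the bitmap of the
# previous sketch. Each entry records one pixel point's coordinates, color,
# size, and the preset time window in which its size changes.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class CollisionEntry:
    x: float
    y: float
    color: Tuple[int, int, int]
    size: float          # current size of the pixel point
    start_time: float    # when the size change starts (seconds)
    end_time: float      # when the size change ends (seconds)

def build_collision_set(bitmap, period: float = 2.0) -> List[CollisionEntry]:
    entries: List[CollisionEntry] = []
    for y, row in enumerate(bitmap):
        for x, color in enumerate(row):
            entries.append(CollisionEntry(x=float(x), y=float(y), color=color,
                                          size=1.0, start_time=0.0,
                                          end_time=period))
    return entries

collision_set = build_collision_set(bitmap)   # bitmap from the previous sketch
print(len(collision_set), "entries in the collision data set")
```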
Step 203, generating rendered images based on the collision data set, and generating the shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive.
In this embodiment, after the collision data set is obtained in step 202, because the size of each pixel point recorded in the collision data set changes with time within the preset time period, corresponding images are rendered from the pixel values recorded in the collision data set at different times. The obtained images are arranged according to their generation times to obtain an image sequence, and the shredding special effect video formed by the change in pixel point size is generated from this image sequence.
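Reusing the illustrative collision_set from the sketches above, the rendering and assembly step could look as follows: each sampled time yields one rendered frame, and frames ordered by generation time form the shredding effect sequence. The linear growth rule in size_at and the 0.1-second frame spacing are assumptions for the example; "rendering" is reduced to emitting draw commands.

```python
# Illustrative sketch: render one frame per sampled time from the collision
# data set, then order the frames by generation time to obtain the sequence
# from which the shredding special effect video is assembled.
def size_at(entry, t: float, growth_per_second: float = 0.5) -> float:
    """Pixel size grows linearly inside [start_time, end_time], then freezes."""
    clamped = max(entry.start_time, min(t, entry.end_time))
    return entry.size * (1.0 + growth_per_second * (clamped - entry.start_time))

def render_frame(collision_set, t: float):
    """A 'frame' here is just a list of (x, y, color, size) draw commands."""
    return [(e.x, e.y, e.color, size_at(e, t)) for e in collision_set]

frame_times = [i * 0.1 for i in range(21)]               # 0.0 s .. 2.0 s
frames = [render_frame(collision_set, t) for t in frame_times]
video = list(zip(frame_times, frames))                   # frames ordered by time
print(len(video), "frames in the shredding effect sequence")
```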
The special effect video generation method provided by this embodiment acquires the current display content of a target file, generates a bitmap of the current display content, stores the pixel values of the pixel points in the bitmap into a collision data set, sets the size of the pixel points to change with time within a preset time period, generates rendered images based on the collision data set, and generates a shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive. In this way, the shredding special effect is generated from the current display content of the target file, so that the shredding special effect video can be played when the target file is subsequently shredded, which helps the user follow the shredding state of the file, makes the file shredding process more interactive, and improves the user experience.
In some embodiments, storing the pixel values of the pixel points in the bitmap into the collision data set comprises: splitting the pixel points in the bitmap into a preset number of sub-pixel points; and storing the pixel values of the respective sub-pixel points into the collision data set.
Specifically, when the pixel points in the bitmap are stored into the collision data set, each pixel point may first be split into a preset number of sub-pixel points. The shape of each sub-pixel point may be a circle corresponding to the pixel point, or may be set to a diamond, a triangle, or the like according to actual requirements. The pixel values of all sub-pixel points are then stored into the collision data set separately, which makes the granularity of the shredding special effect in the special effect video finer.
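Continuing the same sketch, splitting each pixel point into a preset number of sub-pixel points before storage could be illustrated as below; the 2x2 grid and the way the offsets and sub-sizes are derived are assumptions, and the sub-pixel shape would only matter at render time.

```python
# Illustrative sketch: split each pixel point into grid*grid sub-pixel points
# so the shredding effect has a finer granularity.
def split_pixel(entry, grid: int = 2):
    sub_entries = []
    sub_size = entry.size / grid
    for dy in range(grid):
        for dx in range(grid):
            sub_entries.append(CollisionEntry(
                x=entry.x + dx * sub_size, y=entry.y + dy * sub_size,
                color=entry.color, size=sub_size,
                start_time=entry.start_time, end_time=entry.end_time))
    return sub_entries

fine_set = [sub for e in collision_set for sub in split_pixel(e)]
print(len(fine_set), "sub-pixel entries")
```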
In some embodiments, setting the size of the pixel points to change with time within a preset time period includes: in response to the pixel value of at least one historical pixel point already existing in the collision data set when a pixel value is stored, taking the size change ending time of the historical pixel point as the size change starting time of the stored pixel value.
Specifically, when the pixel value of a pixel point is stored into the collision data set and the collision data set already contains the pixel value of a previously stored historical pixel point, the time at which the size of that historical pixel value finishes changing is obtained, and this ending time is used as the starting time of the size change of the pixel value stored this time. The size change of the newly stored pixel value thus continues from the moment the size change of the historical pixel point ends, producing a continuous dynamic effect.
In some embodiments, taking the size change ending time of the historical pixel point as the size change starting time of the pixel value includes: taking, as the size change starting time of the pixel value, the change ending time of the historical pixel point whose size change ending time is closest to the time at which the pixel value is stored.
Specifically, the time at which the pixel value is stored into the collision data set is compared with the size change ending times of the historical pixel values, and the ending time closest to that storage time is taken as the starting time of the size change of the currently stored pixel value, which further improves the continuity between the parts of the effect.
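The chaining of start times can be sketched as follows, again on the illustrative CollisionEntry structure; picking the historical entry whose end time is closest to the storage moment is one reading of this embodiment.

```python
# Illustrative sketch: when a new pixel value is stored, start its size change
# at the end time of the historical entry whose size change ends closest to
# the storage moment, so successive parts of the effect animate back to back.
def chained_start_time(existing_entries, store_time: float) -> float:
    if not existing_entries:
        return store_time
    closest = min(existing_entries, key=lambda e: abs(e.end_time - store_time))
    return closest.end_time

start = chained_start_time(collision_set, store_time=1.3)
new_entry = CollisionEntry(x=0.0, y=0.0, color=(0, 0, 0), size=1.0,
                           start_time=start, end_time=start + 2.0)
print("new entry animates from", new_entry.start_time, "to", new_entry.end_time)
```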
In some embodiments, the method for generating a special effect video further includes: and updating the preset time period in response to the fact that the number of the pixel points which change in size in the same preset time period in the collision data set meets the preset requirement.
Specifically, when the number of pixel points in the collision data set whose size is set to change within the same preset time period meets a preset requirement, the preset time period is updated, and the sizes of subsequent pixel points are set to change within the updated preset time period. The pixel point sizes are thus set in a staggered manner, which allows more varied special effects to be formed.
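One possible reading of this embodiment, sketched on the same illustrative data: once a preset number of entries share one preset time period, later entries are assigned the next period, so the size changes happen in staggered batches. The threshold and period length are assumed values.

```python
# Illustrative sketch: assign entries to successive preset time periods once
# the number of entries changing size in the current period meets a threshold.
def assign_periods(entries, period_len: float = 2.0, max_per_period: int = 1000):
    period_start, count = 0.0, 0
    for entry in entries:
        if count >= max_per_period:        # preset requirement met
            period_start += period_len     # update the preset time period
            count = 0
        entry.start_time = period_start
        entry.end_time = period_start + period_len
        count += 1

assign_periods(collision_set)
```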
In some embodiments, storing a value of a pixel point in the bitmap into a collision data set, and setting a size of the pixel point to change with time within a preset time period, includes: storing the pixel value of the pixel point in the bitmap into a collision data set, and acquiring the size of the pixel point; and determining the change speed according to the size, and setting the size of the pixel point in a preset time period to change along with time according to the change speed.
Specifically, when the pixel value of a pixel point in the bitmap is stored into the collision data set, the size of the pixel point is obtained, a corresponding change speed is determined according to that size, and the size of the pixel point is set to change with time at that speed within the preset time period. A special effect adapted to the resolution of the current display content is thereby generated, improving the visual effect.
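A size-dependent change speed could be illustrated as below; the inverse-proportional rule is purely an assumption, since the text only states that the speed is determined according to the size.

```python
# Illustrative sketch: derive the change speed from the pixel point's size so
# that content of different resolutions animates at a comparable visual pace.
def change_speed(pixel_size: float, base_rate: float = 0.5) -> float:
    return base_rate / max(pixel_size, 1e-6)   # assumed rule: smaller pixels grow faster

for entry in collision_set:
    speed = change_speed(entry.size)
    # the entry's size would then be animated at this speed over its preset period
```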
Further, on the basis of any of the above embodiments, in order to make full use of the special effect video obtained by the special effect video generation method, the special effect video can be used in a file shredding scenario according to the following step:
in response to receiving a shredding instruction sent for a target file, performing a file shredding operation on the target file and playing the shredding special effect video corresponding to the target file; wherein the shredding special effect video is generated by the special effect video generation method provided in fig. 2.
On this basis, referring to fig. 3, a flow 300 of an embodiment of a special effect video playing method is shown, which specifically includes:
Step 301, in response to receiving a shredding instruction sent for a target file, performing a file shredding operation on the target file.
In this embodiment, a device on which the target file is actually installed generally serves as the execution subject of the special effect video playing method (for example, the devices 101, 102, and 103 embodied as terminals shown in fig. 1). After receiving a shredding instruction sent for the target file, the execution subject may retrieve a locally pre-generated shredding special effect video and play it, or may obtain the shredding special effect video of the target file by sending an acquisition command to another device that generates shredding special effect videos.
Step 302, jumping to the display interface where the start identifier of the target file is located, and turning down the transparency of the start identifier.
In this embodiment, after execution of the shredding instruction sent for the target file begins, the interface of the device storing the target file jumps to the display interface where the start identifier of the target file is located, for example the desktop or application menu containing the start identifier, and the transparency of the start identifier in that display interface is reduced.
In some embodiments, if the shredding instruction sent for the target file explicitly specifies that the target of shredding is partial data in the target file, the start identifier may correspondingly be determined to be the start identifier used to call that partial data in the target file, and the interface correspondingly jumps to the interface of the target file in which that start identifier is recorded.
In some embodiments, in order to better present the display effect of the special effect video, the transparency of all the activation flags included in the display interface may be further turned down.
Step 303, playing the shredding special effect video corresponding to the target file on a lower-layer interface of the display interface where the start identifier of the target file is located.
In this embodiment, the shredding special effect video corresponding to the target file is played on a lower-layer interface of the display interface where the start identifier is located, so that the target file and the corresponding shredding special effect video are presented on the device at the same time.
The special effect video playing method provided by this embodiment can play the shredding special effect video corresponding to the target file synchronously while the target file is being deleted, which helps the user follow the shredding state of the file, makes the file shredding process more interactive, and improves the user experience.
In some embodiments, the method further comprises: updating the current display content according to the deletion progress of the target file; and obtaining an updated shredding special effect video based on the updated current display content; and playing the shredding special effect video corresponding to the target file comprises: playing the updated shredding special effect video.
Specifically, while the execution subject performs the shredding operation on the target file and plays the shredding special effect video, the current display content may be further updated according to the deletion progress of the target file. For example, when the shredding progress reaches 50%, the display content may be cropped to 50% of the original image to obtain the updated current display content, which is then fed back so that the execution subject of the special effect video generation method updates the shredding special effect video accordingly. The shredding progress is thereby better reflected in the shredding special effect video, producing a shredding special effect video with higher dynamic reference value.
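Continuing the sketch, updating the effect as deletion progresses could be illustrated as follows; the row-wise cropping rule follows the 50% example above and is otherwise an assumption.

```python
# Illustrative sketch: crop the display content to the fraction that remains,
# rebuild the collision data set, and re-render the effect frames from it.
def update_display_content(bitmap, progress: float):
    remaining_rows = max(1, int(len(bitmap) * (1.0 - progress)))
    return bitmap[:remaining_rows]

updated_bitmap = update_display_content(bitmap, progress=0.5)   # 50% deleted
updated_set = build_collision_set(updated_bitmap)
updated_frames = [render_frame(updated_set, t) for t in frame_times]
print(len(updated_frames), "frames in the updated shredding effect")
```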
To deepen understanding, the present application further provides a specific implementation in combination with a specific application scenario:
1) In this application scenario, the execution subject of the special effect video generation method acquires the current display content of the target file (a simple line drawing of a house), as shown in fig. 4-1, and generates a bitmap of the current display content.
2) The pixel values of the pixel points in the bitmap, which comprise at least size information, are stored into a collision data set, and the size of each pixel point is set to increase by a factor of 0.5 every 1 second within the preset time period (i.e., the x and y extents of the pixel point are enlarged by 0.5 times per second).
3) Rendered images are generated based on the collision data set, and the shredding special effect video of the target file is generated from a plurality of rendered images whose generation times are consecutive; a schematic rendered image used to generate the shredding special effect video is shown in fig. 4-2.
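Under one reading of "increased by a factor of 0.5 every 1 second" (linear growth of 50% of the original extent per second within the preset time period), the scenario can be sketched as:

```python
# Illustrative sketch of the example values above: a pixel point's extent grows
# by 0.5 times its original size for every elapsed second of the preset period.
def scenario_size(original_size: float, t: float, period: float = 3.0) -> float:
    t = min(t, period)                       # growth stops at the end of the period
    return original_size * (1.0 + 0.5 * t)

for t in (0.0, 1.0, 2.0, 3.0):
    print(t, "s ->", scenario_size(1.0, t))  # 1.0, 1.5, 2.0, 2.5
```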
In this application scenario, the execution subject of the special effect video generation method generates a corresponding bitmap from the current display content of the target file, stores the pixel values of the pixel points of the bitmap into the collision data set, and then controls the elements of the collision data set to change with time, i.e., the pixel points of the bitmap gradually change over time. A special effect in which the pixel points of the bitmap gradually expand is thereby produced, simulating the target file being gradually shredded, and the final shredding special effect video is obtained. Playing this shredding special effect video when the target file is subsequently shredded helps the user follow the shredding state of the file, makes the file shredding process more interactive, and improves the user experience.
Referring now to FIG. 5, a block diagram of a computer system 500 suitable for use in implementing a computing device (e.g., devices 101, 102, 103, 104 shown in FIG. 1) of an embodiment of the present application is shown. The computer device shown in fig. 5 is only an example, and should not bring any limitation to the function and the scope of use of the embodiments of the present application.
As shown in fig. 5, the computer system 500 includes a Central Processing Unit (CPU)501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input portion 506 including a keyboard, a mouse, and the like; an output portion 507 including a display such as a cathode ray tube (CRT) or a liquid crystal display (LCD), and a speaker; a storage portion 508 including a hard disk and the like; and a communication portion 509 including a network interface card such as a LAN card or a modem. The communication portion 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read out therefrom is installed into the storage portion 508 as necessary.
In particular, according to embodiments of the present application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program performs the above-described functions defined in the method of the present application when executed by the Central Processing Unit (CPU) 501.
It should be noted that the computer readable medium of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or electronic device. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor including a bitmap generation unit, a data storage unit, and a special effect video generation unit. The names of these units do not in certain cases constitute a limitation on the units themselves; for example, the bitmap generation unit may also be described as "a unit that acquires the current display content of a target file and generates a bitmap of the current display content". As another example, the processor may be described as including a special effect video playing unit; here too the name of the unit does not constitute a limitation on the unit itself, and the special effect video playing unit may also be described as "a unit that performs a file shredding operation on a target file in response to receiving a shredding instruction sent for the target file, and plays the shredding special effect video corresponding to the target file".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the computer device described in the above embodiments, or may exist separately without being assembled into the computer device. The computer-readable medium carries one or more programs which, when executed by the computer device, cause the computer device to: acquire the current display content of a target file, generate a bitmap of the current display content, store the pixel values of the pixel points in the bitmap into a collision data set, set the size of the pixel points to change with time within a preset time period, generate rendered images based on the collision data set, and generate a shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (11)

1. A special effects video generation method, comprising:
acquiring current display content of a target file, and generating a bitmap of the current display content;
storing the values of the pixel points of the bitmap into a collision data set, and setting the size of the pixel points to change along with the time within a preset time period; wherein, the pixel point value at least comprises size information;
and generating rendered images based on the collision data set, and generating a shredding special effect video of the target file from a plurality of rendered images whose generation times are consecutive.
2. The method of claim 1, wherein storing values of pixels in the bitmap into a collision dataset comprises:
splitting the pixel points in the bitmap into a plurality of sub-pixel points with a preset number;
and respectively storing the pixel values of the sub-pixels into the collision data set.
3. The method of claim 1, wherein the setting that the size of the pixel points changes with time within a preset time period comprises:
and in response to the fact that at least one pixel point dereferencing of the historical pixel point exists in the collision data set when the pixel point dereferencing is stored, taking the size change ending time of the historical pixel point as the size change starting time of the pixel point dereferencing.
4. The method of claim 3, wherein the taking the size change ending time of the historical pixel point as the size change starting time of the pixel point value comprises:
and taking the change ending time of the pixel point value of the size change ending time distance to the pixel point value stored in the history pixel point with the latest time as the size change starting time of the pixel point value.
5. The method of claim 1, further comprising:
and updating the preset time period in response to the fact that the number of the pixel points which change in size in the same preset time period in the collision data set meets the preset requirement.
6. The method of claim 1, wherein storing pixel values of pixels in the bitmap into a collision dataset and setting a size of the pixels to change with time within a preset time period comprises:
storing the values of the pixel points in the bitmap into a collision data set, and acquiring the size of the pixel points;
and determining the change speed according to the size, and setting the size of the pixel point to change with time within a preset time period according to the change speed.
7. A special effect video playing method comprises the following steps:
in response to receiving a shredding instruction sent for a target file, performing a file shredding operation on the target file, and playing a shredding special effect video corresponding to the target file; wherein the shredding special effect video is generated by the special effect video generation method of any one of claims 1-6.
8. The method of claim 7, wherein the playing the shredding special effect video corresponding to the target file comprises:
jumping to a display interface where the start identifier of the target file is located, and reducing the transparency of the start identifier;
and playing the shredding special effect video corresponding to the target file on a lower-layer interface of the display interface where the start identifier of the target file is located.
9. The method of claim 7, further comprising:
updating the current display content according to the deletion progress of the target file;
obtaining an updated shredding special effect video based on the updated current display content; and
the playing of the shredding special effect video corresponding to the target file comprises:
playing the updated shredding special effect video.
10. A computer device comprising:
one or more processors;
a storage device on which one or more programs are stored;
when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-6 and/or the method of any of claims 7-9.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1 to 6 and/or the method of any one of claims 7 to 9.
CN202110382066.1A 2021-04-09 2021-04-09 Special effect video generation method and equipment Active CN113111035B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110382066.1A CN113111035B (en) 2021-04-09 2021-04-09 Special effect video generation method and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110382066.1A CN113111035B (en) 2021-04-09 2021-04-09 Special effect video generation method and equipment

Publications (2)

Publication Number Publication Date
CN113111035A (en) 2021-07-13
CN113111035B (en) 2022-09-23

Family

ID=76715231

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110382066.1A Active CN113111035B (en) 2021-04-09 2021-04-09 Special effect video generation method and equipment

Country Status (1)

Country Link
CN (1) CN113111035B (en)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016187769A1 (en) * 2015-05-25 2016-12-01 武克易 Method and device for achieving multilayer smoke special effect in video
CN105913473A (en) * 2015-12-27 2016-08-31 乐视致新电子科技(天津)有限公司 Realization method and system of scrolling special efficacy
CN110213638B (en) * 2019-06-05 2021-10-08 北京达佳互联信息技术有限公司 Animation display method, device, terminal and storage medium
CN110084835B (en) * 2019-06-06 2020-08-21 北京字节跳动网络技术有限公司 Method and apparatus for processing video
CN111193876B (en) * 2020-01-08 2021-09-07 腾讯科技(深圳)有限公司 Method and device for adding special effect in video
CN111556363B (en) * 2020-05-21 2021-09-28 腾讯科技(深圳)有限公司 Video special effect processing method, device and equipment and computer readable storage medium
CN111935528B (en) * 2020-06-22 2022-12-16 北京百度网讯科技有限公司 Video generation method and device
CN112312161A (en) * 2020-06-29 2021-02-02 北京沃东天骏信息技术有限公司 Method and device for generating video, electronic equipment and readable storage medium
CN111833461B (en) * 2020-07-10 2022-07-01 北京字节跳动网络技术有限公司 Method and device for realizing special effect of image, electronic equipment and storage medium
CN112530021B (en) * 2020-12-24 2023-06-23 北京百度网讯科技有限公司 Method, apparatus, device and storage medium for processing data

Also Published As

Publication number Publication date
CN113111035A (en) 2021-07-13

Similar Documents

Publication Publication Date Title
CN110046021B (en) Page display method, device, system, equipment and storage medium
CN109460233B (en) Method, device, terminal equipment and medium for updating native interface display of page
CN110070496B (en) Method and device for generating image special effect and hardware device
CN110675465B (en) Method and apparatus for generating image
CN109671147B (en) Texture map generation method and device based on three-dimensional model
CN111225288A (en) Method and device for displaying subtitle information and electronic equipment
CN109168012B (en) Information processing method and device for terminal equipment
CN111258519B (en) Screen split implementation method, device, terminal and medium
CN111240769A (en) Page starting method, device, equipment and storage medium
CN109725970A (en) The method, apparatus and electronic equipment that applications client window is shown
CN110930325B (en) Image processing method and device based on artificial intelligence and storage medium
CN114863214A (en) Image generation model training method, image generation device, image generation medium, and image generation device
CN115495175A (en) Picture display method and device, terminal equipment and computer medium
CN110288523B (en) Image generation method and device
US11195248B2 (en) Method and apparatus for processing pixel data of a video frame
CN113111035B (en) Special effect video generation method and equipment
CN110647273B (en) Method, device, equipment and medium for self-defined typesetting and synthesizing long chart in application
CN115988255A (en) Special effect generation method and device, electronic equipment and storage medium
CN115576470A (en) Image processing method and apparatus, augmented reality system, and medium
CN114092362A (en) Panoramic picture loading method and device
CN116527993A (en) Video processing method, apparatus, electronic device, storage medium and program product
CN111159593A (en) Method and device for generating flow chart, storage medium and electronic equipment
CN109636724A (en) A kind of display methods of list interface, device, equipment and storage medium
CN112395826B (en) Text special effect processing method and device
WO2021018176A1 (en) Text special effect processing method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant