CN113838168A - Method for generating particle special effect animation - Google Patents

Method for generating particle special effect animation

Info

Publication number
CN113838168A
CN113838168A
Authority
CN
China
Prior art keywords
particle
time
particles
special effect
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111191688.2A
Other languages
Chinese (zh)
Other versions
CN113838168B (en)
Inventor
杨健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yeelion Online Network Technology Beijing Co Ltd
Original Assignee
Yeelion Online Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yeelion Online Network Technology Beijing Co Ltd filed Critical Yeelion Online Network Technology Beijing Co Ltd
Priority to CN202111191688.2A priority Critical patent/CN113838168B/en
Publication of CN113838168A publication Critical patent/CN113838168A/en
Application granted granted Critical
Publication of CN113838168B publication Critical patent/CN113838168B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation
    • G06T 13/20: 3D [Three Dimensional] animation
    • G06T 13/205: 3D [Three Dimensional] animation driven by audio data

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method for generating a particle special effect animation, comprising the following steps: acquiring the spectrum data of the audio currently being played, performing default-value difference fitting processing on the spectrum data, and generating the particle special effect based on the result of the default-value difference fitting processing. While audio is playing, the method presents a dazzling animation based on the particle special effect, enriching existing audio visualization technology by combining audio visualization with the particle special effect, which can enhance the user experience and increase the user's sense of participation.

Description

Method for generating particle special effect animation
Technical Field
The invention relates to the technical field of computers, in particular to a method for generating a particle special effect animation.
Background
When an existing player plays audio, the user usually pays little attention to the information displayed on the screen and instead focuses on the auditory perception of the audio quality during playback.
With the spread of networks and the popularity of portable devices, audio playback on many portable devices that emphasizes only audio quality tends to be homogeneous: it is difficult to express personality, which does not help attract and retain users. As a result, audio visualization technology has been receiving more and more attention.
A particle special effect is a kind of visual effect animation: a rendered effect is formed by simulating the movement trajectories of many particles, producing abstract visual effects such as fire, explosions, smoke, flowing water, sparks, falling leaves, clouds, fog, snow, dust, meteor trails, or glowing trails.
In the prior art, a mature technical scheme for combining audio visualization and particle special effects does not exist.
The information disclosed in this background section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosure of Invention
In view of the defects in the prior art, the invention aims to provide a method for generating a particle special effect animation that presents a dazzling particle-effect animation while audio is playing, enriches the existing audio visualization technology, combines audio visualization with the particle special effect, and can thereby enhance the user experience and increase the user's sense of participation.
In order to achieve the above purposes, the technical scheme adopted by the invention is as follows:
a method for generating a particle special effect animation is characterized by comprising the following steps:
acquiring the spectrum data of the audio currently being played,
performing default-value difference fitting processing on the spectrum data,
and generating the particle special effect based on the result of the default-value difference fitting processing.
On the basis of the above technical solution, when generating the particle special effect, a pattern is selected as a reference pattern, and the perimeter of the reference pattern is calculated and divided equally into n particle ejection points;
the spectrum data are divided by frequency, default-value difference fitting processing is performed separately on the spectrum data of the different frequencies,
and the spectrum data of the different frequencies correspond one-to-one to the particle ejection points, thereby controlling the ejection height at each particle ejection point.
On the basis of the above technical solution, the reference pattern is disc-shaped, and the reference pattern is filled with a cover pattern of the audio currently being played.
On the basis of the above technical solution, by default the perimeter is divided equally into 360 particle ejection points.
On the basis of the above technical solution, when generating the particle special effect, a spare particle pool and a currently running particle pool are created in advance;
the particles emitted from the particle ejection points belong to the currently running particle pool,
and after a particle has completed its corresponding ejection trajectory, it is assigned to the spare particle pool;
spectrum data are acquired by listening and, after denoising processing, are used as the data source for the ejection trajectories; the particles obtain their ejection trajectories from the corresponding data source.
On the basis of the above technical solution, the particles are classified, using velocity as the criterion, into at least: first-velocity particles and second-velocity particles, the first velocity being N times the second velocity;
a fixed movement time is set, defaulting to 600 ms;
for a first-velocity particle, the total movement time along its trajectory equals a random number within 500 ms plus the fixed movement time;
for a second-velocity particle, the total movement time along its trajectory equals a random number within 500 ms plus N times the fixed movement time.
On the basis of the above technical solution, a farthest distance is set, the farthest distance being the distance to the point farthest from the particle ejection point that a particle reaches;
each particle starts from its corresponding particle ejection point,
and a random offset within 15 degrees is added at the starting point to give each particle a different emission angle.
On the basis of the above technical solution, when a particle is generated at a particle ejection point, the particle generation time is recorded,
the total movement time of the particle along its trajectory is taken as its life time,
while the audio continues playing, the spectrum callback method is called in a loop, and the drawing of each particle along its trajectory is completed by the draw method of the canvas;
each time the draw method is called, the current system time is obtained, the particle generation time is subtracted from that current system time, and the moving distance of the current particle to be redrawn is calculated according to the following formula:
(current system time obtained when the draw method is called - particle generation time) / life time = moving distance of the current particle to be redrawn / farthest distance;
after the moving distance of the current particle to be redrawn is obtained, the x-axis and y-axis coordinates of the target point to which the current particle is to be redrawn are calculated from the x-axis and y-axis coordinates of the starting point using the sine and cosine of the triangle.
On the basis of the above technical solution, when a particle is drawn, according to how long the particle has existed, once that time is greater than or equal to 80% of the life duration, a fade time value lifeTime is randomly generated for the particle, and the gradual fade-to-transparent effect of the particle during its divergence is controlled by the fade time value lifeTime.
On the basis of the above technical solution, the control of the gradual fade-to-transparent effect of the particle during its divergence by the fade time value lifeTime is based on the following formula:
transparency alphaPercent = 1 - (time the particle has existed / life duration) * fade time value lifeTime
The transparency alphaPercent is used to change the alpha value of the current color.
The method for generating the particle special effect animation has the following beneficial effects:
while audio is playing, a dazzling animation based on the particle special effect is presented, enriching the existing audio visualization technology; by combining audio visualization with the particle special effect, the user experience can be enhanced and the user's sense of participation increased.
Drawings
The invention has the following drawings:
the drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
fig. 1 is a flowchart of a first embodiment of a method for generating a particle special effect animation according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings. The following detailed description describes exemplary embodiments of the invention by way of illustration only; various details of the embodiments are included to assist understanding and should be regarded as merely exemplary. Accordingly, those skilled in the art will appreciate that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the invention. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
As shown in fig. 1, the present invention provides a method for generating a particle special effect animation, which comprises the following steps:
acquiring the spectrum data of the audio currently being played,
performing default-value difference fitting processing on the spectrum data,
and generating the particle special effect based on the result of the default-value difference fitting processing.
The purpose of the default-value difference fitting processing is to clean the spectrum data, eliminating problems such as missing, abnormal, or redundant data, so that the execution efficiency of the particle special effect algorithm is not affected. According to actual needs, the default-value difference fitting processing can be replaced by data normalization processing or data validity detection processing.
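For illustration only (not part of the original disclosure), the following minimal Java sketch shows one way such data cleaning could be performed, assuming the spectrum arrives as a float[] in which missing samples are encoded as Float.NaN; the class name, the NaN encoding, and the linear-interpolation choice are assumptions.

// Minimal sketch (assumption): missing samples are encoded as Float.NaN and are
// replaced by a linearly interpolated default value; invalid negative amplitudes
// are clamped to zero so they cannot disturb the particle algorithm.
final class SpectrumCleaner {
    static float[] clean(float[] spectrum) {
        float[] out = spectrum.clone();
        for (int i = 0; i < out.length; i++) {
            if (Float.isNaN(out[i])) {
                int prev = i - 1;
                while (prev >= 0 && Float.isNaN(out[prev])) prev--;
                int next = i + 1;
                while (next < out.length && Float.isNaN(out[next])) next++;
                float left  = prev >= 0 ? out[prev] : 0f;
                float right = next < out.length ? out[next] : left;
                float t = (float) (i - prev) / (next - prev);
                out[i] = left + (right - left) * t;   // fitted default value
            }
            if (out[i] < 0f) out[i] = 0f;             // drop abnormal negative amplitudes
        }
        return out;
    }
}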
On the basis of the above technical solution, when generating the particle special effect, a pattern is selected as a reference pattern, and the perimeter of the reference pattern is calculated and divided equally into n particle ejection points;
the spectrum data are divided by frequency, default-value difference fitting processing is performed separately on the spectrum data of the different frequencies,
and the spectrum data of the different frequencies correspond one-to-one to the particle ejection points, thereby controlling the ejection height at each particle ejection point.
On the basis of the above technical solution, the reference pattern is disc-shaped, and the reference pattern is filled with a cover pattern of the audio currently being played.
As an alternative embodiment, the cover pattern is the cover image of a CD record, an image of the singer's portrait, or a cover image customized by the user.
On the basis of the above technical solution, by default the perimeter is divided equally into 360 particle ejection points.
When the reference pattern is disc-shaped, particle ejection points are set from 0 to 360 degrees along the circumferential contour of the reference pattern, and the particles are driven to move along their ejection trajectories.
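For illustration only, the following Java sketch shows one way the ejection points could be laid out along the disc contour and how a per-band amplitude could drive each point's ejection height; the class names, the maxHeight parameter, and the assumption that band amplitudes are normalised to 0..1 are hypothetical.

// Sketch (assumption): one ejection point per degree on the disc contour; the
// height a point ejects to is scaled each frame by the amplitude of the
// frequency band mapped one-to-one to that point.
final class EjectionRing {
    static final class EjectionPoint {
        final float x, y;        // start position on the disc contour
        final float angleDeg;    // outward emission direction
        float ejectionHeight;    // updated from the spectrum every frame
        EjectionPoint(float x, float y, float angleDeg) {
            this.x = x; this.y = y; this.angleDeg = angleDeg;
        }
    }

    static EjectionPoint[] build(float centerX, float centerY, float radius, int n) {
        EjectionPoint[] points = new EjectionPoint[n];   // n = 360 by default
        for (int i = 0; i < n; i++) {
            double rad = Math.toRadians(360.0 * i / n);
            points[i] = new EjectionPoint(
                    centerX + (float) (radius * Math.cos(rad)),
                    centerY + (float) (radius * Math.sin(rad)),
                    360f * i / n);
        }
        return points;
    }

    static void updateHeights(EjectionPoint[] points, float[] bandAmplitudes, float maxHeight) {
        for (int i = 0; i < points.length; i++) {
            // one frequency band per ejection point, assumed normalised to 0..1
            points[i].ejectionHeight = bandAmplitudes[i % bandAmplitudes.length] * maxHeight;
        }
    }
}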
On the basis of the above technical solution, when generating the particle special effect, a spare particle pool and a currently running particle pool are created in advance;
the particles emitted from the particle ejection points belong to the currently running particle pool,
and after a particle has completed its corresponding ejection trajectory, it is assigned to the spare particle pool;
spectrum data are acquired by listening and, after denoising processing, are used as the data source for the ejection trajectories; the particles obtain their ejection trajectories from the corresponding data source.
As an alternative embodiment, by default at least 8 particles per particle ejection point are assigned to the currently running particle pool.
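For illustration only, a minimal Java sketch of the spare and currently running particle pools described above, pre-allocating 8 particles per ejection point; the Particle stub, the collection types, and the method names are assumptions, not the patented implementation.

// Sketch (assumption): a spare pool and a currently running pool; particles are
// recycled once their trajectory is finished rather than re-allocated.
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Iterator;

final class ParticlePools {
    static final class Particle {
        long generationTime;
        long lifeTime;
        boolean isFinished(long now) { return now - generationTime >= lifeTime; }
    }

    final ArrayDeque<Particle> spare = new ArrayDeque<>();
    final ArrayList<Particle> running = new ArrayList<>();

    ParticlePools(int ejectionPoints, int particlesPerPoint) {
        for (int i = 0; i < ejectionPoints * particlesPerPoint; i++) {
            spare.add(new Particle());            // e.g. 360 * 8 pre-allocated particles
        }
    }

    Particle emit(long now, long lifeTime) {
        Particle p = spare.isEmpty() ? new Particle() : spare.poll();
        p.generationTime = now;
        p.lifeTime = lifeTime;
        running.add(p);                           // belongs to the currently running pool
        return p;
    }

    void recycleFinished(long now) {
        for (Iterator<Particle> it = running.iterator(); it.hasNext(); ) {
            Particle p = it.next();
            if (p.isFinished(now)) {              // ejection trajectory completed
                it.remove();
                spare.add(p);                     // back to the spare pool
            }
        }
    }
}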
As an alternative embodiment, after the spectrum data are acquired, samples are taken at a fixed step value, and the total number of particles is then obtained for every group of 512 sample point values. Whether the total number of particles meets the requirement for generating the particle special effect is judged from the number of particles corresponding to each particle ejection point.
As one alternative embodiment, the IjkMediaPlayer class in the ijkplayer player is called to create a listener; IjkMediaPlayer is a structure in the ijkplayer player, and ijkplayer is Bilibili's open-source video player;
spectrum data are acquired and returned via the listener;
the denoising processing specifically comprises: removing, from the spectrum data, the frequencies that the human ear cannot perceive, and obtaining the forward spectrum amplitude of the remaining spectrum data;
the resulting forward spectrum amplitude data are used as the data source from which the particles are given their ejection trajectories.
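For illustration only, the denoising step could be sketched in Java as follows, assuming the listener delivers one magnitude per frequency bin and that the frequencies the human ear cannot perceive are those outside roughly 20 Hz to 20 kHz; the class and method names, the bin-width parameter, and the exact cut-off values are assumptions.

// Sketch (assumption): fft[] holds one magnitude per frequency bin, binHz is the
// bin width in Hz; bins outside a nominal 20 Hz .. 20 kHz audible range are
// discarded and the remaining values are returned as non-negative ("forward")
// amplitudes.
final class SpectrumDenoiser {
    static float[] denoise(float[] fft, float binHz) {
        int lo = (int) Math.ceil(20f / binHz);
        int hi = Math.min(fft.length - 1, (int) Math.floor(20000f / binHz));
        if (hi < lo) return new float[0];
        float[] out = new float[hi - lo + 1];
        for (int i = lo; i <= hi; i++) {
            out[i - lo] = Math.abs(fft[i]);       // keep only the positive amplitude
        }
        return out;
    }
}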
On the basis of the above technical solution, the particles are classified, using velocity as the criterion, into at least first-velocity particles and second-velocity particles, for example:
the first velocity is N times the second velocity (N is greater than or equal to 2); the first-velocity particles correspond to fast-moving particles, and the second-velocity particles correspond to slow-moving particles.
As an alternative embodiment, a fixed movement time is set, defaulting to 600 ms;
for a first-velocity particle, the total movement time along its trajectory equals a random number within 500 ms plus the fixed movement time;
for a second-velocity particle, the total movement time along its trajectory equals a random number within 500 ms plus N times the fixed movement time.
For example, if the first velocity is 2 times the second velocity:
for fast-moving particles, the total movement time along the trajectory is a random number within 500 ms + 600 ms;
for slow-moving particles, the total movement time along the trajectory is a random number within 500 ms + 2 × 600 ms.
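For illustration only, the timing rule above could be expressed in Java as the following sketch, using the N = 2 example; the class and method names are hypothetical.

// Sketch: fixed movement time of 600 ms and N = 2, matching the example above.
import java.util.Random;

final class ParticleTiming {
    static final int FIXED_MOVE_TIME_MS = 600;
    static final int N = 2;
    static final Random RANDOM = new Random();

    static long fastParticleTotalTime() {                  // first-velocity (fast) particle
        return RANDOM.nextInt(500) + FIXED_MOVE_TIME_MS;   // random within 500 ms + 600 ms
    }

    static long slowParticleTotalTime() {                  // second-velocity (slow) particle
        return RANDOM.nextInt(500) + (long) N * FIXED_MOVE_TIME_MS;  // random within 500 ms + 2 * 600 ms
    }
}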
On the basis of the above technical solution, a farthest distance is set, the farthest distance being the distance to the point farthest from the particle ejection point that a particle reaches;
each particle starts from its corresponding particle ejection point,
and a random offset within 15 degrees is added at the starting point to give each particle a different emission angle.
On the basis of the above technical solution, when a particle is generated at a particle ejection point, the particle generation time is recorded; for example, the current system time System.currentTimeMillis() is recorded as the particle generation time,
the total movement time of the particle along its trajectory is taken as its life time,
while the audio continues playing, the spectrum callback method is called in a loop, and the drawing of each particle along its trajectory is completed by the draw method of the canvas;
each time the draw method is called, the current system time is obtained, the particle generation time is subtracted from that current system time, and the moving distance of the current particle to be redrawn is calculated according to the following formula:
(current system time obtained when the draw method is called - particle generation time) / life time = moving distance of the current particle to be redrawn / farthest distance;
after the moving distance of the current particle to be redrawn is obtained, the x-axis and y-axis coordinates of the target point to which the current particle is to be redrawn are calculated from the x-axis and y-axis coordinates of the starting point using the sine and cosine of the triangle.
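For illustration only, the per-frame redraw computation described above could look like the following Java sketch; the parameter names, the android.graphics.Canvas usage, and drawing each particle as a small circle are assumptions based on the draw/canvas references in the text.

// Sketch (assumption): per-frame redraw of one particle. The progress ratio
//   (now - generationTime) / lifeTime == movedDistance / farthestDistance
// gives the distance moved, and sine/cosine of the emission angle give the
// target x/y to redraw at.
import android.graphics.Canvas;
import android.graphics.Paint;

final class ParticleDrawer {
    static void drawParticle(Canvas canvas, Paint paint, float startX, float startY,
                             float angleDeg, float farthestDistance,
                             long generationTime, long lifeTime, float radiusPx) {
        long now = System.currentTimeMillis();
        float progress = Math.min(1f, (now - generationTime) / (float) lifeTime);
        float moved = progress * farthestDistance;           // movedDistance / farthestDistance == progress
        double rad = Math.toRadians(angleDeg);
        float x = startX + (float) (moved * Math.cos(rad));  // x of the target point
        float y = startY + (float) (moved * Math.sin(rad));  // y of the target point
        canvas.drawCircle(x, y, radiusPx, paint);
    }
}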
On the basis of the above technical solution, when a particle is drawn, according to how long the particle has existed, once that time is greater than or equal to 80% of the life duration, a fade time value lifeTime is randomly generated for the particle, and the gradual fade-to-transparent effect of the particle during its divergence is controlled by the fade time value lifeTime.
On the basis of the above technical solution, the control of the gradual fade-to-transparent effect of the particle during its divergence by the fade time value lifeTime is based on the following formula:
transparency alphaPercent = 1 - (time the particle has existed / life duration) * fade time value lifeTime
The transparency alphaPercent is used to change the alpha value of the current color, for example, based on the following instruction:
int alpha = (int) (Color.alpha(drawColor) * alphaPercent);
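For illustration only, the fade-out could be applied as in the following Java sketch, which combines the reconstructed alphaPercent formula above with the instruction just shown; because the original formula is garbled, treating fadeLifeTime as a dimensionless fade factor is an assumption, as are the class, method, and parameter names.

// Sketch (assumption): once a particle has existed for >= 80% of its life
// duration, alphaPercent is computed along the lines of the formula above
// (fadeLifeTime taken here as a dimensionless fade factor) and applied to the
// draw colour via the standard Android Color/Paint API.
import android.graphics.Color;
import android.graphics.Paint;

final class ParticleFade {
    static void applyFade(Paint paint, int drawColor,
                          long existedMs, long lifeDurationMs, float fadeLifeTime) {
        float alphaPercent = 1f;
        if (existedMs >= 0.8f * lifeDurationMs) {
            // alphaPercent = 1 - (existed time / life duration) * fade time value
            alphaPercent = Math.max(0f, 1f - ((float) existedMs / lifeDurationMs) * fadeLifeTime);
        }
        int alpha = (int) (Color.alpha(drawColor) * alphaPercent);  // as in the instruction above
        paint.setColor(drawColor);
        paint.setAlpha(alpha);
    }
}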
Those not described in detail in this specification are within the skill of the art.
The above description is only a preferred embodiment of the present invention, and the scope of the present invention is not limited to the above embodiment, but equivalent modifications or changes made by those skilled in the art according to the present disclosure should be included in the scope of the present invention as set forth in the appended claims.

Claims (10)

1. A method for generating a particle special effect animation is characterized by comprising the following steps:
acquiring the spectrum data of the audio currently being played,
performing default-value difference fitting processing on the spectrum data,
and generating the particle special effect based on the result of the default-value difference fitting processing.
2. The method according to claim 1, wherein when generating the particle special effect, a pattern is selected as a reference pattern, a circumference of the reference pattern is calculated and divided equally into n particle ejection points,
the spectrum data are divided by frequency, default-value difference fitting processing is performed separately on the spectrum data of the different frequencies,
and the spectrum data of the different frequencies correspond one-to-one to the particle ejection points, thereby controlling the ejection height at each particle ejection point.
3. The method of claim 2, wherein the reference pattern is disc-shaped, and the reference pattern is filled with a cover pattern of the audio currently being played.
4. The method of claim 2, wherein by default the perimeter is divided equally into 360 particle ejection points.
5. The method according to claim 2, wherein when generating the particle special effect, a spare particle pool and a currently running particle pool are created in advance;
the particles emitted from the particle ejection points belong to the currently running particle pool,
and after a particle has completed its corresponding ejection trajectory, it is assigned to the spare particle pool;
spectrum data are acquired by listening and, after denoising processing, are used as the data source for the ejection trajectories; the particles obtain their ejection trajectories from the corresponding data source.
6. The method according to claim 2, wherein the particles are classified, using velocity as the criterion, into at least: first-velocity particles and second-velocity particles, the first velocity being N times the second velocity;
a fixed movement time is set, defaulting to 600 ms;
for a first-velocity particle, the total movement time along its trajectory equals a random number within 500 ms plus the fixed movement time;
for a second-velocity particle, the total movement time along its trajectory equals a random number within 500 ms plus N times the fixed movement time.
7. The method according to claim 2, wherein a farthest distance is set, the farthest distance being the distance to the point farthest from the particle ejection point that a particle reaches;
each particle starts from its corresponding particle ejection point,
and a random offset within 15 degrees is added at the starting point to give each particle a different emission angle.
8. The method of claim 6, wherein when a particle is generated at a particle ejection point, the particle generation time is recorded,
the total movement time of the particle along its trajectory is taken as its life time,
while the audio continues playing, the spectrum callback method is called in a loop, and the drawing of each particle along its trajectory is completed by the draw method of the canvas;
each time the draw method is called, the current system time is obtained, the particle generation time is subtracted from that current system time, and the moving distance of the current particle to be redrawn is calculated according to the following formula:
(current system time obtained when the draw method is called - particle generation time) / life time = moving distance of the current particle to be redrawn / farthest distance;
after the moving distance of the current particle to be redrawn is obtained, the x-axis and y-axis coordinates of the target point to which the current particle is to be redrawn are calculated from the x-axis and y-axis coordinates of the starting point using the sine and cosine of the triangle.
9. The method as claimed in claim 8, wherein when a particle is drawn, according to how long the particle has existed, once that time is greater than or equal to 80% of the lifeTime (life duration), a fade time value lifeTime is randomly generated for the particle, and the gradual fade-to-transparent effect of the particle during its divergence is controlled by the fade time value lifeTime.
10. The method of claim 9, wherein the control of the gradual fade-to-transparent effect of the particle during its divergence by the fade time value lifeTime is based on the following formula:
transparency alphaPercent = 1 - (time the particle has existed / life duration) * fade time value lifeTime
The transparency alphaPercent is used to change the alpha value of the current color.
CN202111191688.2A 2021-10-13 2021-10-13 Particle special effect animation generation method Active CN113838168B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111191688.2A CN113838168B (en) 2021-10-13 2021-10-13 Particle special effect animation generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111191688.2A CN113838168B (en) 2021-10-13 2021-10-13 Particle special effect animation generation method

Publications (2)

Publication Number Publication Date
CN113838168A true CN113838168A (en) 2021-12-24
CN113838168B CN113838168B (en) 2023-10-03

Family

ID=78968696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111191688.2A Active CN113838168B (en) 2021-10-13 2021-10-13 Particle special effect animation generation method

Country Status (1)

Country Link
CN (1) CN113838168B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114880055A (en) * 2022-04-08 2022-08-09 亿玛创新网络(天津)有限公司 Bullet screen special effect realization method and device, electronic equipment and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107195310A (en) * 2017-03-05 2017-09-22 杭州趣维科技有限公司 A kind of method for processing video frequency of sound driver particle effect
CN107329980A (en) * 2017-05-31 2017-11-07 福建星网视易信息系统有限公司 A kind of real-time linkage display methods and storage device based on audio
CN107894885A (en) * 2017-11-17 2018-04-10 北京酷我科技有限公司 A kind of square frequency spectrum generation method of audio
CN109166166A (en) * 2018-09-06 2019-01-08 北京酷我科技有限公司 A kind of implementation method of diffusion particle animation
CN111326169A (en) * 2018-12-17 2020-06-23 中国移动通信集团北京有限公司 Voice quality evaluation method and device

Also Published As

Publication number Publication date
CN113838168B (en) 2023-10-03

Similar Documents

Publication Publication Date Title
US10380798B2 (en) Projectile object rendering for a virtual reality spectator
US11206339B2 (en) Spectator view into an interactive gaming world showcased in a live event held in a real-world venue
US10300361B2 (en) Ball game training
CN1163835C (en) Video game system and storage medium for storing program for use in video game system
JP2020121102A (en) Method and system for generating recording of game play of video game
US20190102941A1 (en) Venue mapping for virtual reality spectating of electronic sports
EP3465331A1 (en) Mixed reality system
US20170011554A1 (en) Systems and methods for dynamic spectating
US11783007B2 (en) Apparatus and method for generating a recording
US9744459B2 (en) Computer-readable storage medium storing information processing program, information processing device, information processing system, and information processing method
US20230001297A1 (en) Systems and Methods for Controlling Camera Perspectives, Movements, and Displays of Video Game Gameplay
CN113838168A (en) Method for generating particle special effect animation
US20130123962A1 (en) Computer-readable storage medium storing information processing program, information processing device, information processing system, and information processing method
JP2017119033A (en) Game device and program
JP6802393B2 (en) Foveal rendering optimization, delayed lighting optimization, foveal adaptation of particles, and simulation model
CN106648107A (en) VR scene control method and apparatus
EP3037918A1 (en) System and method for localizing haptic effects on a body
CN103294202A (en) 5D (Five Dimensional) presence effect desktop simulation system
US20190204917A1 (en) Intuitive haptic design
US20140342344A1 (en) Apparatus and method for sensory-type learning
CN114257849A (en) Barrage playing method, related equipment and storage medium
EP2108417B1 (en) Audio apparatus and method
US9448762B2 (en) Precognitive interactive music system
CN110442239B (en) Pear game virtual reality reproduction method based on motion capture technology
CN104315650A (en) Early warning method and early warning device for relative humidity and relative humidity early warning air conditioner

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant