CN106131567B - Ultraviolet aurora up-conversion method of video frame rate based on Lattice Boltzmann - Google Patents
- Publication number: CN106131567B (application CN201610517071A)
- Authority: CN (China)
- Legal status: Expired - Fee Related
Classifications
- H04N19/513 — Processing of motion vectors (predictive coding involving temporal prediction; motion estimation or motion compensation)
- H04N19/577 — Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
- H04N19/587 — Predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
- H04N21/234381 — Reformatting of video signals for distribution by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
- H04N21/440281 — Reformatting of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
Abstract
The invention discloses an ultraviolet aurora video frame-rate up-conversion method based on the Lattice Boltzmann method, which mainly solves the problem that existing frame-rate up-conversion methods are unsuitable for aurora video with non-rigid motion. It is realized as follows: 1) select two adjacent frames of the preprocessed aurora video; 2) compute the gray-level difference between the two frames; 3) compute from the gray-level difference the external force that drives the particles of the two frames to flow; 4) iteratively compute the particle displacement fields of the two frames under this external force, and record the iteration count n at which the displacement fields stop changing; 5) with a reduced iteration count derived from n, recompute the particle displacement field of each frame, and move the particles of each frame according to its recomputed displacement field to obtain two new images; 6) fuse the two new images to obtain the interpolated frame, completing the frame-rate up-conversion of the ultraviolet aurora video. The invention improves the temporal resolution of ultraviolet aurora video and can be used for frame-rate up-conversion of video with non-rigid motion.
Description
Technical field
The invention belongs to the technical field of video processing and relates to a video frame-rate up-conversion method, which can be used to improve the temporal resolution of ultraviolet aurora video.
Background art
Aurora are a luminous atmospheric phenomenon produced when high-energy particles precipitating from the magnetosphere into the upper atmosphere collide with and excite neutral atmospheric components. Aurora mainly appear in an annular region centered on the geomagnetic pole, called the auroral oval. The equatorward and poleward boundaries of the auroral oval are important geophysical parameters closely related to the solar wind and geomagnetic activity. Because the boundaries of the auroral oval change with geomagnetic activity, studying them helps to further understand solar-terrestrial coupling processes and the laws of space climate change. In addition, certain radio waves emitted when aurora occur can strongly affect terrestrial radio communication, navigation, positioning and power transmission; moreover, an auroral event can inject enormous energy into the Earth's atmosphere, almost comparable to the total generating capacity of all the world's power plants. In-depth study of aurora is therefore very important. At present, however, research on ultraviolet aurora is still confined to single-frame images. Research based on a single image considers only the spatial information of the image while ignoring its temporal information, and is also relatively time-consuming. Processing aurora video will therefore become a new trend in future ultraviolet aurora research. However, when the ultraviolet imager carried by the Polar satellite shoots aurora images, the time interval between frames is long, about 3 min, so the temporal resolution of the resulting aurora video is low and some temporal information is lost. To improve the temporal resolution of aurora video, frame-rate up-conversion must be performed, i.e., new image frames must be interpolated into the original aurora sequence.
Existing video frame-rate up-conversion algorithms can be roughly divided into two classes. The first class does not consider the motion of the target, but directly interpolates the intermediate frame from various combinations of the preceding and following frames, such as frame repetition and temporal linear/nonlinear interpolation. These methods give good results when there is no motion in the scene, but when the scene motion is large the interpolation quality is poor, easily producing motion judder and blurred edges of moving objects. The second class uses motion information for interpolation, closely combining motion estimation with interpolation-filter design, such as the weighted adaptive motion-compensated interpolation algorithm proposed by Lee et al., the overlapped-block motion-compensated interpolation algorithm OBMC proposed by Hilman K. et al., the frame-rate up-conversion algorithm of Vinh T. Q. et al., which performs motion estimation by combining spatio-temporal motion-vector smoothing with background/foreground segmentation, and the image-fusion-based frame-rate up-conversion algorithm of Lee W. H. et al. These algorithms overcome the shortcomings of the first class to some extent, but most of them assume that objects in the video move linearly at constant velocity, i.e., rigid motion, so they are unsuitable for frame-rate up-conversion (frame interpolation) of aurora video. In addition, these algorithms use block-matching for motion estimation, so the resulting interpolated images suffer from severe blocking artifacts. To eliminate blocking artifacts, Shahram et al. proposed in 2015 the block-reconstruction-with-optical-flow frame-rate up-conversion method BROP. This method solves the blocking-artifact problem well, but because it estimates the motion field with an optical-flow method, it is still unsuitable for frame-rate up-conversion of non-rigid aurora video.
Summary of the invention
The object of the invention is to address the above shortcomings of the prior art by proposing an aurora video frame-rate up-conversion method based on the Lattice Boltzmann method, so as to improve the temporal resolution of aurora video with non-rigid motion.
The technical idea for realizing the object of the invention is as follows: the Lattice Boltzmann model (LBM) is used to simulate the particle dynamics of two adjacent frames of the aurora video so as to obtain the particle displacement fields of the two frames; the pixels of the two frames are migrated according to the obtained displacement fields to produce an interpolated image, thereby performing frame-rate up-conversion of the aurora video. The implementation steps are as follows:
(1) From the video images captured by the ultraviolet aurora imager carried by the Polar satellite, select ultraviolet aurora images that are similar in form and continuous in time to construct the aurora video I = {I1, I2, ..., Im, ..., Ik}, where Im is the m-th image in the aurora video, m = 1, 2, ..., k, and k is the number of images in the aurora video;
(2) Successively apply negative-value clearing, bright-spot smoothing, masking and denoising to the aurora video I to obtain the preprocessed aurora video S = {S1, S2, ..., Sm, ..., Sk}, where Sm is the m-th image of the preprocessed aurora video;
(3) Select two adjacent frames Si and Si+1 of the preprocessed aurora video S and compute the gray-level difference between them, F = Si − Si+1, where i = 1, 2, ..., k−1;
(4) Regard each pixel of Si and Si+1 as a fluid particle and, from the gray-level difference F, compute the external force G that drives the particles of Si to flow toward Si+1;
(5) Use the Lattice Boltzmann model (LBM) to simulate the dynamic process of the particles of Si flowing toward Si+1 under the driving of the external force G, and compute by iteration the particle displacement field of this flow;
(6) When the particle displacement field of the flow from Si to Si+1 no longer changes, record the iteration count n at that moment;
(7) Taking a reduced iteration count derived from n as the new number of iterations, recompute the particle displacement field of the flow of the particles of Si toward Si+1, and migrate the particles of Si according to the recomputed displacement field, obtaining the new image Ci after particle migration of Si;
(8) Taking the same reduced iteration count, compute the particle displacement field of the flow of the particles of Si+1 toward Si, and migrate the particles of Si+1 according to this displacement field, obtaining the new image Ci+1 after particle migration of Si+1;
(9) Fuse Ci obtained in step (7) with Ci+1 obtained in step (8) to obtain the interpolated frame located between Si and Si+1.
The present invention has the following advantages:
1. The invention uses the Lattice Boltzmann model to simulate the dynamic evolution of auroral particles, iteratively computes the particle displacement field of adjacent frames of the aurora video, and obtains the interpolated image from the displacement field, thereby up-converting the frame rate of the aurora video. This overcomes the shortcoming of previous methods, which can only estimate the motion field of rigid-motion video and are therefore unsuitable for frame-rate up-conversion of aurora video;
2. The invention is the first to study ultraviolet aurora video using both the spatial and the temporal information of the ultraviolet aurora, overcoming the limitation of previous ultraviolet aurora research, which used only the spatial information and could process only single-frame images;
3. The aurora video frame-rate up-conversion method of the invention is simple to compute, has a small computational load, and is highly parallelizable.
Detailed description of the invention
Fig. 1 is the implementation flowchart of the invention;
Fig. 2 shows two adjacent frames of the aurora video to be processed in the experiments of the invention;
Fig. 3 shows the result of preprocessing the two adjacent ultraviolet aurora frames of the aurora video to be processed in the experiments of the invention;
Fig. 4 is a spatial schematic diagram of the two-dimensional nine-direction cell of the existing discrete Lattice Boltzmann model;
Fig. 5 shows the experimental result of computing the particle displacement field of the two adjacent frames in Fig. 2 with the invention;
Fig. 6 shows the experimental result of computing the motion field of the two adjacent frames in Fig. 2 with the existing OBMC method;
Fig. 7 shows the experimental result of computing the motion field of the two adjacent frames in Fig. 2 with the existing BROP method;
Fig. 8 shows the experimental result of frame interpolation of the two adjacent frames in Fig. 2 with the invention;
Fig. 9 shows the experimental result of frame interpolation of the two adjacent frames in Fig. 2 with the existing OBMC method;
Fig. 10 shows the experimental result of frame interpolation of the two adjacent frames in Fig. 2 with the existing BROP algorithm.
Specific embodiment
The contents and effects of the invention are further described below with reference to the drawings.
Referring to Fig. 1, the specific implementation steps of this example are as follows:
Step 1: Construct the aurora video I = {I1, I2, ..., Im, ..., Ik}.
The Polar satellite runs in an orbit about 840 kilometers above the ground that passes over both poles of the Earth. The ultraviolet imager carried on the Polar satellite captures massive numbers of ultraviolet aurora images at different moments as the satellite operates. This example selects ultraviolet aurora images that are similar in form and continuous in time from the images captured by the ultraviolet imager and arranges them in chronological order to form the aurora video I = {I1, I2, ..., Im, ..., Ik}, where Im is the m-th image in the aurora video, m = 1, 2, ..., k, and k is the number of images in the aurora video.
Step 2: Successively preprocess the aurora video I.
(2a) Negative-value clearing:
Because the ultraviolet aurora imager is subject to interference from its own operating conditions during shooting, the gray values of some pixels of the captured ultraviolet aurora images may be negative. These pixels would affect subsequent study, so negative-value clearing must be performed when studying ultraviolet aurora images. Specifically, find the pixels of the m-th image Im of the aurora video whose gray value is less than 0 and reassign their gray value to 0, obtaining a new image Qm, m = 1, 2, ..., k;
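As a minimal sketch, the negative-value clearing of step (2a) can be written as follows. Images are represented as plain nested lists purely for illustration; the patent's experiments used MATLAB, so this Python fragment is an assumption-free rephrasing of the rule "gray values below 0 become 0", not the original implementation.

```python
def clear_negatives(image):
    """Return a copy of `image` with all negative gray values reset to 0,
    as in preprocessing step (2a)."""
    return [[max(0, v) for v in row] for row in image]

I_m = [[-3, 10], [5, -1]]
Q_m = clear_negatives(I_m)
# Q_m == [[0, 10], [5, 0]]
```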
(2b) Bright-spot smoothing:
During shooting, the ultraviolet aurora imager is also subject to interference from ground features such as mountain ranges and forests, so the captured ultraviolet aurora images may contain small over-bright regions, called bright spots. Since these bright spots are not a true component of the aurora, they would interfere with the analysis of the aurora, and smoothing them is therefore necessary. Specifically: first compute the gray mean μ and gray standard deviation σ of the image Qm; find the connected regions formed by the pixels of Qm whose gray value exceeds the threshold Th = μ + 3σ; these connected regions are the bright spots; replace the gray value of the pixels within the bright spots with μ, obtaining the smoothed image Pm;
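The thresholding rule of step (2b) can be sketched as below. This is a simplification: the patent replaces pixels of connected bright-spot regions, whereas this sketch applies the Th = μ + 3σ threshold pixel-wise, omitting the connected-component analysis for brevity.

```python
import statistics

def smooth_bright_spots(image):
    """Replace pixels brighter than Th = mu + 3*sigma with the gray mean mu.
    (Simplified: pixel-wise, without the connected-region step.)"""
    flat = [v for row in image for v in row]
    mu = statistics.mean(flat)
    sigma = statistics.pstdev(flat)  # population standard deviation
    th = mu + 3 * sigma
    return [[mu if v > th else v for v in row] for row in image]
```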
(2c) Masking:
Ultraviolet aurora images contain many redundant pixels. To eliminate the influence of these pixels and highlight the effective pixels of the image when studying ultraviolet aurora images, the image Pm is cropped with an ellipse whose semi-major axis is 114 and whose semi-minor axis is 100: the gray values of the pixels of Pm inside the ellipse are retained, and the gray values of the pixels of Pm outside the ellipse are set to 0, yielding the masked image Ym;
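A sketch of the elliptical mask of step (2c) follows. The semi-axes a = 114 and b = 100 match the stated frame size of 228×200, but the patent does not reproduce the ellipse center here, so centering the ellipse on the image is an assumption of this sketch.

```python
def elliptical_mask(image, a=114, b=100):
    """Keep pixels inside the ellipse with semi-axes (a, b) centered on the
    image; zero the rest, as in preprocessing step (2c)."""
    h, w = len(image), len(image[0])
    cy, cx = (h - 1) / 2, (w - 1) / 2  # assumed center (not in the source)
    return [
        [
            v if ((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 <= 1 else 0
            for x, v in enumerate(row)
        ]
        for y, row in enumerate(image)
    ]
```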
(2d) Denoising:
Ultraviolet aurora images are noisy, which lowers their contrast and hinders study, so the ultraviolet aurora images must be denoised. Specifically:
(2d1) Use morphological component analysis (MCA) to decompose the image Ym into a target morphological component YmO* and a background morphological component YmB*, i.e., Ym = YmO* + YmB*, and sparsify the target component YmO* under the ℓ1 norm to obtain the sparse-representation dictionary ΨO corresponding to YmO*;
(2d2) Taking the target component YmO* and the background component YmB* of Ym as variables, solve the following MCA optimization problem:
min ||ΨO^T YmO*||1 + ||ΨB^T YmB*||1, s.t. Ym = YmO* + YmB*, where T denotes transposition.
Solving the above yields the optimal representation YmO of the target component YmO* and the optimal representation YmB of the background component YmB*;
(2d3) Take YmO as the auroral-oval morphological component of Ym and YmB as the noise morphological component; remove the noise component YmB and output the auroral-oval component, obtaining the denoised image Sm. The preprocessed aurora video is finally obtained as S = {S1, S2, ..., Sm, ..., Sk}.
Step 3: Compute the gray-level difference between adjacent frames of the preprocessed aurora video S.
Select two adjacent frames Si and Si+1 from the preprocessed aurora video S, where Si is the i-th frame and Si+1 the (i+1)-th frame of the aurora video, as shown in Fig. 3;
Compute the gray-level difference between Si and Si+1 according to
F = Si − Si+1.
Step 4: Compute from the gray-level difference F the external force G that drives the particles of Si to flow toward Si+1.
The external force driving the particles of Si to flow toward Si+1 is obtained by multiplying the distance from Si to Si+1 by the weighted motion speed of the particles of Si, where the distance from Si to Si+1 is the gray-level difference F obtained in step 3, and the motion speed of a particle is determined by its direction of motion:
Let d be the direction of motion of a particle in Si and Si+1. It is specified that when d = 0 the particle is static; when d = 1, 2, ..., 8 the particle moves along the eight directions of the typical two-dimensional nine-direction cell of the LBM model shown in Fig. 4, i.e., the first direction 1, the second direction 2, ..., the eighth direction 8. The magnitude of the particle speed along these eight directions is 1 for d = 1, 2, 3, 4 and √2 for d = 5, 6, 7, 8.
The external force G driving the particles of Si to flow toward Si+1 is then computed from the motion speed of the particles in Si.
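The direction set of the nine-direction (D2Q9) cell and the direction-dependent speeds of step 4 can be sketched as follows. The exact force formula is given as a formula image in the original and is not reproduced in this text, so the reading G_d = F·e_d (gray difference F weighted by the lattice velocity e_d of direction d) is an assumption of this sketch, as is the direction ordering of `E`.

```python
import math

# D2Q9 directions: rest (d = 0), four axis directions (d = 1..4),
# four diagonal directions (d = 5..8), matching Fig. 4.
E = [
    (0, 0),                              # d = 0: static particle
    (1, 0), (0, 1), (-1, 0), (0, -1),    # d = 1..4: speed magnitude 1
    (1, 1), (-1, 1), (-1, -1), (1, -1),  # d = 5..8: speed magnitude sqrt(2)
]

def external_force(F, d):
    """Assumed force component along direction d for gray difference F."""
    ex, ey = E[d]
    return (F * ex, F * ey)

# Speed magnitudes: 1 along the axes, sqrt(2) along the diagonals.
speeds = [math.hypot(ex, ey) for ex, ey in E]
```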
Step 5: Compute the particle displacement field of the flow of the particles of Si toward Si+1.
(5a) Construct the LBM evolution equation of the dynamic process in which the particles of Si flow toward Si+1 under the driving of the external force G:
Let r be the position of a particle on Si, fd(r, t) the density distribution function at position r of Si of particles moving along direction d at time t, and fd^eq(r, t) the equilibrium distribution function at r, i.e., the value taken when the particle density reaches equilibrium. Let Δt and Δh be the time step and the spatial step respectively, and let ξ be the relaxation time, i.e., the time taken for the particle density within a cell to tend to the equilibrium state. The LBM evolution equation of the particles of Si flowing toward Si+1 under the driving of the external force G is then of the standard lattice-BGK form
fd(r + ed·Δh, t + Δt) = fd(r, t) − (Δt/ξ)·[fd(r, t) − fd^eq(r, t)] + Fd(G),
where ed is the lattice velocity of direction d and Fd(G) is the forcing term derived from the external force G;
(5b) To solve the LBM evolution equation of (5a) by iteration, initial values of fd and fd^eq must be given; they are specified in terms of the particle density at position r of Si in the initial state and an adjustable parameter R, R ∈ [0, 1];
(5c) Solve the LBM evolution equation of (5a) by iteration until the equilibrium distribution function fd^eq no longer changes, thereby obtaining the particle displacement field of the flow of the particles of Si toward Si+1 from the converged distribution functions.
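A minimal, pure-Python sketch of the step-5 iteration is given below: a D2Q9 single-relaxation-time (BGK) update driven by an external force, with the displacement field accumulated from the local momentum. The weights, the zero-velocity equilibrium, the linear force coupling, and the periodic streaming are common textbook choices assumed here; the patent's exact equations are formula images in the original and may differ in detail.

```python
E = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
W = [4 / 9] + [1 / 9] * 4 + [1 / 36] * 4   # standard D2Q9 weights

def lbm_displacement(rho0, G, xi=1.0, steps=10):
    """rho0: HxW initial densities; G: HxW force vectors (gx, gy).
    Returns the HxW accumulated displacement field after BGK iterations."""
    H, Wd = len(rho0), len(rho0[0])
    f = [[[w * rho0[y][x] for w in W] for x in range(Wd)] for y in range(H)]
    disp = [[(0.0, 0.0) for _ in range(Wd)] for _ in range(H)]
    for _ in range(steps):
        fn = [[[0.0] * 9 for _ in range(Wd)] for _ in range(H)]
        for y in range(H):
            for x in range(Wd):
                rho = sum(f[y][x])
                gx, gy = G[y][x]
                for d, (ex, ey) in enumerate(E):
                    feq = W[d] * rho                  # zero-velocity equilibrium
                    force = 3 * W[d] * (ex * gx + ey * gy)  # force coupling
                    out = f[y][x][d] - (f[y][x][d] - feq) / xi + force
                    fn[(y + ey) % H][(x + ex) % Wd][d] += out  # streaming
        f = fn
        for y in range(H):
            for x in range(Wd):
                ux = sum(f[y][x][d] * E[d][0] for d in range(9))
                uy = sum(f[y][x][d] * E[d][1] for d in range(9))
                dx, dy = disp[y][x]
                disp[y][x] = (dx + ux, dy + uy)
    return disp
```

Under a uniform rightward force the scheme accumulates a uniform rightward displacement, which mirrors the intended behavior: gray-difference-driven forces push particles toward the configuration of the next frame.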
Step 6: Compute the new images Ci and Ci+1 obtained by particle migration of Si and Si+1 respectively.
(6a) Record the iteration count n at which the particle displacement field of the flow of the particles of Si toward Si+1 no longer changes;
(6b) Obtain the new image Ci by particle migration of Si:
Taking a reduced iteration count derived from n as the new number of iterations, recompute, by the method of step 5, the particle displacement field of the flow of the particles of Si toward Si+1, and migrate the particles of Si according to the recomputed displacement field, obtaining the new image Ci after particle migration of Si.
(6c) Obtain the new image Ci+1 by particle migration of Si+1:
Taking the same reduced iteration count, compute, by the method of step 5, the particle displacement field of the flow of the particles of Si+1 toward Si, and migrate the particles of Si+1 according to this displacement field, obtaining the new image Ci+1 after particle migration of Si+1.
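The particle migration of steps (6b)/(6c) can be sketched as forward warping: each pixel (particle) is moved by its displacement vector, rounded to the nearest grid position, and particles pushed off the image are dropped. The patent does not reproduce its exact warping rule, so this scheme is one plausible reading, not the original implementation.

```python
def migrate(image, disp):
    """Forward-warp `image` by the per-pixel displacement field `disp`,
    where disp[y][x] = (dx, dy); migrated particle values accumulate."""
    H, W = len(image), len(image[0])
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            dx, dy = disp[y][x]
            nx, ny = x + round(dx), y + round(dy)
            if 0 <= nx < W and 0 <= ny < H:
                out[ny][nx] += image[y][x]  # deposit the migrated particle
    return out
```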
Step 7: Obtain the final interpolated frame.
Fuse Ci and Ci+1 obtained in step 6 to obtain the interpolated frame located between Si and Si+1, completing the frame-rate up-conversion of the ultraviolet aurora video.
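The fusion of step 7 can be sketched as below, assuming equal-weight (pixel-wise average) fusion of the two migrated images; the patent's fusion formula is given as an expression in the original and may weight the two warped frames differently.

```python
def fuse(c_i, c_i1):
    """Pixel-wise average of the two migrated images Ci and Ci+1,
    giving the interpolated frame between Si and Si+1."""
    return [
        [(a + b) / 2 for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(c_i, c_i1)
    ]
```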
The effect of the invention is further illustrated by the following simulation experiments:
1. Experimental conditions
Hardware platform: Intel Core i3, 2.93 GHz, 3.45 GB RAM;
Software platform: MATLAB R2011b under the Windows 7 operating system;
Experimental data: the ultraviolet aurora video used in the invention comes from the data captured in December 1996 by the ultraviolet aurora imager carried by the Polar satellite, where each frame is 228×200 pixels.
2. Experimental contents and results
An aurora video is constructed from consecutive ultraviolet aurora images captured on December 7, 1996 by the ultraviolet imager carried by the Polar satellite and used for the experiments; two adjacent aurora frames of it are shown in Fig. 2.
Simulation 1: The two adjacent aurora frames of Fig. 2 are preprocessed; the result is shown in Fig. 3. As can be seen from Fig. 3, the preprocessed images are almost free of noise and bright spots, which benefits subsequent processing.
Simulation 2: The particle displacement field of the two preprocessed adjacent frames is computed with the invention.
Each pixel of the two adjacent frames of Fig. 3 is regarded as a particle. Referring to the spatial schematic diagram of the Lattice Boltzmann cell in Fig. 4, each particle has 9 directions of motion; the displacement of each particle along the 9 directions is computed according to step 5. The resulting particle displacement field of the two adjacent frames is shown in Fig. 5, where Figs. 5(a) and 5(b) are two representations of the particle displacement field.
As can be seen from Fig. 5, the moving particles of the two adjacent frames are mainly distributed along the edge of the ultraviolet aurora image and in the auroral-oval region, which is consistent with physical expectation, showing that the invention can successfully compute the motion field of non-rigid-motion video.
Simulation 3: The motion field of the two adjacent frames of Fig. 3 computed with the existing overlapped-block motion-compensated interpolation (OBMC) method is shown in Fig. 6.
As can be seen from Fig. 6, the existing OBMC method computes only a unidirectional motion field, i.e., the motion field from the i-th frame to the (i+1)-th frame, and the moving particles of the two adjacent frames are distributed over the whole aurora image rather than over the auroral-oval region. This contradicts physical expectation, since the moving particles of an ultraviolet aurora image are concentrated mainly in the auroral-oval region, showing that the OBMC method cannot accurately obtain the motion field of non-rigid-motion video.
Simulation 4: The motion field of the two adjacent frames of Fig. 3 computed with the existing BROP method is shown in Fig. 7, where Figs. 7(a) and 7(b) are two representations of the motion field.
As can be seen from Fig. 7, the moving particles of the two adjacent frames are likewise distributed over the whole aurora image, which contradicts physical expectation, showing that the BROP method also cannot accurately obtain the motion field of non-rigid-motion video.
Emulation 5: interpolation is carried out to two field pictures adjacent after pretreatment with the present invention, the interpolation of obtained adjacent two field pictures
Frame, as shown in Figure 8.
From figure 8, it is seen that the interpolation image that the present invention obtains there's almost no distortion, and the ash of the interleave image obtained
Degree also between adjacent two field pictures, illustrates to convert in the frame per second of the invention that can preferably realize ultraviolet aurora video.
Simulation 6: interpolation is performed on the pre-processed adjacent frames with the existing OBMC method; the resulting interpolated frame is shown in Fig. 9.
As can be seen from Fig. 9, the interpolated image obtained by the existing OBMC method exhibits obvious blocking artifacts and is distorted at the edges, showing that the OBMC method is not suitable for frame rate up-conversion of ultraviolet aurora video.
Simulation 7: interpolation is performed on the pre-processed adjacent frames with the existing BROP method; the resulting interpolated frame is shown in Fig. 10.
As can be seen from Fig. 10, the interpolated image obtained by the existing BROP method is even more severely distorted, showing that the BROP method is also not suitable for frame rate up-conversion of ultraviolet aurora video.
In summary, the present invention achieves good frame rate up-conversion of non-rigid-motion ultraviolet aurora video, whereas the prior art cannot accurately compute the motion field of adjacent frames and is therefore not suitable for frame rate up-conversion of ultraviolet aurora video.
Claims (4)
1. An ultraviolet aurora video frame rate up-conversion method based on Lattice Boltzmann, comprising:
(1) selecting, from the video images captured by the ultraviolet imager carried on a polar-orbiting satellite, ultraviolet aurora images that are similar in form and continuous in time to construct an aurora video I = {I1, I2, ..., Im, ..., Ik}, where Im is the m-th image in the aurora video, m = 1, 2, ..., k, and k is the number of images in the aurora video;
(2) successively performing negative-value clearing, bright-spot smoothing, masking and denoising pre-processing on the aurora video I to obtain the pre-processed aurora video S = {S1, S2, ..., Sm, ..., Sk}, where Sm is the m-th image in the pre-processed aurora video;
(3) selecting adjacent frames Si and Si+1 from the pre-processed aurora video S and computing their gray-scale difference F = Si − Si+1, where i = 1, 2, ..., k−1;
(4) regarding each pixel of Si and Si+1 as a fluid particle, and computing from the gray-scale difference F the external force G that drives the particles of Si to flow toward Si+1;
(5) simulating, with the Lattice Boltzmann model (LBM), the dynamic process in which the particles of Si flow toward Si+1 under the driving of the external force G, and computing by iteration the particle displacement field of the flow from Si to Si+1;
(6) when the particle displacement field of the flow from Si to Si+1 no longer changes, recording the number of iterations n at that moment;
(7) taking … as the new number of iterations, recomputing the particle displacement field of the flow from Si to Si+1, and migrating the particles of Si according to the recomputed particle displacement field, obtaining the new image Ci after particle migration of Si;
(8) taking … as the number of iterations, computing the particle displacement field of the flow from Si+1 to Si, and migrating the particles of Si+1 according to that particle displacement field, obtaining the new image Ci+1 after particle migration of Si+1;
(9) fusing Ci obtained in step (7) and Ci+1 obtained in step (8) according to the following formula, obtaining the interpolated frame located between Si and Si+1:
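A minimal numerical sketch of steps (3) through (9), under stated assumptions: integer displacement fields, a toy nearest-neighbour particle migration, and a plain average as the fusion of step (9), since the patent's fusion formula and its LBM displacement computation are given only as figures:

```python
import numpy as np

def migrate(img, disp):
    """Toy stand-in for the particle migration of steps (7)-(8):
    move each pixel by its integer displacement vector (dy, dx)."""
    h, w = img.shape
    out = img.copy()
    ys, xs = np.mgrid[0:h, 0:w]
    ny = np.clip(ys + disp[..., 0].astype(int), 0, h - 1)
    nx = np.clip(xs + disp[..., 1].astype(int), 0, w - 1)
    out[ny, nx] = img[ys, xs]
    return out

def interpolate_frame(S_i, S_ip1, disp_fwd, disp_bwd):
    """Sketch of steps (7)-(9): migrate both frames toward each other,
    then fuse; a simple average is assumed for the fusion."""
    C_i = migrate(S_i, disp_fwd)       # step (7): migrate S_i forward
    C_ip1 = migrate(S_ip1, disp_bwd)   # step (8): migrate S_{i+1} backward
    return 0.5 * (C_i + C_ip1)         # step (9): fuse (assumed average)
```

With zero displacement fields this sketch degenerates to the frame average, consistent with the observation in Simulation 5 that the interpolated frame's gray level lies between those of the two adjacent frames.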
2. The ultraviolet aurora video frame rate up-conversion method based on Lattice Boltzmann according to claim 1, wherein the negative-value clearing, bright-spot smoothing, masking and denoising pre-processing of the aurora video I in step (2) is carried out as follows:
(2a) finding the pixels with gray value less than 0 in the m-th image Im of the aurora video and reassigning the gray value of these pixels to 0, obtaining a new image Qm, m = 1, 2, ..., k;
(2b) finding the bright spots in the new image Qm and replacing the gray value of the pixels in each bright spot with the gray mean μ of Qm, obtaining the smoothed image Pm, where a bright spot is defined as a connected region of pixels whose gray value is greater than the threshold Th = μ + 3σ, σ being the gray standard deviation of Qm;
(2c) intercepting the smoothed image Pm with an ellipse whose semi-major axis is 114 and semi-minor axis is 100, obtaining the masked image Ym;
(2d) denoising the masked image Ym:
(2d1) constructing the sparse representation dictionary pair {ΨO, ΨB} of morphological component analysis (MCA), and setting the MCA condition as Ym = YmO* + YmB*, where YmO* is the target morphological sublayer of Ym, ΨO is the sparse representation dictionary corresponding to YmO*, YmB* is the background morphological sublayer of Ym, and ΨB is the sparse representation dictionary corresponding to YmB*;
(2d2) taking the target morphological sublayer YmO* and the background morphological sublayer YmB* of Ym as variables, and solving the following MCA optimization problem:
min ||ΨO^T YmO*||1 + ||ΨB^T YmB*||1, s.t. Ym = YmO* + YmB*, where T denotes transposition;
solving the above formula yields the optimal expression YmO of the target morphological sublayer YmO* and the optimal expression YmB of the background morphological sublayer YmB*;
(2d3) taking YmO as the auroral oval morphological sublayer of Ym and YmB as the noise morphological sublayer, removing the noise morphological sublayer YmB and outputting the auroral oval morphological sublayer, obtaining the denoised image Sm of Ym; the pre-processed aurora video finally obtained is S = {S1, S2, ..., Sm, ..., Sk}.
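Steps (2a) through (2c) can be sketched as follows. This is an illustrative reading of the claim, not the patent's implementation: the bright-spot test is simplified to a per-pixel threshold (the claim defines a spot as a connected region above Th = μ + 3σ), the ellipse is assumed centred on the image, and the MCA denoising of step (2d) is omitted:

```python
import numpy as np

def preprocess(I_m, a=114.0, b=100.0):
    """Sketch of claim 2, steps (2a)-(2c). a and b are the ellipse
    semi-major and semi-minor axes (114 and 100 in the claim)."""
    # (2a) reset gray values below 0 to 0
    Q = np.where(I_m < 0, 0.0, I_m.astype(float))
    # (2b) replace pixels above Th = mu + 3*sigma by the gray mean
    # (per-pixel simplification of the connected-region definition)
    mu, sigma = Q.mean(), Q.std()
    P = np.where(Q > mu + 3.0 * sigma, mu, Q)
    # (2c) mask with an a-by-b ellipse, assumed centred on the image
    h, w = P.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    inside = ((xs - cx) / a) ** 2 + ((ys - cy) / b) ** 2 <= 1.0
    return np.where(inside, P, 0.0)
```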
3. The ultraviolet aurora video frame rate up-conversion method based on Lattice Boltzmann according to claim 1, wherein the external force G driving the particles of Si to flow toward Si+1 is computed from the gray-scale difference F in step (4) as follows:
where d is the moving direction of the particles in Si and Si+1, with the convention that when d = 0 the particle is stationary, and when d = 1, 2, ..., 8 the particle moves along the eight directions of the typical two-dimensional nine-velocity cell of the LBM model, i.e., the first direction 1, the second direction 2, ..., the eighth direction 8; F is the gray-scale difference between the adjacent frames Si and Si+1; and ed is the movement speed of the particles in Si and Si+1, given by separate expressions for d = 1, 2, 3, 4 and for d = 5, 6, 7, 8.
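The nine directions of the D2Q9 cell referred to above can be written down explicitly. The textbook unit vectors are assumed here, since the claim gives its own constants only in a figure, and the force formula itself is replaced by a simple illustrative coupling proportional to F along each direction:

```python
import numpy as np

# Standard D2Q9 direction vectors (e_x, e_y): d = 0 rest, d = 1..4 axes,
# d = 5..8 diagonals -- textbook values, assumed here
E = np.array([[0, 0],
              [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]], dtype=float)

def driving_force(S_i, S_ip1, k=1.0):
    """Illustrative stand-in for the external force G of step (4):
    proportional to the gray-scale difference F along each direction d.
    k is a hypothetical gain, not a parameter of the patent."""
    F = S_i.astype(float) - S_ip1.astype(float)
    # one 2-vector per direction and per pixel: shape (9, H, W, 2)
    return k * F[None, :, :, None] * E[:, None, None, :]
```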
4. The ultraviolet aurora video frame rate up-conversion method based on Lattice Boltzmann according to claim 1, wherein in step (5) the dynamic process in which the particles of Si flow toward Si+1 under the driving of the external force G is simulated with the Lattice Boltzmann model (LBM), and the particle displacement field of the flow from Si to Si+1 is computed by iteration, as follows:
(5a) constructing the LBM evolution equation of the dynamic process in which the particles of Si flow toward Si+1 under the driving of the external force G:
where x⃗ is the position of a particle on Si, fd(x⃗, t) is the particle density distribution function at x⃗ on Si moving along direction d at time t, fd^eq(x⃗, t) is the particle equilibrium distribution function at x⃗ on Si at time t, Δt and Δh are the time step and the spatial step respectively, and ξ is the relaxation time, i.e., the time taken for the particle density inside a cell to reach the equilibrium state;
(5b) setting the initial values of fd(x⃗, t) and fd^eq(x⃗, t) as:
where ρ(x⃗) is the particle density at x⃗ on Si in the initial state, and R is an adjustable parameter, R ∈ [0, 1];
(5c) solving the LBM evolution equation in (5a) by iteration until the particle equilibrium distribution function fd^eq no longer changes, obtaining the particle displacement field of the flow from Si to Si+1.
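A minimal sketch of one collision-and-streaming step of the evolution equation in (5a), under stated assumptions: the textbook D2Q9 weights and a simple projection of G onto each direction as the forcing term, since the patent's exact equation and the displacement-field extraction of (5c) are given only as figures:

```python
import numpy as np

# Textbook D2Q9 weights and direction vectors (assumed; the claim shows
# its own constants only in figures)
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def lbm_step(f, G, xi=1.0):
    """One BGK collision + streaming step. f: populations, shape (9, H, W);
    G: per-pixel body force, shape (H, W, 2); xi: relaxation time."""
    rho = f.sum(axis=0)                                   # particle density
    mom = np.einsum('dhw,dc->hwc', f, E)                  # momentum
    u = mom / np.maximum(rho, 1e-12)[..., None]           # velocity
    eu = np.einsum('dc,hwc->dhw', E, u)
    u2 = (u ** 2).sum(axis=-1)
    # second-order equilibrium distribution f_d^eq
    feq = W[:, None, None] * rho * (1 + 3 * eu + 4.5 * eu ** 2 - 1.5 * u2)
    # illustrative forcing: project G onto each direction d
    force = 3 * W[:, None, None] * np.einsum('dc,hwc->dhw', E, G)
    f = f - (f - feq) / xi + force                        # collision
    for d in range(9):                                    # streaming
        f[d] = np.roll(f[d], shift=(E[d, 1], E[d, 0]), axis=(0, 1))
    return f
```

A uniform rest state with zero force is a fixed point of this step, and total mass is conserved, which is the sanity check one would run before iterating to the steady state required by (5c).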
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610517071.8A CN106131567B (en) | 2016-07-04 | 2016-07-04 | Ultraviolet aurora up-conversion method of video frame rate based on Lattice Boltzmann |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106131567A CN106131567A (en) | 2016-11-16 |
CN106131567B true CN106131567B (en) | 2019-01-08 |
Legal Events
The application was published, entered substantive examination, and was granted on 2019-01-08; the patent right terminated on 2019-07-04 due to non-payment of the annual fee.