CN117218275A - Frame interpolation method and device based on real-time rendering

Info

Publication number: CN117218275A
Application number: CN202311085038.9A
Applicant: Zhejiang University (ZJU)
Inventors: 张婧, 杨思鹏, 金小刚
Filing/priority date: 2023-08-25
Publication date: 2023-12-12
Legal status: Pending (request for substantive examination in force)
Original language: Chinese (zh)
Classification: Image Generation

Abstract

The invention discloses a frame interpolation method and device based on real-time rendering. The method comprises the following steps: acquiring a continuously rendered first image frame and second image frame; calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame; estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow; updating the color buffer of the second image frame according to the third optical flow, and mapping the updated second image frame according to the fourth optical flow; calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow; and inserting the target interpolated frame into the display list, where it awaits display in list order. The method effectively saves computation and rendering time when rendering a scene and generates high-quality interpolated frame images.

Description

Frame interpolation method and device based on real-time rendering
Technical Field
The invention relates to the technical field of computer vision and image processing, in particular to a method and a device for interpolating frames based on real-time rendering.
Background
With the continuous improvement of scene rendering quality and complexity, and the introduction of real-time ray tracing and complex global illumination algorithms, the computational cost of real-time rendering keeps growing, posing a serious performance challenge. The increased computation can lower the frame rate, greatly degrading the user's experience when viewing rendered content such as games.
To relieve GPU rendering pressure and provide a smoother picture, researchers have proposed techniques such as supersampling and frame interpolation to improve rendering performance. Frame interpolation raises the frame rate by generating additional intermediate frames; however, research on frame interpolation for real-time rendering remains limited.
The academic publication "Neural frame interpolation for rendered content" (ACM Transactions on Graphics, 2021) proposes a frame interpolation method for rendered content that can generate high-quality intermediate frames, but its computational overhead is too high for real-time rendering.
In 2022, NVIDIA introduced DLSS 3, which builds on DLSS super-resolution technology and combines it with Optical Multi Frame Generation to generate new frames, while applying NVIDIA Reflex low-latency technology to optimize response time; however, the method only runs on NVIDIA GeForce RTX series graphics cards, which limits its range of application.
Beyond reducing the potentially high computational overhead of real-time rendering, building accurate interpolation models and optimization strategies is another major challenge.
In practical scenes, picture changes can be very complex, which may introduce visual artifacts into the interpolated frame. Making full use of the information in the previous and next frames to generate an intermediate frame requires accurate motion vectors; obtaining and optimizing the motion vectors between the intermediate frame and the input frames is a challenging task, and accurate motion-vector estimation is critical to generating high-quality intermediate frames.
Chinese patent document CN111277780A discloses a method and apparatus for improving the frame-interpolation effect, the method comprising: determining an interpolated frame image between two adjacent original frames based on an optical-flow frame interpolation method; judging whether the quality of the interpolated frame image meets the standard; and updating the interpolated frame image when it does not. This scheme can, to some extent, improve the interpolation quality of images obtained by optical-flow methods and effectively improves the user's experience when watching the resulting high-frame-rate video. However, it is aimed primarily at video interpolation rather than at real-time rendered scenes.
In real-time rendering, object motion and change can be more complex and harder to predict because the scene is dynamic. The above scheme does not disclose how to accurately acquire and exploit optical-flow information in a real-time rendering environment, nor how to improve interpolation efficiency and reduce computational complexity while guaranteeing interpolation quality.
Disclosure of Invention
In view of the above, the present invention aims to provide a frame interpolation method and device based on real-time rendering, to solve the prior-art problems of excessive computational overhead, inaccurate interpolated-frame quality, and high computational complexity in real-time rendering, thereby improving rendering performance and user experience.
In order to achieve the above purpose, the technical scheme provided by the invention is as follows:
in a first aspect, an embodiment of the present invention provides a method for interpolating frames based on real-time rendering, including:
acquiring a first image frame and a second image frame which are continuously rendered;
calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame, wherein an optical flow indicates the motion of each pixel from one image frame to the other, establishing a pixel-level correspondence;
estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow;
updating the color buffer of the second image frame according to the third optical flow, and mapping the updated second image frame according to the fourth optical flow;
calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow;
and inserting the target interpolated frame into the display list and waiting for display in list order.
According to an embodiment of the present invention, the first optical flow F_{t→t+1} and the second optical flow F_{t+1→t} are obtained by reprojection.
Wherein, obtaining the first optical flow F_{t→t+1} by reprojection comprises:
obtaining the coordinates P_t of each pixel of the first image frame;
using the observation matrix V_{t+1} and the camera projection matrix K used when rendering the second image frame, projecting the coordinates P_t of each pixel of the first image frame onto the second image frame to obtain the corresponding coordinates π_{t→t+1}(P_t) in the second image frame, calculated as:
π_{t→t+1}(P_t) = K V_{t+1} V_t^{-1} K^{-1} P_t
wherein π_{t→t+1} denotes the reprojection mapping a point P_t of the first image frame t onto the second image frame t+1, K denotes the camera projection matrix, V_{t+1} denotes the observation matrix of the second image frame, V_t^{-1} denotes the inverse of the observation matrix of the first image frame, and K^{-1} denotes the inverse of the camera projection matrix;
subtracting the coordinates P_t of each pixel of the first image frame from the corresponding coordinates π_{t→t+1}(P_t) in the second image frame and performing a linear operation to obtain the first optical flow F_{t→t+1}, calculated as:
F_{t→t+1}[p] = π_{t→t+1}(P_t) − P_t
wherein F_{t→t+1}[p] denotes the vector of the first optical flow from the first image frame t to the second image frame t+1 at the two-dimensional pixel coordinate p = (p_x, p_y), with p_x the pixel's coordinate on the x-axis and p_y its coordinate on the y-axis.
The second optical flow F_{t+1→t} from the second image frame to the first image frame is likewise obtained by reprojection; the specific procedure mirrors that of the first optical flow F_{t→t+1}.
According to an embodiment of the present invention, the third optical flow from the first image frame to the target interpolated frame is estimated based on the first optical flow, and the fourth optical flow from the second image frame to the target interpolated frame is estimated based on the second optical flow, calculated as:
F_{t→t+0.5} = 0.5 · F_{t→t+1}
F_{t+1→t+0.5} = 0.5 · F_{t+1→t}
wherein F_{t→t+1} denotes the first optical flow from the first image frame t to the second image frame t+1, F_{t→t+0.5} the third optical flow from the first image frame t to the target interpolated frame t+0.5, F_{t+1→t} the second optical flow from the second image frame t+1 to the first image frame t, and F_{t+1→t+0.5} the fourth optical flow from the second image frame t+1 to the target interpolated frame t+0.5.
According to an embodiment of the present invention, updating the color buffer of the second image frame according to the third optical flow and mapping the updated second image frame according to the fourth optical flow comprises:
updating the color buffer of the second image frame according to the third optical flow F_{t→t+0.5}, calculated as:
I_{t+1}[p_{t+1}] = F_{t→t+0.5}[p_t]
wherein I_{t+1}[p_{t+1}] denotes the pixel color information at the two-dimensional coordinate p_{t+1} in the second image frame t+1, and F_{t→t+0.5}[p_t] denotes the value of the third optical flow from the first image frame t to the target interpolated frame t+0.5 at the corresponding two-dimensional coordinate p_t in the first image frame;
mapping the corresponding three-dimensional coordinate points P_{t+1} in the updated second image frame according to the fourth optical flow F_{t+1→t+0.5}, calculated as:
p′_{t+0.5} = (P_{t+1} + F_{t+1→t+0.5}).xy
F_{t+0.5}[p′_{t+0.5}] = I_{t+1}[p_{t+1}]
wherein p′_{t+0.5} denotes the rough two-dimensional coordinate in the target interpolated frame t+0.5 computed by mapping the two-dimensional coordinate p_{t+1} of the second image frame through the fourth optical flow F_{t+1→t+0.5}, P_{t+1} denotes the three-dimensional coordinate corresponding to the two-dimensional coordinate p_{t+1} in the second image frame, .xy takes the abscissa and ordinate of the result, and F_{t+0.5}[p′_{t+0.5}] denotes the colored information at the corresponding coordinate in the accurate third optical flow being generated.
According to an embodiment of the present invention, the calculating, based on the mapping result, the accurate third optical flow from the first image frame to the target interpolated frame includes:
removing the occluded points within the same pixel from the mapping result;
and interpolating the mapping result to obtain an accurate third optical flow.
According to an embodiment of the present invention, removing the occluded points within the same pixel from the mapping result comprises:
traversing each pixel in the mapping result to obtain the points distributed within it;
and comparing the depth values of the points scattered into the same pixel, eliminating the points farther from the camera, and keeping only the point nearest to the camera.
According to an embodiment of the present invention, generating the target interpolated frame based on the accurate third optical flow comprises:
assigning the pixel color information at each two-dimensional coordinate in the first image frame to the corresponding two-dimensional coordinate of the target interpolated frame according to the accurate third optical flow;
and for each remaining uncolored pixel in the target interpolated frame, searching from near to far and assigning the value of the nearest colored pixel, thereby generating the target interpolated frame.
In a second aspect, an embodiment of the present invention further provides a frame interpolation apparatus based on real-time rendering, including:
an image frame acquisition module for acquiring a first image frame and a second image frame which are continuously rendered;
an image frame optical flow calculation module for calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame, wherein an optical flow indicates the motion of each pixel from one image frame to the other, establishing a pixel-level correspondence;
a target interpolated frame estimation module for estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow;
a color buffer update and mapping module for updating the color buffer of the second image frame according to the third optical flow and mapping the updated second image frame according to the fourth optical flow;
a target interpolated frame generation module for calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow;
and a display module for inserting the target interpolated frame into the display list and waiting for display in list order.
In a third aspect, an embodiment of the present invention further provides a computing device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the above frame interpolation method based on real-time rendering.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the above frame interpolation method based on real-time rendering.
Compared with the prior art, the invention provides at least the following beneficial effects:
For the frame interpolation task in real-time rendering, the frame interpolation method based on real-time rendering can save computation and rendering time when rendering a scene, generate higher-quality interpolated frame images, improve the frame rate and user experience of real-time rendering, reduce GPU rendering pressure, and increase picture smoothness.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required by the embodiments or the prior-art description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a method for interpolating frames based on real-time rendering provided by an embodiment;
FIG. 2 is a schematic diagram of an interpolated frame device based on real-time rendering according to an embodiment;
FIG. 3 is a schematic diagram of a computing device according to an embodiment.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the detailed description is presented by way of example only and is not intended to limit the scope of the invention.
Some of the terms used in this example are as follows:
t denotes the time of the first image frame, t+1 the time of the second image frame, and t+0.5 the time of the target interpolated frame, i.e., the intermediate frame. p = (p_x, p_y) denotes the two-dimensional coordinates of a pixel, where p_x is the pixel's coordinate on the x-axis and p_y its coordinate on the y-axis. Letting Z denote the depth buffer of an image frame, P = (p_x, p_y, Z[p]) denotes the three-dimensional coordinate corresponding to the pixel.
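To make the steps below concrete, this description adds small Python/NumPy sketches of the patent's GPU pipeline. They are illustrative only; the `Frame` container and its field names are assumptions of the sketches, not part of the patent.

```python
import numpy as np

class Frame:
    """Hypothetical container for one rendered frame, matching the notation above.

    I: color buffer, shape (H, W, 3)
    Z: depth buffer, shape (H, W); Z[p] is the depth at pixel p = (p_x, p_y)
    V: 4x4 observation (view) matrix used when this frame was rendered
    K: 4x4 camera projection matrix
    """
    def __init__(self, I, Z, V, K):
        self.I, self.Z, self.V, self.K = I, Z, V, K
```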
Fig. 1 is a flowchart of a method for interpolating frames based on real-time rendering according to an embodiment. As shown in fig. 1, an embodiment provides a method for interpolating frames based on real-time rendering, including:
s101, acquiring a first image frame and a second image frame which are continuously rendered.
Specifically, in this embodiment, in the Unity3D environment, a Camera component is used to render the scene, obtaining the continuously rendered first image frame I_t and second image frame I_{t+1}.
S102, calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame. An optical flow indicates the motion of each pixel from one image frame to the other, establishing a pixel-level correspondence.
In the present embodiment, the first optical flow F_{t→t+1} from the first image frame to the second image frame and the second optical flow F_{t+1→t} from the second image frame to the first image frame are both obtained by reprojection.
Obtaining the first optical flow F_{t→t+1} from the first image frame to the second image frame by reprojection comprises the following steps:
(1) Obtaining the coordinates P_t of each pixel of the first image frame.
(2) Using the observation matrix V_{t+1} and the camera projection matrix K used when rendering the second image frame, projecting the coordinates P_t of each pixel of the first image frame onto the second image frame to obtain the corresponding coordinates π_{t→t+1}(P_t) in the second image frame, calculated as:
π_{t→t+1}(P_t) = K V_{t+1} V_t^{-1} K^{-1} P_t
wherein π_{t→t+1} denotes the reprojection mapping a point P_t of the first image frame onto the second image frame, K denotes the camera projection matrix, V_{t+1} denotes the observation matrix of the second image frame, V_t^{-1} denotes the inverse of the observation matrix of the first image frame, and K^{-1} denotes the inverse of the camera projection matrix.
(3) Subtracting the coordinates P_t of each pixel of the first image frame from the corresponding coordinates π_{t→t+1}(P_t) in the second image frame and performing a linear operation to obtain the first optical flow F_{t→t+1}, calculated as:
F_{t→t+1}[p] = π_{t→t+1}(P_t) − P_t
wherein F_{t→t+1}[p] denotes the vector of the first optical flow from the first image frame to the second image frame at the two-dimensional pixel coordinate p = (p_x, p_y), with p_x the pixel's coordinate on the x-axis and p_y its coordinate on the y-axis.
Specifically, in this embodiment, each pixel of the first image frame samples the texture's UV coordinates on the GPU to obtain normalized device coordinates, which are unprojected into world space to obtain the world-space coordinates corresponding to each pixel of the first image frame.
These world-space coordinates are then projected using the observation matrix and projection matrix of the camera used when rendering the second image frame, followed by perspective division, to obtain the corresponding normalized device coordinates in the second image frame; subtracting the original normalized device coordinates of the first image frame from this point and performing a linear operation yields the first optical flow, which is stored as a texture.
In the present embodiment, obtaining the second optical flow F_{t+1→t} from the second image frame to the first image frame by reprojection is similar to obtaining the first optical flow F_{t→t+1}, and specifically comprises the following steps:
(1) Obtaining the coordinates P_{t+1} of each pixel of the second image frame.
(2) Using the observation matrix V_t and the camera projection matrix K used when rendering the first image frame, projecting the coordinates P_{t+1} of each pixel of the second image frame onto the first image frame to obtain the corresponding coordinates π_{t+1→t}(P_{t+1}) in the first image frame, calculated as:
π_{t+1→t}(P_{t+1}) = K V_t V_{t+1}^{-1} K^{-1} P_{t+1}
wherein π_{t+1→t} denotes the reprojection mapping a point P_{t+1} of the second image frame onto the first image frame t, K denotes the camera projection matrix, V_t denotes the observation matrix of the first image frame, V_{t+1}^{-1} denotes the inverse of the observation matrix of the second image frame, and K^{-1} denotes the inverse of the camera projection matrix.
(3) Subtracting the coordinates P_{t+1} of each pixel of the second image frame from the corresponding coordinates π_{t+1→t}(P_{t+1}) in the first image frame and performing a linear operation to obtain the second optical flow F_{t+1→t}, calculated as:
F_{t+1→t}[p] = π_{t+1→t}(P_{t+1}) − P_{t+1}
wherein F_{t+1→t}[p] denotes the vector of the second optical flow from the second image frame to the first image frame at the two-dimensional pixel coordinate p = (p_x, p_y), with p_x the pixel's coordinate on the x-axis and p_y its coordinate on the y-axis.
Specifically, in this embodiment, each pixel of the second image frame samples the texture's UV coordinates on the GPU to obtain normalized device coordinates, which are unprojected into world space to obtain the world-space coordinates corresponding to each pixel of the second image frame.
These world-space coordinates are then projected using the observation matrix and projection matrix of the camera used when rendering the first image frame, followed by perspective division, to obtain the corresponding normalized device coordinates in the first image frame; subtracting the original normalized device coordinates of the second image frame from this point and performing a linear operation yields the second optical flow, which is stored as a texture.
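As a rough CPU illustration of the two reprojection passes above, the sketch below computes an optical flow between two frames with NumPy; the real method runs per-pixel on the GPU, and the NDC depth convention and y-axis orientation are simplifying assumptions here.

```python
def reprojection_flow(src, dst):
    """Optical flow from src to dst by reprojection:
    pi(P) = K V_dst V_src^{-1} K^{-1} P, followed by perspective
    division and subtraction of the original pixel coordinates."""
    H, W = src.Z.shape
    xs, ys = np.meshgrid(np.arange(W) + 0.5, np.arange(H) + 0.5)
    # Lift every pixel to homogeneous NDC, assuming Z already stores NDC depth.
    ndc = np.stack([2 * xs / W - 1, 2 * ys / H - 1, src.Z, np.ones_like(xs)], axis=-1)
    # Combined reprojection matrix K V_dst V_src^{-1} K^{-1}.
    M = dst.K @ dst.V @ np.linalg.inv(src.V) @ np.linalg.inv(src.K)
    proj = ndc @ M.T
    proj = proj[..., :2] / proj[..., 3:4]            # perspective division
    # Back from NDC to pixel coordinates, then subtract the source coordinates.
    px = np.stack([(proj[..., 0] + 1) * W / 2, (proj[..., 1] + 1) * H / 2], axis=-1)
    return px - np.stack([xs, ys], axis=-1)          # flow in pixels, channels (dx, dy)
```

For example, `F_t_to_t1 = reprojection_flow(frame_t, frame_t1)` and `F_t1_to_t = reprojection_flow(frame_t1, frame_t)` would give the first and second optical flows, here in pixel units rather than the NDC differences that the patent stores as a texture.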
S103, estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow.
In this embodiment, based on the assumption that at a high frame rate the interval between two adjacent frames is very short, the object can be considered to move in uniform linear motion between the two frames. Specifically, the calculation is:
F_{t→t+0.5} = 0.5 · F_{t→t+1}
F_{t+1→t+0.5} = 0.5 · F_{t+1→t}
wherein F_{t→t+1} denotes the first optical flow from the first image frame to the second image frame, F_{t→t+0.5} the third optical flow from the first image frame to the target interpolated frame, F_{t+1→t} the second optical flow from the second image frame to the first image frame, and F_{t+1→t+0.5} the fourth optical flow from the second image frame to the target interpolated frame.
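Under the uniform-linear-motion assumption, S103 reduces to a pure scaling of the flows from S102; continuing the sketch above:

```python
# Third and fourth optical flows toward the midpoint t+0.5 (the patent's formulas).
F_t_to_mid  = 0.5 * F_t_to_t1     # F_{t   -> t+0.5}
F_t1_to_mid = 0.5 * F_t1_to_t     # F_{t+1 -> t+0.5}
```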
S104, updating the color buffer of the second image frame according to the third optical flow, and mapping the updated second image frame according to the fourth optical flow.
In the present embodiment, the color buffer of the second image frame is updated according to the third optical flow F_{t→t+0.5}, calculated as:
I_{t+1}[p_{t+1}] = F_{t→t+0.5}[p_t]
wherein I_{t+1}[p_{t+1}] denotes the pixel color information at the two-dimensional coordinate p_{t+1} in the second image frame t+1, and F_{t→t+0.5}[p_t] denotes the value of the third optical flow from the first image frame to the target interpolated frame at the corresponding two-dimensional coordinate p_t in the first image frame.
The corresponding three-dimensional coordinate points P_{t+1} in the updated second image frame are then mapped according to the fourth optical flow F_{t+1→t+0.5}, calculated as:
P′_{t+0.5} = P_{t+1} + F_{t+1→t+0.5}
p′_{t+0.5} = P′_{t+0.5}.xy
F_{t+0.5}[p′_{t+0.5}] = I_{t+1}[p_{t+1}]
wherein p′_{t+0.5} denotes the rough two-dimensional coordinate in the target interpolated frame computed by mapping the two-dimensional coordinate p_{t+1} of the second image frame through the fourth optical flow F_{t+1→t+0.5}, P′_{t+0.5} denotes the three-dimensional coordinate corresponding to p′_{t+0.5}, P_{t+1} denotes the three-dimensional coordinate corresponding to the two-dimensional coordinate p_{t+1} in the second image frame, .xy takes the abscissa and ordinate of the result, and F_{t+0.5}[p′_{t+0.5}] denotes the colored information at the corresponding coordinate in the accurate third optical flow being generated.
Note that the point P′_{t+0.5} obtained at this stage does not necessarily lie at the center of a pixel.
Specifically, in the present embodiment, the color buffer of the second image frame is updated through the third optical flow F_{t→t+0.5}, and the corresponding three-dimensional coordinate points P_{t+1} in the updated second image frame are mapped according to the fourth optical flow F_{t+1→t+0.5}, yielding a set of points carrying optical-flow information at irregular positions.
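One plausible CPU reading of this forward mapping ("splatting") step is sketched below. It scatters the updated buffer of the second frame to the target frame, rounding each landing position to the nearest pixel (a simplification of the (0,0)-(1,1) local-coordinate collection described in S105 below) and already applying the depth test that S105 step (1) describes; smaller depth values are assumed to be nearer the camera.

```python
def splat_to_target(values, F_t1_to_mid, Z_t1):
    """Scatter per-pixel values of the second frame to the target frame along
    the fourth optical flow, keeping only the point nearest the camera when
    several points land in the same target pixel (occlusion culling)."""
    H, W, C = values.shape
    out   = np.zeros((H, W, C))
    depth = np.full((H, W), np.inf)
    hit   = np.zeros((H, W), dtype=bool)
    for y in range(H):
        for x in range(W):
            tx = int(round(x + F_t1_to_mid[y, x, 0]))  # p' = p + F_{t+1 -> t+0.5}
            ty = int(round(y + F_t1_to_mid[y, x, 1]))
            if 0 <= tx < W and 0 <= ty < H and Z_t1[y, x] < depth[ty, tx]:
                depth[ty, tx] = Z_t1[y, x]             # nearest point wins
                out[ty, tx]   = values[y, x]
                hit[ty, tx]   = True
    return out, hit
```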
S105, calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow.
The point set obtained after mapping is processed: occluded points within the same pixel are removed, and for each remaining uncolored pixel the nearest colored pixel is selected and its value assigned; the color information of each pixel then constitutes the accurate third optical flow F_{t→t+0.5} obtained by the final interpolation. Generating the target interpolated frame I_{t+0.5} based on the accurate third optical flow comprises the following steps:
(1) Removing the occluded points within the same pixel from the mapping result.
Each pixel in the mapping result is traversed to obtain the points distributed within it.
Specifically, the point set mapped in S104 is traversed; assuming each pixel is centered at local coordinates (0.5, 0.5), the area of each pixel is defined between (0, 0) and (1, 1), and the three-dimensional points P′_{t+0.5} obtained in S104 that fall within each pixel's range are collected.
By comparing the depth values of the points scattered into the same pixel, the points farther from the camera in Unity3D are eliminated, keeping only the point nearest to the camera.
(2) Interpolating the mapping result to obtain the accurate third optical flow.
Nearest-neighbor interpolation and rounding assignment are performed on the remaining pixels that received no value, yielding the accurate third optical flow F_{t→t+0.5}.
(3) Generating the target interpolated frame I_{t+0.5} based on the accurate third optical flow.
Specifically, based on the accurate third optical flow, the pixel color information at the two-dimensional coordinate p_t in the first image frame I_t is assigned to the pixel at the two-dimensional coordinate p_{t+0.5} of the target interpolated frame I_{t+0.5}, calculated as:
p_{t+0.5} = p_t + F_{t→t+0.5}.xy
I_{t+0.5}[p_{t+0.5}] = I_t[p_t]
wherein p_{t+0.5} denotes a two-dimensional coordinate in the target interpolated frame, F_{t→t+0.5} denotes the accurate third optical flow from the first image frame t to the target interpolated frame, .xy takes the abscissa and ordinate of the motion information at the coordinate in the accurate third optical flow, I_{t+0.5}[p_{t+0.5}] denotes the color information at the two-dimensional coordinate p_{t+0.5} in the target interpolated frame, and I_t[p_t] denotes the color information at the two-dimensional coordinate p_t in the first image frame.
The uncolored pixels of I_{t+0.5} are then traversed, searching from near to far, and the value of the nearest colored pixel is assigned to each, generating the target interpolated frame I_{t+0.5}.
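The hole-filling in steps (2) and (3) can both be expressed as a nearest-neighbor fill. The sketch below uses SciPy's Euclidean distance transform to find, for every unassigned pixel, its nearest assigned pixel; treating the patent's near-to-far traversal as equivalent to such a lookup is an assumption of this sketch, as is sampling the accurate flow at first-image-frame coordinates per the formula p_{t+0.5} = p_t + F_{t→t+0.5}.xy.

```python
from scipy.ndimage import distance_transform_edt

def nearest_filled(field, assigned):
    """Give every unassigned pixel the value of its nearest assigned pixel."""
    _, idx = distance_transform_edt(~assigned, return_indices=True)
    return field[idx[0], idx[1]]

def generate_target_frame(I_t, F_exact):
    """Warp the first image frame forward by the accurate third optical flow,
    then fill the remaining uncolored pixels from the nearest colored pixel."""
    H, W, _ = I_t.shape
    I_mid   = np.zeros_like(I_t)
    colored = np.zeros((H, W), dtype=bool)
    for y in range(H):
        for x in range(W):
            tx = int(round(x + F_exact[y, x, 0]))   # p_{t+0.5} = p_t + F.xy
            ty = int(round(y + F_exact[y, x, 1]))
            if 0 <= tx < W and 0 <= ty < H:
                I_mid[ty, tx]   = I_t[y, x]
                colored[ty, tx] = True
    return nearest_filled(I_mid, colored)
```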
S106, inserting the target interpolated frame into the display list and waiting for display in list order.
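Tying the sketches together, a hypothetical end-to-end pass over one frame pair could look as follows. Identifying p_t with p_{t+1} on the same pixel grid when filling the splat buffer is a simplification of the patent's buffer-update formula, and S106 reduces here to appending the result to a display queue.

```python
def interpolate(frame_t, frame_t1, display_list):
    F_t_to_t1 = reprojection_flow(frame_t, frame_t1)             # S102
    F_t1_to_t = reprojection_flow(frame_t1, frame_t)
    F_t_to_mid  = 0.5 * F_t_to_t1                                # S103
    F_t1_to_mid = 0.5 * F_t1_to_t
    # S104: treat the third flow as the second frame's updated color buffer
    # and splat it to the target frame along the fourth flow.
    flow_buf, hit = splat_to_target(F_t_to_mid, F_t1_to_mid, frame_t1.Z)
    F_exact = nearest_filled(flow_buf, hit)                      # S105 (1)-(2)
    I_mid = generate_target_frame(frame_t.I, F_exact)            # S105 (3)
    display_list.append(I_mid)                                   # S106
```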
The frame interpolation method of this embodiment can keep the extra time cost of interpolating one frame within 16.7 ms, and the generated images reach an average PSNR (Peak Signal-to-Noise Ratio) of 30 dB together with a high SSIM (Structural Similarity); it can effectively raise the frame rate when rendering relatively complex scenes in real time and increase picture smoothness.
Based on the same inventive concept, the embodiments also provide a frame interpolation device 200 based on real-time rendering.
As shown in fig. 2, the device includes an image frame acquisition module 210, an image frame optical flow calculation module 220, a target interpolated frame estimation module 230, a color buffer update and mapping module 240, a target interpolated frame generation module 250, and a display module 260.
Wherein, the image frame acquisition module 210 is configured to acquire a first image frame and a second image frame which are continuously rendered;
the image frame optical flow calculation module 220 is configured to calculate a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame;
the target interpolated frame estimation module 230 is configured to estimate a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow;
the color buffer update and mapping module 240 is configured to update the color buffer of the second image frame according to the third optical flow and map the updated second image frame according to the fourth optical flow;
the target interpolated frame generation module 250 is configured to calculate an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result and generate the target interpolated frame based on the accurate third optical flow;
and the display module 260 is configured to insert the target interpolated frame into the display list and wait for display in list order.
It should be noted that the division into the above functional modules when the frame interpolation device performs real-time rendering interpolation is given only as an illustration; in practice, the above functions may be allocated to different functional modules as needed, i.e., the internal structure of the terminal or server may be divided into different functional modules to complete all or part of the functions described above. In addition, the device embodiment and the method embodiment of frame interpolation based on real-time rendering share the same conception; the detailed implementation of the device is described in the method embodiment and is not repeated here.
Rendering a scene often requires real-time ray tracing and complex global illumination and shadow computation; the complexity and dynamics of the scene, the motion and change of objects, and changes in illumination all increase rendering cost. By analyzing and processing the information of the existing previous and next frames, the device of this embodiment accurately estimates the motion vectors between the intermediate frame and the input frames, and thus generates and inserts the intermediate frame, reducing the rendering time spent on the related complex computation and scene processing and increasing picture smoothness.
Based on the same inventive concept, the embodiments also provide a computing device. As shown in fig. 3, at the hardware level the computing device includes a processor, an internal bus, a network interface, memory, and storage, and may also include hardware required by other services. The processor reads the corresponding computer program from storage into memory and runs it to implement the frame interpolation method based on real-time rendering, comprising the following steps:
S101, acquiring a first image frame and a second image frame which are continuously rendered;
S102, calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame;
S103, estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow;
S104, updating the color buffer of the second image frame according to the third optical flow, and mapping the updated second image frame according to the fourth optical flow;
S105, calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow;
S106, inserting the target interpolated frame into the display list and waiting for display in list order.
The memory may be near-end volatile memory such as RAM, or non-volatile memory such as ROM, FLASH, a floppy disk, a mechanical hard disk, or remote cloud storage. The processor may be a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), or a Field Programmable Gate Array (FPGA); that is, the steps of the frame interpolation method based on real-time rendering may be implemented by these processors.
Based on the same inventive concept, an embodiment further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above frame interpolation method based on real-time rendering, comprising:
S101, acquiring a first image frame and a second image frame which are continuously rendered;
S102, calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame;
S103, estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow;
S104, updating the color buffer of the second image frame according to the third optical flow, and mapping the updated second image frame according to the fourth optical flow;
S105, calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow;
S106, inserting the target interpolated frame into the display list and waiting for display in list order.
The computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In the process of generating the intermediate frame, the computing device of the invention uses the GPU (graphics processing unit) to accelerate computation and improve rendering performance: the required optical flow and image data are generated by GPU parallel computation, reducing the amount of computation in the interpolation process, improving rendering efficiency and speed, and achieving more efficient frame interpolation.
The foregoing detailed description of preferred embodiments and of the advantages of the invention is illustrative only and is not intended to limit the invention; any changes, additions, substitutions, and equivalents made within the spirit and principles of the invention are intended to fall within its scope.

Claims (10)

1. A frame interpolation method based on real-time rendering, comprising:
acquiring a first image frame and a second image frame which are continuously rendered;
calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame, wherein an optical flow indicates the motion of each pixel from one image frame to the other, establishing a pixel-level correspondence;
estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow;
updating the color buffer of the second image frame according to the third optical flow, and mapping the updated second image frame according to the fourth optical flow;
calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow;
and inserting the target interpolated frame into the display list and waiting for display in list order.
2. The frame interpolation method based on real-time rendering according to claim 1, wherein the first optical flow F_{t→t+1} and the second optical flow F_{t+1→t} are obtained by reprojection, and obtaining the first optical flow F_{t→t+1} by reprojection comprises:
obtaining the coordinates P_t of each pixel of the first image frame;
using the observation matrix V_{t+1} and the camera projection matrix K used when rendering the second image frame, projecting the coordinates P_t of each pixel of the first image frame onto the second image frame to obtain the corresponding coordinates π_{t→t+1}(P_t) in the second image frame, calculated as:
π_{t→t+1}(P_t) = K V_{t+1} V_t^{-1} K^{-1} P_t
wherein π_{t→t+1} denotes the reprojection mapping a point P_t of the first image frame t onto the second image frame t+1, K denotes the camera projection matrix, V_{t+1} denotes the observation matrix of the second image frame, V_t^{-1} denotes the inverse of the observation matrix of the first image frame, and K^{-1} denotes the inverse of the camera projection matrix;
subtracting the coordinates P_t of each pixel of the first image frame from the corresponding coordinates π_{t→t+1}(P_t) in the second image frame and performing a linear operation to obtain the first optical flow F_{t→t+1}, calculated as:
F_{t→t+1}[p] = π_{t→t+1}(P_t) − P_t
wherein F_{t→t+1}[p] denotes the vector of the first optical flow from the first image frame t to the second image frame t+1 at the two-dimensional pixel coordinate p = (p_x, p_y), with p_x the pixel's coordinate on the x-axis and p_y its coordinate on the y-axis.
3. The frame interpolation method based on real-time rendering according to claim 1, wherein the third optical flow from the first image frame to the target interpolated frame is estimated based on the first optical flow, and the fourth optical flow from the second image frame to the target interpolated frame is estimated based on the second optical flow, calculated as:
F_{t→t+0.5} = 0.5 · F_{t→t+1}
F_{t+1→t+0.5} = 0.5 · F_{t+1→t}
wherein F_{t→t+1} denotes the first optical flow from the first image frame t to the second image frame t+1, F_{t→t+0.5} the third optical flow from the first image frame t to the target interpolated frame t+0.5, F_{t+1→t} the second optical flow from the second image frame t+1 to the first image frame t, and F_{t+1→t+0.5} the fourth optical flow from the second image frame t+1 to the target interpolated frame t+0.5.
4. The frame interpolation method based on real-time rendering according to claim 1, wherein updating the color buffer of the second image frame according to the third optical flow and mapping the updated second image frame according to the fourth optical flow comprises:
updating the color buffer of the second image frame according to the third optical flow F_{t→t+0.5}, calculated as:
I_{t+1}[p_{t+1}] = F_{t→t+0.5}[p_t]
wherein I_{t+1}[p_{t+1}] denotes the pixel color information at the two-dimensional coordinate p_{t+1} in the second image frame t+1, and F_{t→t+0.5}[p_t] denotes the value of the third optical flow from the first image frame t to the target interpolated frame t+0.5 at the corresponding two-dimensional coordinate p_t in the first image frame;
mapping the corresponding three-dimensional coordinate points P_{t+1} in the updated second image frame according to the fourth optical flow F_{t+1→t+0.5}, calculated as:
p′_{t+0.5} = (P_{t+1} + F_{t+1→t+0.5}).xy
F_{t+0.5}[p′_{t+0.5}] = I_{t+1}[p_{t+1}]
wherein p′_{t+0.5} denotes the rough two-dimensional coordinate in the target interpolated frame t+0.5 computed by mapping the two-dimensional coordinate p_{t+1} of the second image frame through the fourth optical flow F_{t+1→t+0.5}, P_{t+1} denotes the three-dimensional coordinate corresponding to the two-dimensional coordinate p_{t+1} in the second image frame, .xy takes the abscissa and ordinate of the result, and F_{t+0.5}[p′_{t+0.5}] denotes the colored information at the corresponding coordinate in the accurate third optical flow being generated.
5. The frame interpolation method based on real-time rendering according to claim 1, wherein calculating the accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result comprises:
removing the occluded points within the same pixel from the mapping result;
and interpolating the mapping result to obtain the accurate third optical flow.
6. The frame interpolation method based on real-time rendering according to claim 5, wherein removing the occluded points within the same pixel from the mapping result comprises:
traversing each pixel in the mapping result to obtain the points distributed within it;
and comparing the depth values of the points scattered into the same pixel, eliminating the points farther from the camera, and keeping only the point nearest to the camera.
7. The frame interpolation method based on real-time rendering according to claim 1, wherein generating the target interpolated frame based on the accurate third optical flow comprises:
assigning the pixel color information at each two-dimensional coordinate in the first image frame to the corresponding two-dimensional coordinate of the target interpolated frame according to the accurate third optical flow;
and for each remaining uncolored pixel in the target interpolated frame, searching from near to far and assigning the value of the nearest colored pixel, thereby generating the target interpolated frame.
8. A frame interpolation device based on real-time rendering, comprising:
an image frame acquisition module for acquiring a first image frame and a second image frame which are continuously rendered;
an image frame optical flow calculation module for calculating a first optical flow from the first image frame to the second image frame and a second optical flow from the second image frame to the first image frame, wherein an optical flow indicates the motion of each pixel from one image frame to the other, establishing a pixel-level correspondence;
a target interpolated frame estimation module for estimating a third optical flow from the first image frame to the target interpolated frame based on the first optical flow, and a fourth optical flow from the second image frame to the target interpolated frame based on the second optical flow;
a color buffer update and mapping module for updating the color buffer of the second image frame according to the third optical flow and mapping the updated second image frame according to the fourth optical flow;
a target interpolated frame generation module for calculating an accurate third optical flow from the first image frame to the target interpolated frame based on the mapping result, and generating the target interpolated frame based on the accurate third optical flow;
and a display module for inserting the target interpolated frame into the display list and waiting for display in list order.
9. A computing device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the frame interpolation method based on real-time rendering of any of claims 1-7.
10. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the frame interpolation method based on real-time rendering of any of claims 1-7.