CN112596843A - Image processing method, image processing device, electronic equipment and computer readable storage medium - Google Patents
- Publication number
- CN112596843A CN112596843A CN202011594887.3A CN202011594887A CN112596843A CN 112596843 A CN112596843 A CN 112596843A CN 202011594887 A CN202011594887 A CN 202011594887A CN 112596843 A CN112596843 A CN 112596843A
- Authority
- CN
- China
- Prior art keywords
- target
- image
- original frame
- target original
- frame image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/452—Remote windowing, e.g. X-Window System, desktop virtualisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/08—Protocols specially adapted for terminal emulation, e.g. Telnet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440281—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The application provides an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium, relating to the technical field of image processing. The method comprises the following steps: determining a display frame rate of the image data, and determining target original frame images according to a target frame rate; generating corresponding target intermediate frame images from the target original frame images; and inserting the target intermediate frame images between the target original frame images to generate target image data. By acquiring the display frame rate and the target frame rate of the image data to be displayed, the method determines which target original frame images require frame interpolation, generates target intermediate frame images from adjacent target original frame images, and inserts each target intermediate frame image between the corresponding adjacent target original frame images. The refresh frame rate of the desktop can thus meet a preset requirement while the same graphics data is transmitted, effectively improving the smoothness of desktop image display and making the display continuous and fluid.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium.
Background
Cloud desktop technology typically virtualizes a user's desktop on a cloud platform (the server side); the user connects to the virtual desktop through a remote display protocol on a client device, and there is no substantial difference in user experience between using the virtual desktop and using a conventional local desktop. Cloud desktop technology greatly reduces the hardware performance requirements on the client device, which can be a thin client, a tablet, a mobile phone, or another simple portable device. From an enterprise perspective, a cloud desktop keeps all data stored on a strictly controlled cloud platform, ensuring information security.
In cloud desktops and similar scenarios, the fluency of the desktop determines the user experience. The methods currently used to improve desktop fluency are: compressing the graphics data, or installing a high-performance graphics card on the server side, mapping it directly to the virtual machine, and displaying the output directly on the client. However, these methods all depend on internet transmission performance, the uncertainty of network transmission, and graphics data processing speed, so problems arise, chiefly stuttering of the client desktop or tearing of dynamic images during use.
Therefore, in existing cloud desktop technology, improving desktop fluency requires a high-performance graphics card on the server side and places high demands on network speed, and problems such as stuttering or tearing of dynamic images readily occur; improvement is therefore needed.
Disclosure of Invention
The purpose of the present application is to solve at least one of the above technical defects, in particular that, in existing cloud desktop technology, improving desktop fluency requires a high-performance graphics card on the server side, places high demands on network speed, and is prone to stuttering and tearing of dynamic images.
In a first aspect, an image processing method is provided, which includes:
acquiring image data to be displayed, and determining the display frame rate of the image data;
determining at least two target original frame images in the image data based on the relationship between the display frame rate and a preset target frame rate;
generating corresponding target intermediate frame images according to any two adjacent target original frame images in at least two target original frame images;
and inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
As an alternative embodiment of the present application, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
taking the target original frame image displayed earlier of any two adjacent target original frame images as a first target original frame image, and the target original frame image displayed later as a second target original frame image;
determining a change area of the second target original frame image compared with the first target original frame image;
a target intermediate frame image is generated based on the changed region and the first target original frame image.
As an alternative embodiment of the present application, determining a change area of the second target original frame image compared to the first target original frame image includes:
acquiring a first tree diagram and a second tree diagram, wherein the first tree diagram is used for representing the structural relationship of each display element in a first target original frame image, and the second tree diagram is used for representing the structural relationship of each display element in a second target original frame image;
determining the nodes of the second tree diagram that have changed compared with the first tree diagram, and determining, based on the changed nodes, the display elements of the second target original frame image that have changed compared with the first target original frame image;
the changed region is determined based on the changed display element.
As an alternative embodiment of the present application, generating a target intermediate frame image based on the changed region and the first target original frame image includes:
determining the type of the display element changed in the change area;
when the type of the change of the display element is a position change type, generating a target intermediate frame image based on the position of the display element in the change area and the position of the display element in the first target original frame image;
when the type of the change of the display element is an image change type, a target intermediate frame image is generated based on the image of the display element in the change area and the image of the display element in the first target original frame image.
As an optional embodiment of the present application, determining at least two target original frame images in image data based on a relationship between a display frame rate and a preset target frame rate includes:
determining the number of target intermediate frame images required to be inserted in unit time based on the relation between the display frame rate and a preset target frame rate;
and determining the target original frame images based on the number of the target intermediate frame images and the display frame rate.
As an alternative embodiment of the present application, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
and mixing pixel points in any two adjacent target original frame images according to a preset proportion to generate a corresponding target intermediate frame image.
As an alternative embodiment of the present application, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
calculating the weighted average value of each pixel point in two adjacent target original frame images according to a preset weight;
and generating the target intermediate frame image according to the weighted average value.
In a second aspect, there is provided an image processing apparatus comprising:
the image data acquisition module is used for acquiring image data to be displayed and determining the display frame rate of the image data;
the target original frame image determining module is used for determining at least two target original frame images in the image data based on the relation between the display frame rate and a preset target frame rate;
the target intermediate frame image generation module is used for generating corresponding target intermediate frame images according to any two adjacent target original frame images in at least two target original frame images;
and the target image generation module is used for inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
In a third aspect, an electronic device is provided, which includes:
a processor, a memory, and a bus;
a bus for connecting the processor and the memory;
a memory for storing operating instructions;
and the processor is used for executing the image processing method by calling the operation instruction.
In a fourth aspect, there is provided a computer-readable storage medium having stored thereon at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the image processing method described above.
According to the method and the device, the display frame rate and the target frame rate of the image data to be displayed are obtained, the target original frame images needing frame interpolation are determined, the target intermediate frame images are generated based on the adjacent target original frame images, the target intermediate frame images are inserted between the corresponding adjacent target original frame images, the refreshing frame rate of a desktop can meet the preset requirement under the condition that the same graphic data are transmitted, the smoothness of desktop image display is effectively improved, and the display is continuous and smooth.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating generation of a target intermediate frame image according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a method for generating a target intermediate frame image according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating a position change of a display element according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram illustrating a graphical variation of a display element according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of generating a target inter-frame image by superposition according to an embodiment of the present disclosure;
fig. 7 is a schematic flowchart of a method for determining a change area according to an embodiment of the present application;
FIG. 8 is a flowchart illustrating a method for generating a target inter-frame image according to a change type according to an embodiment of the present application;
FIG. 9 is a schematic diagram of generating a target inter-frame image according to a position change according to an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of generating a target inter-frame image by a graphics change according to an embodiment of the present application;
fig. 11 is a schematic flowchart of a method for determining a target original frame image according to an embodiment of the present disclosure;
fig. 12 is a schematic flowchart of a method for generating a target intermediate frame image based on an average value of pixel points according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The above and other features, advantages and aspects of various embodiments of the present application will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms referred to in this application will first be introduced and explained:
(1) a display frame rate, which is a frequency at which bitmap images in units of frames continuously appear on the display, and is used to represent the number of image frames displayed in a unit time in the desktop display screen;
(2) the target original frame images are images corresponding to image data to be displayed, wherein target intermediate frame images need to be inserted between two adjacent target original frame images;
(3) the target intermediate frame image is an image generated according to two adjacent target original frame images and is used for being inserted between the two adjacent target original frame images;
- (4) a tree diagram is a tree structure whose nodes are the display elements in an image (such as window frames, title bars, menus, buttons, sub-windows, views, and the like), where each node records the graphic information and position information of its corresponding display element.
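As an illustrative sketch only (not part of the patent), such a tree of display elements might be represented as follows in Python; the class name, field names, and example elements are all hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class DisplayNode:
    """One node of the tree diagram: a display element with its graphic
    information, its position, and its child elements."""
    name: str                        # e.g. "window", "title_bar", "button"
    position: tuple                  # (x, y) of the element's top-left corner
    graphic: bytes = b""             # placeholder for the element's pixel data
    children: list = field(default_factory=list)

# A window whose title bar contains a close button:
window = DisplayNode("window", (0, 0), children=[
    DisplayNode("title_bar", (0, 0), children=[
        DisplayNode("close_button", (300, 0)),
    ]),
])
```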
The image processing method provided by the embodiments of the present application can be applied to device-cloud collaborative rendering of a GUI (Graphical User Interface). Existing device-cloud collaborative GUI rendering falls into two modes. In the first mode, the server is responsible for rendering the entire desktop GUI, the rendered desktop frames are transmitted and synchronized to the client over the internet, and the client displays them directly without performing any rendering. In the second mode, the client participates in rendering the desktop: the server transmits and synchronizes the relevant instructions, primitives, and graphics resources (such as textures) to the client over the internet, the client and server together complete the rendering of each desktop frame image, and the client displays the final desktop frame image. In both modes, the display frame rate of the final original desktop frames obtained at the client may fail to meet the target frame rate, so the client's display is not smooth. To address this, a high-performance graphics card is usually installed on the server side, combined with compression of the graphics data; the card is mapped directly to the virtual machine and the output is displayed directly on the client. However, this approach depends on internet transmission performance, the uncertainty of network transmission, and graphics data processing speed, so problems arise, chiefly stuttering of the client desktop or tearing of dynamic images during use.
The application provides an image processing method, an image processing device, an electronic device and a computer-readable storage medium, which aim to solve the above technical problems in the prior art.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
An embodiment of the present application provides an image processing method, as shown in fig. 1, the method includes:
step S101, obtaining image data to be displayed, and determining the display frame rate of the image data.
In the embodiment of the present application, the image data to be displayed refers to image data, such as video data, sent by a server and to be displayed at a client. The server is a single server, a cluster of servers, a virtualization platform, or a cloud computing service center; the client is a terminal device with a video playing function, for example a mobile phone, a tablet computer, an e-book reader, smart glasses, a smart watch, a laptop computer, or a desktop computer. After the image data to be displayed is acquired, its display frame rate must be determined; here the display frame rate refers to the frame rate at which the client displays the image data after receiving it from the server. For example, the server transmits image data to be displayed to the client, where the display frame rate of the image data is 30 frames per second.
Step S102, determining at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate.
In this embodiment of the application, the preset target frame rate may be set by the user at the client or may be set at the server. Once the target frame rate is set and the client receives the image data to be displayed from the server, the image data must be displayed at the target frame rate; however, the display frame rate of the image data may not meet the target frame rate, in which case frame interpolation must be performed on the image data. Before interpolation, at least two target original frame images must be determined in the image data based on the relationship between the display frame rate and the target frame rate. For example, if the display frame rate is 30 frames per second and the preset target frame rate is 60 frames per second, one target intermediate frame image needs to be inserted between every two adjacent original frame images to satisfy the preset target frame rate, and all original frame images in the image data to be displayed are then target original frame images.
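The arithmetic implied by this example can be sketched as follows; the helper name is hypothetical, and it assumes the inserted frames are spread evenly across the roughly display-rate-many gaps per second:

```python
def interpolation_plan(display_fps, target_fps):
    """Return (frames_to_insert_per_second, frames_per_gap) needed to
    raise display_fps to target_fps, assuming the inserted frames are
    spread evenly over the ~display_fps gaps between adjacent originals."""
    inserts = max(target_fps - display_fps, 0)
    return inserts, inserts / display_fps

# 30 fps source, 60 fps target: one intermediate frame per adjacent pair.
```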
Step S103, for any two adjacent target original frame images in the at least two target original frame images, generating corresponding target intermediate frame images according to any two adjacent target original frame images.
In an implementation of the present application, for any two adjacent target original frame images, the target intermediate frame image to be inserted between them is generated from those two images. How the target intermediate frame image is generated may be decided by the client's performance requirements. For example, when the client hardware is weak or the requirements for image display are low, a transparency-based frame-blending algorithm may be used to blend the two adjacent target original frame images according to a transparency ratio, such as weighting each pixel of the two images at 50% and blending them to obtain the target intermediate frame image. When the client has higher requirements for image display, the images may instead be processed with a motion-compensation algorithm and a motion-recognition-based optical flow method: motion recognition is performed on the objects that change between the two adjacent target original frame images, or the movement trend of each pixel is computed, and the target intermediate frame image is calculated from this, yielding a better interpolation effect.
For convenience of illustration, consider the specific example shown in fig. 2, where the two adjacent target original frame images are 201 and 202. For ease of description, the target original frame image 201 is a black image whose pixels all have RGB values (0,0,0), and the target original frame image 202 is a white image whose pixels all have RGB values (255,255,255). When the target intermediate frame image 203 is generated from the target original frame images 201 and 202, the RGB values of corresponding pixels in 201 and 202 can be averaged to obtain a gray target intermediate frame image 203 whose pixels all have RGB values (128,128,128). Of course, this example is chosen only for ease of description; in practice the colors of the images can be far richer.
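A minimal sketch of this averaging step (representing frames as flat lists of RGB tuples is an illustrative simplification, not the patent's implementation):

```python
def average_frames(frame_a, frame_b):
    """Blend two frames by averaging corresponding RGB pixels.
    Frames are lists of (R, G, B) tuples of equal length."""
    return [
        tuple(round((a + b) / 2) for a, b in zip(pa, pb))
        for pa, pb in zip(frame_a, frame_b)
    ]

black = [(0, 0, 0)] * 4        # stand-in for image 201
white = [(255, 255, 255)] * 4  # stand-in for image 202
gray = average_frames(black, white)  # every pixel becomes (128, 128, 128)
```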
And step S104, inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
In the embodiment of the present application, after generating the target intermediate frame image, the target intermediate frame image is inserted between two corresponding adjacent target original frame images to generate target image data, wherein the display frame rate of the target image data is the same as the preset target frame rate.
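The insertion step can be sketched as interleaving original and generated frames; the function name is hypothetical, and a single intermediate frame per gap (the 30-to-60 fps case above) is assumed:

```python
def insert_intermediate_frames(originals, make_intermediate):
    """Insert one generated frame between each adjacent pair of original
    frames, roughly doubling the frame rate of the sequence."""
    result = []
    for prev_frame, next_frame in zip(originals, originals[1:]):
        result.append(prev_frame)
        result.append(make_intermediate(prev_frame, next_frame))
    result.append(originals[-1])  # last original has no following pair
    return result
```

With numbers standing in for frames, `insert_intermediate_frames([10, 20, 30], lambda a, b: (a + b) // 2)` yields `[10, 15, 20, 25, 30]`.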
By acquiring the display frame rate and the target frame rate of the image data to be displayed, the present application determines the target original frame images that require frame interpolation, generates target intermediate frame images from adjacent target original frame images, and inserts each target intermediate frame image between the corresponding adjacent target original frame images, so that the refresh frame rate of the desktop meets the preset requirement while the same image data is transmitted, effectively improving the smoothness of desktop image display and making the display continuous and fluid.
The embodiment of the present application provides a possible implementation manner, in which as shown in fig. 3, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
in step S301, a target original frame image with a display time earlier in any two adjacent target original frame images is taken as a first target original frame image, and a target original frame image with a display time later is taken as a second target original frame image.
In the embodiment of the present application, the designations "first" and "second" target original frame image are labels only and do not fix which frame is which. In another implementation of this embodiment, the target original frame image displayed earlier of any two adjacent target original frame images may be taken as the second target original frame image, and the one displayed later as the first target original frame image.
In step S302, a change area of the second target original frame image compared to the first target original frame image is determined.
In this embodiment, before the target intermediate frame image is generated, the change area of the second target original frame image compared with the first target original frame image may be determined. A change area is an area in which a display element has changed, such as a change in the element's graphic or in its position. For example, as shown in fig. 4, a mouse pointer 402 appears in the lower-left corner of the first target original frame image 401 and has moved to the upper-right corner in the second target original frame image 403; the change areas of the second target original frame image 403 compared with the first target original frame image 401 are then the original position 404 of the mouse pointer and its current position 405. As another example, shown in fig. 5, a square is displayed in the first target original frame image 501 and becomes a circle in the second target original frame image 502; the change area of the second target original frame image 502 is then the area 503 containing the square and the circle. Of course, in this embodiment the changed element may also be a window frame, title bar, menu, button, sub-window, view, and so on.
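As a hedged illustration: one simple way to locate such a change area is the bounding box of the differing pixels. (The patent's own method determines the change area from the tree diagrams of display elements; this pixel-level diff is only an illustrative stand-in.)

```python
def changed_region(frame_a, frame_b):
    """Bounding box (min_row, min_col, max_row, max_col) of the pixels
    that differ between two frames given as 2-D lists of pixel values;
    returns None when the frames are identical."""
    diff = [(r, c)
            for r, row in enumerate(frame_a)
            for c, px in enumerate(row)
            if px != frame_b[r][c]]
    if not diff:
        return None
    rows = [r for r, _ in diff]
    cols = [c for _, c in diff]
    return (min(rows), min(cols), max(rows), max(cols))
```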
In step S303, a target intermediate frame image is generated based on the change area and the first target original frame image.
In the embodiment of the present application, after the change area is determined, a target intermediate frame image may be generated based on the change area and the other areas of the first target original frame image: an intermediate image is generated for the part of the frame corresponding to the change area, while the images of the other areas of the first target original frame image are reused, and together they form the target intermediate frame image.
As a possible implementation manner of the present application, taking a specific embodiment as an example for convenience of illustration, as shown in fig. 6, a change region 602 exists in a first target original frame image 601. The change region 602 is black in the first target original frame image, with the RGB values of all its pixel points being (0, 0, 0), while in the second target original frame image the change region is displayed as white, with the RGB values of all its pixel points being (255, 255, 255). Based on the foregoing embodiment, the change region in the target intermediate frame image is gray, with the RGB values of its pixel points being (128, 128, 128); this region is then combined with the areas of the first target original frame image outside the change region to generate the target intermediate frame image 603.
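The region-reuse idea above can be sketched as follows. This is a minimal illustration, not the patented implementation: frames are represented as 2-D lists of grayscale values, the region bounds and the 50/50 blend inside the change region are assumptions for the example, and all pixels outside the region are copied from the first frame unchanged.

```python
# Hypothetical sketch: blend only the changed region between two frames,
# reusing all other pixels from the first frame. Frames are 2-D lists of
# grayscale values; `region` is (top, left, bottom, right), ends exclusive.
def blend_region(first, second, region):
    top, left, bottom, right = region
    # Start from a copy of the first frame so unchanged areas are reused as-is.
    mid = [row[:] for row in first]
    for y in range(top, bottom):
        for x in range(left, right):
            # Simple 50/50 average inside the changed region.
            mid[y][x] = (first[y][x] + second[y][x]) // 2
    return mid

first = [[0, 0], [0, 0]]        # all black
second = [[255, 255], [0, 0]]   # top row turned white
mid = blend_region(first, second, (0, 0, 1, 2))
# top row becomes gray, bottom row is reused from the first frame
```

Only the pixels inside the region are recomputed, which is the source of the workload reduction described in this embodiment.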
According to the image processing method and device, by identifying the change areas in two adjacent target original frame images, changing only those areas, and reusing the other parts, the workload of image processing can be greatly reduced and the efficiency of image processing improved.
The present application provides a possible implementation manner, in which as shown in fig. 7, determining a change area of the second target original frame image compared to the first target original frame image includes:
step S701, a first tree diagram and a second tree diagram are obtained, where the first tree diagram is used to represent the structural relationship of each display element in the first target original frame image, and the second tree diagram is used to represent the structural relationship of each display element in the second target original frame image.
In the embodiment of the present application, the tree diagram represents the structural relationship of the display elements in a target original frame image, and each display element in the target original frame image (such as a window frame, a title bar, a menu, a button, a sub-window, or a view) may serve as a node of the tree diagram. Each node may further record graphic information of its display element: for the uppermost layer, graphic information of an input device such as the mouse, i.e., its position, image, and object ID (screen coordinates, texture buffer, texture ID); for an APP application, the frame interface, the position of the window display content, the image, and ID information; and application logos of other desktop items, pop-up windows, and the like may also be included.
Step S702 is to determine a node of the second tree-like graph that changes compared to the first tree-like graph, and determine a display element of the second target original frame image that changes compared to the first target original frame image based on the changed node.
In the embodiment of the application, each node in the tree diagram records the graphic information of its corresponding display element. When the position or the graphic of a display element changes, the information recorded by the corresponding node changes, so the changed display elements can be determined by comparing the information recorded in each node of the first tree diagram and the second tree diagram.
In step S703, a change area is determined based on the changed display element.
For the embodiment of the present application, after the changed display element is determined, the changed region can be determined based on the region where the display element is located.
According to the method and the device, the tree diagrams are acquired, the changed display elements are determined from the information recorded in the nodes of the tree diagrams, and the areas where the changed display elements are located are determined as the change areas, so that the change areas are determined accurately.
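The node-by-node comparison of steps S701 to S703 can be sketched as follows. This is an assumed data layout, not the patent's: each node is a dictionary with an `id`, a `pos`, an `image` signature, and `children`, and the two trees are assumed to have the same shape so corresponding nodes can be paired directly.

```python
# Hypothetical sketch of the tree comparison: a node "changes" when its
# recorded position or image signature differs between the two trees.
def diff_trees(node_a, node_b, changed=None):
    if changed is None:
        changed = []
    if (node_a["pos"], node_a["image"]) != (node_b["pos"], node_b["image"]):
        changed.append(node_b["id"])
    # Walk corresponding children in lockstep (same structure assumed).
    for child_a, child_b in zip(node_a["children"], node_b["children"]):
        diff_trees(child_a, child_b, changed)
    return changed

tree1 = {"id": "window", "pos": (0, 0), "image": "w0", "children": [
    {"id": "cursor", "pos": (10, 90), "image": "arrow", "children": []}]}
tree2 = {"id": "window", "pos": (0, 0), "image": "w0", "children": [
    {"id": "cursor", "pos": (90, 10), "image": "arrow", "children": []}]}
# only the cursor node's recorded position differs between the two trees
```

The regions occupied by the returned elements would then be taken as the change areas.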
The embodiment of the present application provides a possible implementation manner, in which as shown in fig. 8, generating a target intermediate frame image based on a change area and a first target original frame image includes:
in step S801, the type of change of the display element in the change area is determined.
In the embodiment of the application, after the change area is determined, the type of the change of the display element in the change area may be determined, and then the target intermediate frame image may be generated based on the type.
In step S802, when the type of the change of the display element is a position change type, a target intermediate frame image is generated based on the position of the display element in the change area and the position of the display element in the first target original frame image.
In the embodiment of the present application, taking the foregoing specific embodiment as an example for convenience of illustration, as shown in fig. 9, a mouse pointer 902 is located in the lower left corner of the first target original frame image 901 and has moved to the upper right corner in the second target original frame image 903. The change area of the second target original frame image 903 compared to the first target original frame image 901 can then be determined as the original position 904 of the mouse pointer and the current position area 905 of the mouse pointer; the changed display element is the mouse pointer, and the change type is the position change type. Based on this change, the target intermediate frame image can be generated, in which the mouse pointer may be in the middle of the window, the coordinates of its position being the average of the coordinates of the two positions before and after, as indicated by reference numeral 906.
In step S803, when the type of the display element being changed is the image change type, a target intermediate frame image is generated based on the image of the display element in the change area and the image of the display element in the first target original frame image.
In the embodiment of the present application, taking the foregoing specific embodiment as an example for convenience of description, as shown in fig. 10, a square displayed in the first target original frame image 1001 changes into a circle in the second target original frame image 1002. The changed display element is the graphic, and the change type is the image change type, so the image in the target intermediate frame image may be generated from the images before and after the change, as indicated by reference numeral 1003; optionally, the images before and after the change may both be displayed in dotted-line form.
According to the method and the device, the change type of the display element is determined, and the mode of generating the target intermediate frame image is determined based on different change types, so that the display effect is better.
The present application provides a possible implementation manner, in which as shown in fig. 11, determining at least two target original frame images in image data based on a relationship between a display frame rate and a preset target frame rate includes:
in step S1101, the number of target inter-frame images to be inserted per unit time is determined based on the relationship between the display frame rate and the preset target frame rate.
In the embodiment of the application, the display frame rate indicates the number of image frames displayed per second for the image data to be displayed, and the target frame rate indicates the number of frames per second required by the client; the number of target intermediate frame images to be inserted per unit time can be calculated from the target frame rate and the display frame rate. For example, if the display frame rate is 30 frames per second and the target frame rate is 60 frames per second, then 60 - 30 = 30, i.e., 30 target intermediate frame images need to be inserted per second.
In step S1102, a target original frame image is determined based on the number of target intermediate frame images and the display frame rate.
In the embodiment of the present application, after the number of target intermediate frame images to be inserted per unit time is determined, the target original frame images can be determined from that number and the display frame rate. For example, if 30 target intermediate frame images are to be inserted per unit time and the display frame rate is also 30 frames, then all the original frame images are target original frame images. To explain more clearly with another embodiment: if 15 target intermediate frame images are to be inserted per unit time and the display frame rate is 60 frames, the 15 frames must be inserted uniformly among the 60 original frames, one target intermediate frame image after every 4 original frame images, so the target original frame images can be determined to be 30 frames.
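The frame-count arithmetic above can be sketched as follows. This is a minimal sketch under assumptions: the 75 fps target in the second check is an assumed value consistent with inserting 15 frames per second at a 60 fps display rate, and the rule for counting target original frames follows the two worked examples in this embodiment.

```python
# Hypothetical sketch of the frame-count arithmetic in this embodiment.
def interpolation_plan(display_fps, target_fps):
    inserted = target_fps - display_fps   # intermediate frames per second
    gap = display_fps // inserted         # original frames between insertions
    # Each inserted frame sits between an adjacent pair of target original
    # frames. With disjoint pairs that is 2 originals per insert; when every
    # gap receives an insert, every original frame is a target original frame.
    target_originals = display_fps if gap == 1 else inserted * 2
    return inserted, gap, target_originals

# 30 fps display, 60 fps target: 30 inserts/s, all 30 originals are targets.
# 60 fps display, 15 inserts/s: one insert every 4 originals, 30 targets.
```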
According to the method and the device, the target original frame image needing frame insertion operation is determined through the display frame rate and the target frame rate, the accuracy of target intermediate frame image insertion is guaranteed, and the smoothness of image display is guaranteed.
The embodiment of the present application provides a possible implementation manner, in which generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
and mixing pixel points in any two adjacent target original frame images according to a preset proportion to generate a corresponding target intermediate frame image.
In this embodiment of the application, when a target intermediate frame image is generated from two adjacent target original frame images, the pixel points of the two target original frame images may be mixed directly according to a preset ratio. For example, 50% of the pixel points in the first target original frame image and 50% of the pixel points in the second target original frame image are mixed to generate the target intermediate frame image; optionally, the positions taken from the first target original frame image and the positions taken from the second target original frame image do not overlap and are distributed relatively uniformly.
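One way to realize the non-overlapping, evenly distributed 50/50 mixing described above is a checkerboard selection pattern. The checkerboard is an assumption chosen for illustration; the embodiment only requires that the positions be non-repeating and relatively uniform.

```python
# Hypothetical sketch: take pixels alternately from each frame in a
# checkerboard pattern, so half come from the first frame and half from
# the second, evenly distributed and never from both at the same position.
def mix_frames(first, second):
    return [[first[y][x] if (x + y) % 2 == 0 else second[y][x]
             for x in range(len(first[0]))]
            for y in range(len(first))]

a = [[0, 0], [0, 0]]          # frame of black pixels
b = [[255, 255], [255, 255]]  # frame of white pixels
mixed = mix_frames(a, b)      # alternating black/white pixels
```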
According to the image processing method and device, the pixel points of the two adjacent target original frame images are mixed, the target intermediate frame image can be obtained quickly, and the image processing efficiency is high.
The embodiment of the present application provides a possible implementation manner, in which as shown in fig. 12, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
step S1201, calculating a weighted average value of each pixel point in two adjacent target original frame images according to a preset weight;
in step S1202, a target intermediate frame image is generated from the weighted average.
In the embodiment of the present application, when a corresponding target intermediate frame image is generated from any two adjacent target original frame images, it may be generated from the pixel points of each target original frame image; optionally, a weighted average of the corresponding pixel points in the two target original frame images may be calculated according to the preset weights, and the target intermediate frame image generated from the weighted average. For example, taking a simple embodiment for convenience of explanation, if each pixel point in the first target original frame image is 255 and each pixel point in the second target original frame image is 1, then with preset weights of, say, 0.7 and 0.3, each pixel point in the target intermediate frame image is calculated to be 178. Of course, in an actual implementation the pixel points may all differ, but the calculation method is the same as in the embodiment of the present application and also falls within the protection scope of the present application.
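The weighted average of steps S1201 and S1202 can be sketched per pixel as follows. The 0.7/0.3 split is an assumed preset weight, chosen because it reproduces the 178 value in the example above; any preset weights summing to 1 would fit this embodiment.

```python
# Hypothetical sketch: per-pixel weighted average with an assumed preset
# weight of 0.7 for the first frame and 0.3 for the second.
def weighted_pixel(p_first, p_second, w_first=0.7):
    # int() truncates the fractional part of the weighted sum
    return int(p_first * w_first + p_second * (1 - w_first))

weighted_pixel(255, 1)  # 255*0.7 + 1*0.3 ≈ 178.8, truncated to 178
```

With equal weights of 0.5 this reduces to the plain average of the two pixel values.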
According to the method and the device, the weighted average of the pixel points in two adjacent target original frame images is calculated and used for the pixel points of the target intermediate frame image, so that the target intermediate frame image is closer to the target original frame images and the image transition is smoother.
According to the method and the device, the target original frame images needing frame interpolation are determined by acquiring the display frame rate and the target frame rate of the image data to be displayed, the target intermediate frame images are generated based on the adjacent target original frame images, and the target intermediate frame images are inserted between the corresponding adjacent target original frame images, so that the refreshing frame rate of the desktop meets the preset requirement under the condition of transmitting the same image data, the smoothness of desktop image display is effectively improved, and the display is continuous and smooth.
An embodiment of the present application provides an image processing apparatus, and as shown in fig. 13, the image processing apparatus 130 may include: an image data acquisition module 1310, a target original frame image determination module 1320, a target intermediate frame image generation module 1330, and a target image generation module 1340, wherein,
an image data obtaining module 1310, configured to obtain image data to be displayed, and determine a display frame rate of the image data;
a target original frame image determining module 1320, configured to determine at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate;
a target intermediate frame image generating module 1330, configured to generate, for any two adjacent target original frame images in the at least two target original frame images, a corresponding target intermediate frame image according to any two adjacent target original frame images;
the target image generating module 1340 is configured to insert the target intermediate frame image between any two corresponding adjacent target original frame images, so as to generate target image data.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when generating corresponding target intermediate frame images according to any two adjacent target original frame images:
the method comprises the steps that a target original frame image with the front display time in any two adjacent target original frame images is taken as a first target original frame image, and a target original frame image with the back display time is taken as a second target original frame image;
determining a change area of the second target original frame image compared with the first target original frame image;
a target intermediate frame image is generated based on the changed region and the first target original frame image.
Optionally, the target intermediate frame image generation module 1330, when determining the changed area of the second target original frame image compared to the first target original frame image, may be configured to:
acquiring a first tree diagram and a second tree diagram, wherein the first tree diagram is used for representing the structural relationship of each display element in a first target original frame image, and the second tree diagram is used for representing the structural relationship of each display element in a second target original frame image;
determining a node of the second dendrogram which changes compared with the first dendrogram, and determining a display element of the second target original frame image which changes compared with the first target original frame image based on the changed node;
the changed region is determined based on the changed display element.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when generating the target intermediate frame image based on the change area and the first target original frame image:
determining the type of the display element changed in the change area;
when the type of the change of the display element is a position change type, generating a target intermediate frame image based on the position of the display element in the change area and the position of the display element in the first target original frame image;
when the type of the change of the display element is an image change type, a target intermediate frame image is generated based on the image of the display element in the change area and the image of the display element in the first target original frame image.
Optionally, the target original frame image determining module 1320 determines at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate, including:
determining the number of target intermediate frame images required to be inserted in unit time based on the relation between the display frame rate and a preset target frame rate;
and determining the target original frame images based on the number of the target intermediate frame images and the display frame rate.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when generating corresponding target intermediate frame images according to any two adjacent target original frame images:
and mixing pixel points in any two adjacent target original frame images according to a preset proportion to generate a corresponding target intermediate frame image.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when generating corresponding target intermediate frame images according to any two adjacent target original frame images:
calculating the weighted average value of each pixel point in two adjacent target original frame images according to a preset weight;
and generating the target intermediate frame image according to the weighted average value.
The image processing apparatus according to the embodiment of the present application can execute the image processing method according to the foregoing embodiment of the present application, and the implementation principles thereof are similar, and are not described herein again.
According to the method and the device, the target original frame images needing frame interpolation are determined by acquiring the display frame rate and the target frame rate of the image data to be displayed, the target intermediate frame images are generated based on the adjacent target original frame images, and the target intermediate frame images are inserted between the corresponding adjacent target original frame images, so that the refreshing frame rate of the desktop meets the preset requirement under the condition of transmitting the same image data, the smoothness of desktop image display is effectively improved, and the display is continuous and smooth.
An embodiment of the present application provides an electronic device, including: a memory and a processor; at least one program stored in the memory for execution by the processor, which when executed by the processor, implements: according to the method and the device, the target original frame images needing frame interpolation are determined by acquiring the display frame rate and the target frame rate of the image data to be displayed, the target intermediate frame images are generated based on the adjacent target original frame images, and the target intermediate frame images are inserted between the corresponding adjacent target original frame images, so that the refreshing frame rate of the desktop meets the preset requirement under the condition of transmitting the same image data, the smoothness of desktop image display is effectively improved, and the display is continuous and smooth.
In an alternative embodiment, an electronic device is provided, as shown in fig. 14. The electronic device 14000 shown in fig. 14 comprises a processor 14001 and a memory 14003, where the processor 14001 is coupled to the memory 14003, such as via a bus 14002. Optionally, the electronic device 14000 can also include a transceiver 14004. It should be noted that in practical applications the transceiver 14004 is not limited to one, and the structure of the electronic device 14000 does not constitute a limitation on the embodiment of the present application.
The Processor 14001 may be a CPU (Central Processing Unit), a general purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. Processor 14001 may also be a combination of computing functions, e.g., a combination comprising one or more microprocessors, a combination of a DSP and a microprocessor, or the like.
The Memory 14003 can be a ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 14003 is used for storing the application program code that implements the solution of the present application, and its execution is controlled by the processor 14001. The processor 14001 is configured to execute the application program code stored in the memory 14003 to implement the content shown in the foregoing method embodiments.
The electronic device includes, but is not limited to, a computer, a mobile phone, a tablet computer, and the like.
The present application provides a computer-readable storage medium, on which a computer program is stored, which, when running on a computer, enables the computer to execute the corresponding content in the foregoing method embodiments. Compared with the prior art, the method and the device have the advantages that the target original frame images needing frame interpolation are determined by acquiring the display frame rate and the target frame rate of the image data to be displayed, the target intermediate frame images are generated based on the adjacent target original frame images, and the target intermediate frame images are inserted between the corresponding adjacent target original frame images, so that the refreshing frame rate of a desktop can meet the preset requirement under the condition of transmitting the same image data, the smoothness of desktop image display is effectively improved, and the display is continuous and smooth.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential: they may be performed in turn or alternately with other steps, or with at least a portion of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art can make several improvements and refinements without departing from the principles of the present application, and these improvements and refinements should also be regarded as falling within the protection scope of the present application.
Claims (10)
1. An image processing method, comprising:
acquiring image data to be displayed, and determining the display frame rate of the image data;
determining at least two target original frame images in the image data based on a relation between the display frame rate and a preset target frame rate;
generating corresponding target intermediate frame images according to any two adjacent target original frame images in the at least two target original frame images;
and inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
2. The image processing method according to claim 1, wherein the generating of the corresponding target intermediate frame image from the any two adjacent target original frame images comprises:
the target original frame image with the front display time in any two adjacent target original frame images is taken as a first target original frame image, and the target original frame image with the back display time is taken as a second target original frame image;
determining a change area of the second target original frame image compared to the first target original frame image;
generating the target intermediate frame image based on the change region and the first target original frame image.
3. The image processing method according to claim 2, wherein determining a changed region of the second target original frame image compared to the first target original frame image comprises:
acquiring a first tree diagram and a second tree diagram, wherein the first tree diagram is used for representing the structural relationship of each display element in the first target original frame image, and the second tree diagram is used for representing the structural relationship of each display element in the second target original frame image;
determining a node of the second dendrogram that changes compared to the first dendrogram, and determining a display element of the second target original frame image that changes compared to the first target original frame image based on the changed node;
determining the changed region based on the changed display element.
4. The image processing method according to claim 2, wherein the generating the target intermediate frame image based on the change region and the first target original frame image comprises:
determining the type of the display element changed in the change area;
when the type of the change of the display element is a position change type, generating the target intermediate frame image based on the position of the display element in the change area and the position of the display element in the first target original frame image;
when the type of the change of the display element is an image change type, generating the target intermediate frame image based on the image of the display element in the change area and the image of the display element in the first target original frame image.
5. The image processing method according to claim 1, wherein the determining at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate comprises:
determining the number of target intermediate frame images required to be inserted in unit time based on the relation between the display frame rate and a preset target frame rate;
determining a target original frame image based on the number of target intermediate frame images and the display frame rate.
6. The image processing method according to claim 1, wherein the generating of the corresponding target intermediate frame image from the any two adjacent target original frame images comprises:
and mixing the pixel points in any two adjacent target original frame images according to a preset proportion to generate a corresponding target intermediate frame image.
7. The image processing method according to claim 1, wherein the generating of the corresponding target intermediate frame image from the any two adjacent target original frame images comprises:
calculating the weighted average value of each pixel point in two adjacent target original frame images according to a preset weight;
and generating a target intermediate frame image according to the weighted average value.
8. An image processing apparatus characterized by comprising:
the image data acquisition module is used for acquiring image data to be displayed and determining the display frame rate of the image data;
a target original frame image determining module, configured to determine at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate;
a target intermediate frame image generating module, configured to generate, for any two adjacent target original frame images in the at least two target original frame images, corresponding target intermediate frame images according to the any two adjacent target original frame images;
and the target image generation module is used for inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
9. An electronic device, comprising:
a processor, a memory, and a bus;
the bus is used for connecting the processor and the memory;
the memory is used for storing operation instructions;
the processor is used for executing the image processing method of any one of claims 1 to 7 by calling the operation instructions.
10. A computer readable storage medium, characterized in that the storage medium stores at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the image processing method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011594887.3A CN112596843B (en) | 2020-12-29 | 2020-12-29 | Image processing method, device, electronic equipment and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112596843A true CN112596843A (en) | 2021-04-02 |
CN112596843B CN112596843B (en) | 2023-07-25 |
Family
ID=75203562
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011594887.3A Active CN112596843B (en) | 2020-12-29 | 2020-12-29 | Image processing method, device, electronic equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112596843B (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6323856B1 (en) * | 1998-03-19 | 2001-11-27 | Shmuel Banitt | Method for processing variable speed scenes for computer games |
CN105760132A (en) * | 2016-02-03 | 2016-07-13 | 广东欧珀移动通信有限公司 | Method, device and mobile device for achieving frame rate dynamic refreshing |
CN109348124A (en) * | 2018-10-23 | 2019-02-15 | Oppo广东移动通信有限公司 | Image transfer method, device, electronic equipment and storage medium |
CN109379625A (en) * | 2018-11-27 | 2019-02-22 | Oppo广东移动通信有限公司 | Method for processing video frequency, device, electronic equipment and computer-readable medium |
CN110267098A (en) * | 2019-06-28 | 2019-09-20 | 连尚(新昌)网络科技有限公司 | A kind of method for processing video frequency and terminal |
CN111107427A (en) * | 2019-11-20 | 2020-05-05 | Oppo广东移动通信有限公司 | Image processing method and related product |
CN111323775A (en) * | 2020-01-19 | 2020-06-23 | 上海眼控科技股份有限公司 | Image processing method, image processing device, computer equipment and storage medium |
CN111741303A (en) * | 2020-06-09 | 2020-10-02 | Oppo广东移动通信有限公司 | Deep video processing method and device, storage medium and electronic equipment |
CN111862183A (en) * | 2020-07-02 | 2020-10-30 | Oppo广东移动通信有限公司 | Depth image processing method and system, electronic device and storage medium |
CN111918066A (en) * | 2020-09-08 | 2020-11-10 | 北京字节跳动网络技术有限公司 | Video encoding method, device, equipment and storage medium |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113225546A (en) * | 2021-04-25 | 2021-08-06 | Oppo广东移动通信有限公司 | Color temperature adjusting method and device, electronic equipment and computer readable storage medium |
CN113837136A (en) * | 2021-09-29 | 2021-12-24 | 深圳市慧鲤科技有限公司 | Video frame insertion method and device, electronic equipment and storage medium |
CN113837136B (en) * | 2021-09-29 | 2022-12-23 | 深圳市慧鲤科技有限公司 | Video frame insertion method and device, electronic equipment and storage medium |
CN114205648A (en) * | 2021-12-07 | 2022-03-18 | 网易(杭州)网络有限公司 | Frame interpolation method and device |
CN114205648B (en) * | 2021-12-07 | 2024-06-04 | 网易(杭州)网络有限公司 | Frame inserting method and device |
CN114025105A (en) * | 2021-12-15 | 2022-02-08 | 北京达佳互联信息技术有限公司 | Video processing method and device, electronic equipment and storage medium |
CN114025105B (en) * | 2021-12-15 | 2023-11-28 | 北京达佳互联信息技术有限公司 | Video processing method, device, electronic equipment and storage medium |
CN114490671A (en) * | 2022-03-31 | 2022-05-13 | 北京华建云鼎科技股份公司 | Client-side same-screen data synchronization system |
CN114490671B (en) * | 2022-03-31 | 2022-07-29 | 北京华建云鼎科技股份公司 | Client-side same-screen data synchronization system |
WO2024104439A1 (en) * | 2022-11-17 | 2024-05-23 | 歌尔科技有限公司 | Image frame interpolation method and apparatus, device, and computer readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN112596843B (en) | 2023-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112596843A (en) | Image processing method, image processing device, electronic equipment and computer readable storage medium | |
CN110377263B (en) | Image synthesis method, image synthesis device, electronic equipment and storage medium | |
CN110377264B (en) | Layer synthesis method, device, electronic equipment and storage medium | |
CN110377257B (en) | Layer composition method and device, electronic equipment and storage medium | |
CN112614202B (en) | GUI rendering display method, terminal, server, electronic equipment and storage medium | |
WO2020108082A1 (en) | Video processing method and device, electronic equipment and computer readable medium | |
CN110989878B (en) | Animation display method and device in applet, electronic equipment and storage medium | |
CN110363831B (en) | Layer composition method and device, electronic equipment and storage medium | |
WO2021008427A1 (en) | Image synthesis method and apparatus, electronic device, and storage medium | |
US7616220B2 (en) | Spatio-temporal generation of motion blur | |
WO2018000372A1 (en) | Picture display method and terminal | |
WO2018120992A1 (en) | Window rendering method and terminal | |
US8780120B2 (en) | GPU self throttling | |
CN113657518B (en) | Training method, target image detection method, device, electronic device, and medium | |
US11810524B2 (en) | Virtual reality display device and control method thereof | |
CN109091866B (en) | Display control method and device, computer readable medium and electronic equipment | |
CN109859328B (en) | Scene switching method, device, equipment and medium | |
CN115861510A (en) | Object rendering method, device, electronic equipment, storage medium and program product | |
CN115775204A (en) | Image super-resolution method, device, server and storage medium | |
CN113836455A (en) | Special effect rendering method, device, equipment, storage medium and computer program product | |
CN111243069A (en) | Scene switching method and system of Unity3D engine | |
CA2741743C (en) | Hardware accelerated caret rendering | |
CN113691866B (en) | Video processing method, device, electronic equipment and medium | |
Anholt | High Performance X Servers in the Kdrive Architecture. | |
KR101416106B1 (en) | Combination-type rendering method for improving user responsiveness, and computer-readable recording medium with rendering program for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |