CN112596843B - Image processing method, device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN112596843B
CN112596843B (application CN202011594887.3A)
Authority
CN
China
Prior art keywords
target
image
original frame
frame image
target original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011594887.3A
Other languages
Chinese (zh)
Other versions
CN112596843A (en)
Inventor
奚智
姜哲
邹仕洪
张广伟
黄浩东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yuanxin Technology
Original Assignee
Yuanxin Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yuanxin Technology filed Critical Yuanxin Technology
Priority to CN202011594887.3A priority Critical patent/CN112596843B/en
Publication of CN112596843A publication Critical patent/CN112596843A/en
Application granted granted Critical
Publication of CN112596843B publication Critical patent/CN112596843B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/452Remote windowing, e.g. X-Window System, desktop virtualisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/08Protocols specially adapted for terminal emulation, e.g. Telnet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides an image processing method, an image processing apparatus, an electronic device and a computer-readable storage medium, and relates to the technical field of image processing. The method comprises the following steps: determining a display frame rate of image data to be displayed, and determining target original frame images according to a preset target frame rate; generating corresponding target intermediate frame images according to the target original frame images; and inserting each target intermediate frame image between the corresponding target original frame images to generate target image data. By obtaining the display frame rate and the target frame rate of the image data to be displayed, determining the target original frame images that require frame interpolation, generating a target intermediate frame image from each pair of adjacent target original frame images, and inserting it between them, the refresh frame rate of the desktop can meet the preset requirement while the same image data is transmitted, which effectively improves the smoothness of the desktop display and makes it more continuous and fluid.

Description

Image processing method, device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing device, an electronic device, and a computer readable storage medium.
Background
Cloud desktop technology typically virtualizes a user's desktop via a cloud platform (server side), and the user connects to the virtual desktop via a related protocol of the client device, with no substantial difference in user experience between using the virtual desktop and using a traditional local desktop. Cloud desktop technology greatly reduces the requirements for hardware performance of client devices, which can be simple portable devices such as thin clients, tablets, mobile phones and the like. From the perspective of enterprises, the cloud desktop can realize that all data are stored on a cloud platform which is strictly controlled, so that information security is ensured.
In cloud desktops and similar scenarios, the smoothness of the desktop determines the user's experience. At present, the main methods for improving desktop smoothness are compressing the graphics data, or installing a high-performance graphics card on the server, mapping it directly into the virtual machine, and displaying the output directly through the client. However, these methods depend on internet transmission performance; the uncertainty of network transmission, the graphics data processing speed and other factors can cause problems, chiefly a stuttering client desktop or tearing of dynamic images during use.
Therefore, in existing cloud desktop technology, improving desktop smoothness requires a high-performance graphics card on the server and places high demands on network speed, and problems such as stuttering or tearing of dynamic images still readily occur, so improvement is needed.
Disclosure of Invention
The present application aims to solve at least one of the above technical drawbacks, in particular the drawback that, in existing cloud desktop technology, improving desktop smoothness requires a high-performance graphics card on the server, places high demands on network speed, and is prone to stuttering and tearing of dynamic images.
In a first aspect, there is provided an image processing method, the method comprising:
acquiring image data to be displayed, and determining the display frame rate of the image data;
determining at least two target original frame images in the image data based on a relation between the display frame rate and a preset target frame rate;
for any two adjacent target original frame images in the at least two target original frame images, generating corresponding target intermediate frame images according to the any two adjacent target original frame images;
and inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
As an optional embodiment of the present application, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
taking the target original frame image that is displayed earlier of the two adjacent target original frame images as a first target original frame image, and the target original frame image that is displayed later as a second target original frame image;
determining a change area of the second target original frame image compared with the first target original frame image;
a target intermediate frame image is generated based on the change region and the first target original frame image.
As an optional embodiment of the present application, determining the change area of the second target original frame image compared to the first target original frame image includes:
acquiring a first tree diagram and a second tree diagram, wherein the first tree diagram is used for representing the construction relation of each display element in a first target original frame image, and the second tree diagram is used for representing the construction relation of each display element in a second target original frame image;
determining the nodes in the second tree diagram that have changed compared with the first tree diagram, and determining, based on the changed nodes, the display elements in the second target original frame image that have changed compared with the first target original frame image;
and determining a change region based on the changed display elements.
As an optional embodiment of the present application, generating the target intermediate frame image based on the change region and the first target original frame image includes:
determining the type of the change of the display element in the change area;
when the type of the display element change is a position change type, generating a target intermediate frame image based on the position of the display element in the change area and the position of the display element in the first target original frame image;
when the type of the display element change is an image change type, a target intermediate frame image is generated based on the image of the display element in the change region and the image of the display element in the first target original frame image.
As an optional embodiment of the present application, determining at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate includes:
determining the number of target intermediate frame images to be inserted in a unit time based on a relation between a display frame rate and a preset target frame rate;
and determining a target original frame image based on the number of target intermediate frame images and the display frame rate.
As an optional embodiment of the present application, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
mixing the pixel points of the two adjacent target original frame images in a preset proportion to generate the corresponding target intermediate frame image.
As an optional embodiment of the present application, generating a corresponding target intermediate frame image according to any two adjacent target original frame images includes:
calculating a weighted average value of each pixel point in two adjacent target original frame images according to preset weights;
and generating a target intermediate frame image according to the weighted average value.
In a second aspect, there is provided an image processing apparatus comprising:
the image data acquisition module is used for acquiring image data to be displayed and determining the display frame rate of the image data;
the target original frame image determining module is used for determining at least two target original frame images in the image data based on the relation between the display frame rate and the preset target frame rate;
the target intermediate frame image generation module is used for generating corresponding target intermediate frame images according to any two adjacent target original frame images in at least two target original frame images;
and the target image generation module is used for inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
In a third aspect, an electronic device is provided, the electronic device comprising:
a processor, a memory, and a bus;
a bus for connecting the processor and the memory;
a memory for storing operation instructions;
and the processor is used for executing the image processing method by calling the operation instruction.
In a fourth aspect, a computer readable storage medium is provided, the storage medium storing at least one instruction, at least one program, code set, or instruction set, the at least one instruction, at least one program, code set, or instruction set being loaded by a processor and performing the image processing method described above.
According to the method and the device, the display frame rate and the target frame rate of the image data to be displayed are obtained, the target original frame image needing to be subjected to frame insertion is determined, the target intermediate frame image is generated based on the adjacent target original frame image, the target intermediate frame image is inserted between the corresponding adjacent target original frame images, the refreshing frame rate of the desktop can meet the preset requirement under the condition of transmitting the same graphic data, the display smoothness of the desktop image is effectively improved, and the display is more continuous and smooth.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are required to be used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of generating a target intermediate frame image according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a target intermediate frame image generating method according to an embodiment of the present application;
fig. 4 is a schematic diagram of a display element position change according to an embodiment of the present application;
fig. 5 is a schematic diagram of a change of a display element graph according to an embodiment of the present application;
fig. 6 is a schematic diagram of a superposition generating target intermediate frame image according to an embodiment of the present application;
FIG. 7 is a flowchart of a method for determining a change region according to an embodiment of the present disclosure;
fig. 8 is a flowchart of a method for generating a target intermediate frame image according to a change type according to an embodiment of the present application;
fig. 9 is a schematic diagram of a position change generation target intermediate frame image according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a graphical change generation target intermediate frame image according to an embodiment of the present disclosure;
fig. 11 is a flowchart of a method for determining an original frame image of a target according to an embodiment of the present application;
FIG. 12 is a flowchart illustrating a method for generating a target intermediate frame image based on pixel point average values according to an embodiment of the present disclosure;
Fig. 13 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The above and other features, advantages, and aspects of embodiments of the present application will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of illustrating the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein includes all or any element and all combination of one or more of the associated listed items.
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Several terms which are referred to in this application are first introduced and explained:
(1) The display frame rate is the frequency at which bitmap images, in units of frames, appear consecutively on the display; it represents the number of image frames displayed per unit time in the desktop display;
(2) The target original frame image is an image corresponding to image data to be displayed, wherein a target intermediate frame image is needed to be inserted between two adjacent target original frame images;
(3) The target intermediate frame image is an image generated according to two adjacent target original frame images and is used for being inserted between the two adjacent target original frame images;
(4) The tree diagram is a tree structure with each display element (such as window frame, title bar, menu, button, sub-window, view, etc.) in the image as a node, and each node records the graphic information and the position information of the display element corresponding to the node.
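As a minimal sketch of the tree-diagram concept (in Python, with hypothetical names; the patent does not specify an implementation), each node records the graphic and position information of one display element, and two trees can be walked in parallel to find the changed nodes:

```python
from dataclasses import dataclass, field

@dataclass
class DisplayElement:
    """One node of the tree diagram: a display element such as a window
    frame, title bar, menu, button, sub-window or view."""
    name: str            # element identifier
    graphic: bytes       # graphic information of the element
    position: tuple      # position information, e.g. (x, y)
    children: list = field(default_factory=list)

def changed_nodes(a: DisplayElement, b: DisplayElement) -> list:
    """Walk two tree diagrams in parallel and collect the names of nodes
    in `b` whose graphic or position differs from the matching node in `a`."""
    changed = []
    if a.graphic != b.graphic or a.position != b.position:
        changed.append(b.name)
    for child_a, child_b in zip(a.children, b.children):
        changed.extend(changed_nodes(child_a, child_b))
    return changed
```

This sketch assumes both trees have the same shape; a production implementation would also handle added and removed nodes.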
The image processing method provided by the embodiments of the present application can be applied to device-cloud (end-cloud) collaboratively rendered GUI (Graphical User Interface) scenarios. Existing device-cloud collaborative GUI rendering can be divided into two approaches. In the first approach, the server side is responsible for all desktop GUI rendering work and transmits the rendered desktop frames to the client over the internet, where they can be displayed directly without any rendering operation. In the second approach, the client participates in rendering the desktop: the server synchronously transmits the relevant instructions, graphics primitives and related graphics resources (such as textures) to the client over the internet, the client completes the rendering of the desktop frame image in cooperation with the server, and the client displays the final desktop frame image. In the original desktop frame images obtained by the client in either approach, the display frame rate may not reach the target frame rate, so that the client's display is not smooth. To solve this problem, existing approaches compress the graphics data or install a high-performance graphics card on the server side, map it directly to the virtual machine, and display the output directly through the client. However, these approaches depend on internet transmission performance; the uncertainty of network transmission, the graphics data processing speed and other factors can cause problems, chiefly a stuttering client desktop or tearing of dynamic images during use.
The image processing method, the image processing device, the electronic equipment and the computer readable storage medium aim to solve the technical problems in the prior art.
The following describes the technical solutions of the present application and how the technical solutions of the present application solve the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
An embodiment of the present application provides an image processing method, as shown in fig. 1, including:
step S101, obtaining image data to be displayed, and determining a display frame rate of the image data.
In this embodiment of the present application, the image data to be displayed refers to image data, such as video data, that is sent by a server and needs to be displayed on a client. The server may be a single server, a cluster of servers, a virtualization platform, or a cloud computing service center; the client is a terminal device with a video playing function, for example a mobile phone, a tablet computer, an e-book reader, smart glasses, a smart watch, a laptop or a desktop computer. After the image data to be displayed is acquired, the display frame rate of the image data needs to be determined; it can be understood that the display frame rate refers to the frame rate at which the image data would be displayed after the client receives it from the server. For example, the server transmits image data to be displayed to the client with a display frame rate of 30 frames per second.
Step S102, at least two target original frame images in the image data are determined based on the relation between the display frame rate and the preset target frame rate.
In this embodiment of the present application, the preset target frame rate may be set by the user at the client or may be set by the server. Once the target frame rate is set, the client needs to display the image data it receives from the server at the target frame rate; however, the display frame rate of the image data may not meet the requirement of the target frame rate, in which case frame interpolation needs to be performed on the image data. Before interpolation, at least two target original frame images of the image data need to be determined based on the relationship between the display frame rate and the target frame rate. For example, if the display frame rate is 30 frames per second and the preset target frame rate is 60 frames per second, one target intermediate frame image needs to be inserted between every two adjacent original frame images to meet the preset target frame rate, and all original frame images in the image data to be displayed are target original frame images.
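The arithmetic behind this example can be sketched as follows (the helper name is hypothetical, and the patent does not prescribe a formula; this sketch assumes the target frame rate is an integer multiple of the display frame rate):

```python
def frames_to_insert(display_fps: int, target_fps: int) -> int:
    """Number of target intermediate frame images to insert between each
    pair of adjacent target original frame images, assuming the target
    frame rate is an integer multiple of the display frame rate."""
    if target_fps % display_fps != 0:
        raise ValueError("target rate must be a multiple of the display rate")
    return target_fps // display_fps - 1

# 30 fps displayed, 60 fps required: one intermediate frame per adjacent pair
assert frames_to_insert(30, 60) == 1
```

For non-integer ratios, an implementation would instead distribute intermediate frames unevenly across the unit time.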
Step S103, for any two adjacent target original frame images in the at least two target original frame images, generating corresponding target intermediate frame images according to the any two adjacent target original frame images.
In this embodiment of the present application, for any two adjacent target original frame images, the target intermediate frame image to be inserted between them is generated from those two images. How the target intermediate frame image is generated may be determined according to the performance of the client. For example, when the hardware performance of the client is poor or the requirement on image display is low, a transparency-based frame blending algorithm may be adopted, blending the two adjacent target original frame images in a transparency ratio; for instance, each pixel of the two adjacent target original frame images may be weighted at 50% and blended to obtain the target intermediate frame image. Alternatively, when the client's requirement on image display is high, a motion-compensation algorithm or an optical-flow method based on motion recognition may be adopted: motion recognition is performed on the objects that change between the two adjacent target original frame images, or the change trend of each pixel is calculated, and the target intermediate frame image is computed from that, thereby obtaining a better frame interpolation effect.
For the embodiment of the present application, for convenience of explanation, take the specific example shown in fig. 2, in which the two adjacent target original frame images are 201 and 202. For ease of description, the target original frame image 201 is a black image whose pixels all have RGB values (0, 0, 0), and the target original frame image 202 is a white image whose pixels all have RGB values (255, 255, 255). When the target intermediate frame image 203 is generated from the target original frame image 201 and the target original frame image 202, the RGB values of each pixel in the two images can be averaged, yielding the gray target intermediate frame image 203, whose pixels all have RGB values (128, 128, 128). Of course, this example is merely chosen for ease of description; in practice the colors of the images may be far richer.
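The averaging in this example can be sketched with NumPy (a sketch under the assumption that frames are `uint8` RGB arrays; the function name is illustrative, not from the patent):

```python
import numpy as np

def blend_frames(first: np.ndarray, second: np.ndarray,
                 alpha: float = 0.5) -> np.ndarray:
    """Transparency-based frame blending: each pixel of the intermediate
    frame is a weighted mix of the two adjacent original frames."""
    mixed = alpha * first.astype(np.float32) + (1.0 - alpha) * second.astype(np.float32)
    return np.round(mixed).astype(np.uint8)

black = np.zeros((4, 4, 3), dtype=np.uint8)        # frame 201: all (0, 0, 0)
white = np.full((4, 4, 3), 255, dtype=np.uint8)    # frame 202: all (255, 255, 255)
gray = blend_frames(black, white)                  # frame 203: all (128, 128, 128)
```

The same helper with `alpha` other than 0.5 corresponds to the "preset proportion" blending named in the optional embodiments.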
Step S104, inserting the target intermediate frame image between any two corresponding adjacent target original frame images to generate target image data.
In the embodiment of the application, after generating the target intermediate frame image, the target intermediate frame image is inserted between two corresponding adjacent target original frame images to generate target image data, wherein the display frame rate of the target image data is the same as the preset target frame rate.
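Step S104 can be sketched as a simple interleaving of original and generated frames (hypothetical helper; the integers below stand in for frame images):

```python
def interpolate_sequence(originals, make_intermediate):
    """Insert one generated target intermediate frame between each pair of
    adjacent target original frames, producing the target image data."""
    result = [originals[0]]
    for prev, nxt in zip(originals, originals[1:]):
        result.append(make_intermediate(prev, nxt))
        result.append(nxt)
    return result

# doubling the frame rate: n originals become 2n - 1 frames
frames = interpolate_sequence([0, 10, 20], lambda a, b: (a + b) // 2)
assert frames == [0, 5, 10, 15, 20]
```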
According to the method and the device for displaying the image data, the display frame rate and the target frame rate of the image data to be displayed are obtained, the target original frame image needing to be subjected to frame insertion is determined, the target intermediate frame image is generated based on the adjacent target original frame image, the target intermediate frame image is inserted between the corresponding adjacent target original frame images, the refreshing frame rate of the desktop can meet the preset requirement when the same image data are transmitted, the display smoothness of the desktop image is effectively improved, and the display is more continuous and smooth.
The embodiment of the application provides a possible implementation manner, in this implementation manner, as shown in fig. 3, a corresponding target intermediate frame image is generated according to any two adjacent target original frame images, including:
In step S301, the target original frame image with the front display time in any two adjacent target original frame images is taken as the first target original frame image, and the target original frame image with the rear display time is taken as the second target original frame image.
For the embodiment of the present application, the designations "first" and "second" target original frame image are merely a naming convention and do not fix which frame is which. As another implementation of the embodiment of the present application, the target original frame image that is displayed earlier of the two adjacent target original frame images may be taken as the second target original frame image, and the one displayed later as the first target original frame image.
In step S302, a change area of the second target original frame image compared to the first target original frame image is determined.
In this embodiment of the present application, before the target intermediate frame image is generated, the change region of the second target original frame image compared with the first target original frame image may be determined. The change region is a region in which a display element has changed; the change may be in the graphics of the display element or in its position. As shown in fig. 4, taking a display window as an example, the mouse pointer 402 is located in the lower left corner of the first target original frame image 401, and in the second target original frame image 403 the mouse pointer has moved to the upper right corner; it can then be determined that the change regions of the second target original frame image 403 compared with the first target original frame image 401 are the original position 404 of the mouse pointer and the current position region 405 of the mouse pointer. As another example of the present application, as shown in fig. 5, a square is displayed in the first target original frame image 501, and in the second target original frame image 502 the square has become a circle; it can then be determined that the change region in the second target original frame image 502 is the region 503 in which the square and the circle are located. Of course, in the embodiment of the present application, what changes may also be a window frame, a title bar, a menu, buttons, sub-windows, views, and the like.
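A pixel-level stand-in for locating the change region can be sketched as follows (the patent's comparison operates at the display-element level via tree diagrams; this simplified sketch merely bounds the pixels that differ, with an illustrative function name):

```python
import numpy as np

def change_region(first: np.ndarray, second: np.ndarray):
    """Bounding box (top, left, bottom, right) of the pixels that differ
    between two frames, or None if nothing changed."""
    diff = np.any(first != second, axis=-1)   # True where any channel differs
    ys, xs = np.nonzero(diff)
    if ys.size == 0:
        return None
    return int(ys.min()), int(xs.min()), int(ys.max()) + 1, int(xs.max()) + 1
```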
Step S303, generating a target intermediate frame image based on the change region and the first target original frame image.
In the embodiment of the present application, after the change region is determined, the target intermediate frame image may be generated based on the change region and the other regions of the first target original frame image: for example, the image of the change region is generated from the region of the first target original frame image corresponding to the change region, and the images of the other regions of the first target original frame image are multiplexed (reused) to generate the target intermediate frame image.
As a possible implementation manner of the present application, for convenience of explanation, take the specific example shown in fig. 6: a change area 602 exists in the first target original frame image 601. In the first target original frame image the change area 602 is black, with the RGB values of all its pixels being (0, 0, 0); in the second target original frame image it is displayed as white, with the RGB values of all its pixels being (255, 255, 255). Based on the foregoing, the change area in the target intermediate frame image is gray, with the RGB values of all its pixels being (128, 128, 128); this area is then combined with the areas of the first target original frame image other than the change area to generate the target intermediate frame image 603.
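Under the assumptions of the fig. 6 example (a black change area turning white), a minimal sketch of combining a recomputed change region with the multiplexed remainder of the first frame might look like this; the function name and the rounding choice are illustrative:

```python
def compose_intermediate(first, second, box):
    """Build a target intermediate frame: inside `box` (top, left,
    bottom, right, inclusive) take the rounded midpoint of the two
    frames; everywhere else multiplex (reuse) the first frame."""
    top, left, bottom, right = box
    out = [row[:] for row in first]              # reused, unchanged areas
    for r in range(top, bottom + 1):
        for c in range(left, right + 1):
            out[r][c] = (first[r][c] + second[r][c] + 1) // 2   # 0,255 -> 128
    return out

first = [[0, 0], [0, 0]]        # change area is black, as in fig. 6
second = [[255, 255], [0, 0]]   # top row turns white
mid = compose_intermediate(first, second, (0, 0, 0, 1))
assert mid == [[128, 128], [0, 0]]   # grey change area; bottom row reused
```

The rounded midpoint `(0 + 255 + 1) // 2` reproduces the value 128 used in the text's example.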
According to the embodiment of the present application, the change areas between two adjacent target original frame images are identified, only the change areas are regenerated and the other parts are multiplexed, so that the workload of image processing can be greatly reduced and the efficiency of image processing improved.
The embodiment of the present application provides a possible implementation manner, in which, as shown in fig. 7, determining a change area of the second target original frame image compared to the first target original frame image includes:
step S701, a first tree diagram and a second tree diagram are obtained, wherein the first tree diagram is used for representing the structural relationship of each display element in the first target original frame image, and the second tree diagram is used for representing the structural relationship of each display element in the second target original frame image.
In this embodiment of the present application, the tree diagram is used to represent the structural relationship of the display elements in the target original frame image. Each display element in the target original frame image (such as a window frame, a title bar, a menu, a button, a child window, a view, etc.) may serve as a node of the tree diagram, and each node may further record graphic information of its display element: for an input device such as a mouse, the position, image, and object IDs (screen coordinates, texture buffer, texture ID); for an APP, the window frame interface and the position, image, and ID information of the window display content; other desktop application logos, popup windows, and the like may also be included.
In step S702, the nodes of the second tree diagram which have changed compared with the first tree diagram are determined, and the display elements of the second target original frame image which have changed compared with the first target original frame image are determined based on the changed nodes.
In this embodiment of the present application, each node in the tree diagram records graphic information of a corresponding display element, and when a position or a graphic of the display element changes, the information recorded in the corresponding node changes, and the changed display element can be determined by comparing the information recorded in each node in the first tree diagram and the second tree diagram.
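A minimal sketch of the node comparison, assuming each tree diagram has been flattened into a mapping from display-element name to its recorded graphic information (a real implementation would walk nested nodes; all names here are illustrative):

```python
def changed_elements(first_tree, second_tree):
    """Compare the graphic information recorded for each display element
    in two (flattened) tree diagrams and return the names of the
    elements whose recorded information differs."""
    names = set(first_tree) | set(second_tree)
    return sorted(n for n in names if first_tree.get(n) != second_tree.get(n))

# Each node records, e.g., the position and texture ID of its element.
tree1 = {"window": {"pos": (0, 0), "tex": 7}, "pointer": {"pos": (5, 90), "tex": 3}}
tree2 = {"window": {"pos": (0, 0), "tex": 7}, "pointer": {"pos": (90, 5), "tex": 3}}
assert changed_elements(tree1, tree2) == ["pointer"]  # only the pointer moved
```

The change area would then be taken as the area(s) occupied by the returned elements, as in step S703.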
Step S703, a change region is determined based on the changed display element.
For the embodiment of the application, after the display element which has changed is determined, the change area can be determined based on the area where the display element is located.
According to the embodiment of the present application, the tree diagrams are acquired, the changed display elements are determined from the information recorded in the nodes of the tree diagrams, and the areas where the changed display elements are located are determined as the change area, so that the change area is determined accurately.
The embodiment of the application provides a possible implementation manner, in this implementation manner, as shown in fig. 8, generating a target intermediate frame image based on a change area and a first target original frame image includes:
In step S801, the type of change of the display element in the change area is determined.
In the embodiment of the present application, after determining the change region, the type of change of the display element that changes in the change region may be determined first, and then the target intermediate frame image may be generated based on the type.
In step S802, when the type of the display element change is the position change type, a target intermediate frame image is generated based on the position of the display element in the change region and the position of the display element in the first target original frame image.
In this embodiment, for convenience of explanation, take the foregoing embodiment as an example, as shown in fig. 9: there is a mouse pointer 902 in the first target original frame image 901, located in its lower left corner; in the second target original frame image 903 the mouse pointer has moved, so the change area compared with the first target original frame image 901 consists of the original position 904 of the mouse pointer and its current position area 905. The changed display element is the mouse pointer and the change type is the position change type, so the target intermediate frame image may be generated accordingly: the mouse pointer in the target intermediate frame image may be placed in the middle of the window, with the coordinates of its position being the average of the coordinates of the two positions before and after, as shown in the image 906.
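For the position change type, the intermediate position described above is simply the coordinate-wise average of the two recorded positions; a short sketch (function name and sample coordinates are illustrative):

```python
def midpoint_position(pos_before, pos_after):
    """Position of a moved element in the target intermediate frame:
    the average of its coordinates in the two original frames."""
    (x0, y0), (x1, y1) = pos_before, pos_after
    return ((x0 + x1) / 2, (y0 + y1) / 2)

# Pointer moves from the lower-left (10, 10) to the upper-right (90, 90);
# in the intermediate frame it sits in the middle of the window:
assert midpoint_position((10, 10), (90, 90)) == (50.0, 50.0)
```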
In step S803, when the type of the display element change is the image change type, a target intermediate frame image is generated based on the image of the display element in the change region and the image of the display element in the first target original frame image.
In this embodiment of the present application, for convenience of explanation, take the foregoing specific embodiment as an example, as shown in fig. 10: a square is displayed in the first target original frame image 1001, and in the second target original frame image 1002 the square becomes a circle. The changed display element is the graphic, and the change type is the image change type, so the image in the target intermediate frame image may be generated from the images before and after the change, as shown in the image 1003; optionally, the images before and after the change may both be displayed in the form of a dotted line.
According to the embodiment of the application, the change type of the display element is determined, and the mode of generating the target intermediate frame image is determined based on different change types, so that the display effect is better.
The embodiment of the present application provides a possible implementation manner, in which, as shown in fig. 11, determining at least two target original frame images in image data based on a relationship between a display frame rate and a preset target frame rate includes:
Step S1101 of determining the number of target intermediate frame images to be inserted in a unit time based on a relationship between the display frame rate and a preset target frame rate.
In the embodiment of the present application, the display frame rate indicates the number of frames of images displayed per second of the image data to be displayed, the target frame rate indicates the number of frames of images displayed per second required by the client, and the number of target intermediate frame images to be inserted in a unit time can be calculated based on the target frame rate and the display frame rate. For example, if the display frame rate is 30 frames per second and the target frame rate is 60 frames per second, then 60-30=30 frames per second is calculated, i.e., 30 frames of target intermediate frame images need to be inserted per second.
Step S1102, determining a target original frame image based on the number of target intermediate frame images and the display frame rate.
In the embodiment of the present application, after the number of target intermediate frame images to be inserted in a unit time is determined, the target original frame images to be subjected to frame insertion can be calculated based on that number and the display frame rate. For example, if 30 target intermediate frame images are to be inserted in a unit time and the display frame rate is also 30 frames, then all the original frame images are target original frame images. For a clearer illustration, take another embodiment as an example: if 15 target intermediate frame images are to be inserted in a unit time and the display frame rate is 60 frames, the 15 frames need to be inserted uniformly among the 60 frames, i.e. one target intermediate frame image after every 4 original frame images; since each target intermediate frame image is generated from the two original frame images adjacent to it, it can be determined that there are 30 target original frame images.
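The two determinations above can be sketched as follows. `interpolation_plan` is an illustrative name, and the second case assumes the 15-frames-per-second example corresponds to a 75 fps target (60 displayed plus 15 inserted):

```python
def interpolation_plan(display_fps, target_fps):
    """Return (frames to insert per second, insertion interval): one
    target intermediate frame is inserted after every `interval`
    original frames when the insertions are spread uniformly."""
    to_insert = target_fps - display_fps          # e.g. 60 - 30 = 30
    interval = display_fps // to_insert           # uniform spacing
    return to_insert, interval

# 30 fps displayed, 60 fps required: insert 30 frames/s, one per original,
# so every original frame is a target original frame.
assert interpolation_plan(30, 60) == (30, 1)
# 60 fps displayed, 15 insertions/s: one insertion every 4 originals; each
# uses the 2 originals around it, giving 30 target original frames.
assert interpolation_plan(60, 75) == (15, 4)
```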
According to the embodiment of the present application, the target original frame images that require the frame insertion operation are determined from the display frame rate and the target frame rate, which ensures the accuracy of inserting the target intermediate frame images and thus the fluency of the image display.
The embodiment of the application provides a possible implementation manner, in the implementation manner, a corresponding target intermediate frame image is generated according to any two adjacent target original frame images, and the method comprises the following steps:
and mixing pixel points in any two adjacent target original frame images according to a preset proportion to generate corresponding target intermediate frame images.
In this embodiment of the present application, when the target intermediate frame image is generated from two adjacent target original frame images, the pixels of the two target original frame images may be mixed directly according to a preset ratio: for example, 50% of the pixels of the first target original frame image and 50% of the pixels of the second target original frame image are mixed to generate the target intermediate frame image. Optionally, the positions of the 50% of pixels taken from the first target original frame image and the positions of the 50% of pixels taken from the second target original frame image do not overlap, and their distribution is relatively uniform.
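One way to realize a non-overlapping, relatively uniform 50/50 mix is a checkerboard interleave. This is only an illustrative sketch of the preset-ratio mixing, not necessarily the arrangement intended by the embodiment:

```python
def checkerboard_mix(first, second):
    """Spatial 50/50 mix: half of the pixels come from each frame,
    interleaved in a checkerboard so the positions taken from the two
    frames do not overlap and are distributed evenly."""
    return [[first[r][c] if (r + c) % 2 == 0 else second[r][c]
             for c in range(len(first[0]))]
            for r in range(len(first))]

a = [[0, 0], [0, 0]]
b = [[9, 9], [9, 9]]
assert checkerboard_mix(a, b) == [[0, 9], [9, 0]]
```

Other preset ratios (say 75/25) would use a different interleaving pattern over each pixel neighbourhood.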
According to the embodiment of the present application, the pixel points of two adjacent target original frame images are mixed, so that the target intermediate frame image can be obtained quickly and the image processing efficiency is high.
The embodiment of the application provides a possible implementation manner, in this implementation manner, as shown in fig. 12, a corresponding target intermediate frame image is generated according to any two adjacent target original frame images, including:
step S1201, calculating a weighted average value of each pixel point in two adjacent target original frame images according to a preset weight;
step S1202, generating a target intermediate frame image according to the weighted average.
In this embodiment of the present application, when generating the corresponding target intermediate frame image according to any two adjacent target original frame images, the target intermediate frame image may be generated from the pixel points of each target original frame image; optionally, a weighted average of the corresponding pixel points of the two target original frame images is calculated according to preset weights, and the target intermediate frame image is generated according to the weighted average. For example, for convenience of explanation, take a simple embodiment: each pixel point in the first target original frame image is 255, each pixel point in the second target original frame image is 1, and with suitable preset weights each pixel point in the target intermediate frame image is 178. Of course, in the practical implementation process, the pixel points may differ from one another, but the calculation method is the same as in this embodiment and also belongs to the protection scope of the present application.
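A minimal sketch of the weighted-average blend. The weights 0.7/0.3 are an assumption (the embodiment does not state them), chosen so that pixel values of 255 and 1, truncated to an integer, reproduce the value 178 from the example above:

```python
def weighted_blend(first, second, w_first=0.7):
    """Per-pixel weighted average of two frames; w_first weights the
    first frame and (1 - w_first) the second. The result is truncated
    to an integer pixel value."""
    w_second = 1.0 - w_first
    return [[int(w_first * first[r][c] + w_second * second[r][c])
             for c in range(len(first[0]))]
            for r in range(len(first))]

# 0.7 * 255 + 0.3 * 1 = 178.8, truncated to 178 as in the example above:
assert weighted_blend([[255]], [[1]]) == [[178]]
```

With equal weights (0.5/0.5) this reduces to the plain midpoint used in the fig. 6 grey-area example.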
According to the embodiment of the present application, the weighted average of the pixel points in the two adjacent target original frame images is calculated and used as the pixel points of the target intermediate frame image, so the target intermediate frame image is closer to the target original frame images and the image transition is smoother.
According to the embodiment of the present application, by obtaining the display frame rate and the target frame rate of the image data to be displayed, determining the target original frame images that require frame insertion, generating target intermediate frame images based on adjacent target original frame images, and inserting each target intermediate frame image between the corresponding adjacent target original frame images, the refresh frame rate of the desktop can meet the preset requirement while the same image data is transmitted, effectively improving the smoothness of desktop image display and making the display more continuous and smooth.
The embodiment of the present application provides an image processing apparatus. As shown in fig. 13, the image processing apparatus 130 may include: an image data acquisition module 1310, a target original frame image determining module 1320, a target intermediate frame image generating module 1330, and a target image generating module 1340, wherein,
an image data acquisition module 1310, configured to acquire image data to be displayed, and determine a display frame rate of the image data;
a target original frame image determining module 1320, configured to determine at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate;
the target intermediate frame image generating module 1330 is configured to generate, for any two adjacent target original frame images in the at least two target original frame images, a corresponding target intermediate frame image according to the any two adjacent target original frame images;
the target image generating module 1340 is configured to insert a target intermediate frame image between any two corresponding adjacent target original frame images, and generate target image data.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when generating the corresponding target intermediate frame image according to any two adjacent target original frame images:
the method comprises the steps that a target original frame image with the front display time in any two adjacent target original frame images is taken as a first target original frame image, and a target original frame image with the rear display time is taken as a second target original frame image;
determining a change area of the second target original frame image compared with the first target original frame image;
a target intermediate frame image is generated based on the change region and the first target original frame image.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when determining the change area of the second target original frame image compared to the first target original frame image:
acquiring a first tree diagram and a second tree diagram, wherein the first tree diagram is used for representing the construction relation of each display element in a first target original frame image, and the second tree diagram is used for representing the construction relation of each display element in a second target original frame image;
determining the nodes of the second tree diagram which have changed compared with the first tree diagram, and determining the display elements of the second target original frame image which have changed compared with the first target original frame image based on the changed nodes;
a change region is determined based on the changed display element.
Alternatively, the target intermediate frame image generating module 1330 may be configured to, when generating the target intermediate frame image based on the change area and the first target original frame image:
determining the type of the change of the display element in the change area;
when the type of the display element change is a position change type, generating a target intermediate frame image based on the position of the display element in the change area and the position of the display element in the first target original frame image;
When the type of the display element change is an image change type, a target intermediate frame image is generated based on the image of the display element in the change region and the image of the display element in the first target original frame image.
Optionally, the target original frame image determining module 1320 may be configured to, when determining at least two target original frame images in the image data based on the relationship between the display frame rate and a preset target frame rate:
determining the number of target intermediate frame images to be inserted in a unit time based on a relation between a display frame rate and a preset target frame rate;
and determining a target original frame image based on the number of target intermediate frame images and the display frame rate.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when generating the corresponding target intermediate frame image according to any two adjacent target original frame images:
and mixing pixel points in any two adjacent target original frame images according to a preset proportion to generate corresponding target intermediate frame images.
Optionally, the target intermediate frame image generating module 1330 may be configured to, when generating the corresponding target intermediate frame image according to any two adjacent target original frame images:
Calculating a weighted average value of each pixel point in two adjacent target original frame images according to preset weights;
and generating a target intermediate frame image according to the weighted average value.
The image processing apparatus according to the embodiment of the present application may perform the image processing method shown in the foregoing embodiment of the present application, and the implementation principle is similar, and will not be described herein.
According to the embodiment of the present application, by obtaining the display frame rate and the target frame rate of the image data to be displayed, determining the target original frame images that require frame insertion, generating target intermediate frame images based on adjacent target original frame images, and inserting each target intermediate frame image between the corresponding adjacent target original frame images, the refresh frame rate of the desktop can meet the preset requirement while the same image data is transmitted, effectively improving the smoothness of desktop image display and making the display more continuous and smooth.
An embodiment of the present application provides an electronic device, including: a memory and a processor; and at least one program stored in the memory, which, when executed by the processor, implements the method described above: by obtaining the display frame rate and the target frame rate of the image data to be displayed, determining the target original frame images that require frame insertion, generating target intermediate frame images based on adjacent target original frame images, and inserting each target intermediate frame image between the corresponding adjacent target original frame images, the refresh frame rate of the desktop can meet the preset requirement while the same image data is transmitted, effectively improving the smoothness of desktop image display and making the display more continuous and smooth.
In an alternative embodiment, an electronic device is provided. As shown in fig. 14, the electronic device 14000 includes a processor 14001 and a memory 14003, wherein the processor 14001 is connected to the memory 14003, for example via a bus 14002. Optionally, the electronic device 14000 may also include a transceiver 14004. It should be noted that, in practical applications, the number of transceivers 14004 is not limited to one, and the structure of the electronic device 14000 does not limit the embodiment of the present application.
Processor 14001 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor 14001 may also be a combination that implements computing functionality, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 14002 may include a pathway to transfer information between the aforementioned components. The bus 14002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like, and may be classified into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 14, but this does not mean that there is only one bus or only one type of bus.
Memory 14003 may be, but is not limited to, a ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
The memory 14003 is used for storing application program codes for executing the present application and is controlled to be executed by the processor 14001. The processor 14001 is configured to execute the application program code stored in the memory 14003 to implement what is shown in the foregoing method embodiments.
The electronic equipment comprises, but is not limited to, computers, mobile phones, tablet computers and the like.
The present application provides a computer readable storage medium having a computer program stored thereon which, when run on a computer, causes the computer to perform the corresponding method embodiments described above. Compared with the prior art, by obtaining the display frame rate and the target frame rate of the image data to be displayed, determining the target original frame images that require frame insertion, generating target intermediate frame images based on adjacent target original frame images, and inserting each target intermediate frame image between the corresponding adjacent target original frame images, the refresh frame rate of the desktop can meet the preset requirement while the same image data is transmitted, effectively improving the smoothness of desktop image display and making the display more continuous and smooth.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of the steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include a plurality of sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The foregoing is only a partial embodiment of the present application, and it should be noted that, for a person skilled in the art, several improvements and modifications can be made without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (9)

1. An image processing method, comprising:
acquiring image data to be displayed, and determining a display frame rate of the image data;
determining at least two target original frame images in the image data based on the relation between the display frame rate and a preset target frame rate;
for any two adjacent target original frame images in the at least two target original frame images, generating corresponding target intermediate frame images according to the any two adjacent target original frame images;
inserting the target intermediate frame image between the corresponding two adjacent target original frame images to generate target image data;
the generating a corresponding target intermediate frame image according to the arbitrary two adjacent target original frame images includes:
the target original frame image with the front display time in any two adjacent target original frame images is taken as a first target original frame image, and the target original frame image with the rear display time is taken as a second target original frame image;
Determining a change area of the second target original frame image compared with the first target original frame image;
the target intermediate frame image is generated based on the change region and the first target original frame image.
2. The image processing method according to claim 1, wherein determining a change area of the second target original frame image compared to the first target original frame image includes:
acquiring a first tree diagram and a second tree diagram, wherein the first tree diagram is used for representing the construction relation of each display element in the first target original frame image, and the second tree diagram is used for representing the construction relation of each display element in the second target original frame image;
determining a node which is changed in the second tree diagram compared with the first tree diagram, and determining a display element which is changed in the second target original frame image compared with the first target original frame image based on the changed node;
the change region is determined based on the changed display element.
3. The image processing method according to claim 1, wherein the generating the target intermediate frame image based on the change region and the first target original frame image includes:
determining the type of change of the display element in the change area;
when the type of the display element change is a position change type, generating the target intermediate frame image based on the position of the display element in the change area and the position of the display element in the first target original frame image;
and when the type of the display element changed is an image change type, generating the target intermediate frame image based on the image of the display element in the change area and the image of the display element in the first target original frame image.
4. The image processing method according to claim 1, wherein the determining at least two target original frame images in the image data based on the relationship between the display frame rate and a preset target frame rate includes:
determining the number of target intermediate frame images to be inserted in unit time based on the relation between the display frame rate and a preset target frame rate;
and determining a target original frame image based on the number of target intermediate frame images and the display frame rate.
5. The image processing method according to claim 1, wherein the generating a corresponding target intermediate frame image from the arbitrary two adjacent target original frame images includes:
And mixing the pixel points in any two adjacent target original frame images according to a preset proportion to generate corresponding target intermediate frame images.
6. The image processing method according to claim 1, wherein the generating a corresponding target intermediate frame image from the arbitrary two adjacent target original frame images includes:
calculating a weighted average value of each pixel point in two adjacent target original frame images according to preset weights;
and generating a target intermediate frame image according to the weighted average value.
7. An image processing apparatus, comprising:
the image data acquisition module is used for acquiring image data to be displayed and determining the display frame rate of the image data;
a target original frame image determining module, configured to determine at least two target original frame images in the image data based on a relationship between the display frame rate and a preset target frame rate;
the target intermediate frame image generation module is used for generating corresponding target intermediate frame images according to any two adjacent target original frame images in the at least two target original frame images;
The target image generation module is used for inserting the target intermediate frame image between the corresponding any two adjacent target original frame images to generate target image data;
the generating a corresponding target intermediate frame image according to the arbitrary two adjacent target original frame images includes:
the target original frame image with the front display time in any two adjacent target original frame images is taken as a first target original frame image, and the target original frame image with the rear display time is taken as a second target original frame image;
determining a change area of the second target original frame image compared with the first target original frame image;
the target intermediate frame image is generated based on the change region and the first target original frame image.
8. An electronic device, comprising:
a processor, a memory, and a bus;
the bus being configured to connect the processor and the memory;
the memory being configured to store operation instructions; and
the processor being configured to execute, by calling the operation instructions, the image processing method according to any one of claims 1 to 6.
9. A computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the image processing method according to any one of claims 1 to 6.
CN202011594887.3A 2020-12-29 2020-12-29 Image processing method, device, electronic equipment and computer readable storage medium Active CN112596843B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011594887.3A CN112596843B (en) 2020-12-29 2020-12-29 Image processing method, device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112596843A CN112596843A (en) 2021-04-02
CN112596843B true CN112596843B (en) 2023-07-25

Family

ID=75203562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011594887.3A Active CN112596843B (en) 2020-12-29 2020-12-29 Image processing method, device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112596843B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113225546A (en) * 2021-04-25 2021-08-06 Oppo Guangdong Mobile Telecommunications Co., Ltd. Color temperature adjusting method and device, electronic equipment and computer readable storage medium
CN113837136B (en) * 2021-09-29 2022-12-23 Shenzhen Huili Technology Co., Ltd. Video frame insertion method and device, electronic equipment and storage medium
CN114205648A (en) * 2021-12-07 2022-03-18 NetEase (Hangzhou) Network Co., Ltd. Frame interpolation method and device
CN114025105B (en) * 2021-12-15 2023-11-28 Beijing Dajia Internet Information Technology Co., Ltd. Video processing method, device, electronic equipment and storage medium
CN114490671B (en) * 2022-03-31 2022-07-29 Beijing Huajian Yunding Technology Co., Ltd. Client-side same-screen data synchronization system

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6323856B1 (en) * 1998-03-19 2001-11-27 Shmuel Banitt Method for processing variable speed scenes for computer games
CN105760132A (en) * 2016-02-03 2016-07-13 Guangdong Oppo Mobile Telecommunications Co., Ltd. Method, device and mobile device for achieving frame rate dynamic refreshing
CN109348124A (en) * 2018-10-23 2019-02-15 Oppo Guangdong Mobile Telecommunications Co., Ltd. Image transfer method, device, electronic equipment and storage medium
CN109379625A (en) * 2018-11-27 2019-02-22 Oppo Guangdong Mobile Telecommunications Co., Ltd. Method for processing video frequency, device, electronic equipment and computer-readable medium
CN110267098A (en) * 2019-06-28 2019-09-20 Lianshang (Xinchang) Network Technology Co., Ltd. A kind of method for processing video frequency and terminal
CN111107427A (en) * 2019-11-20 2020-05-05 Oppo Guangdong Mobile Telecommunications Co., Ltd. Image processing method and related product
CN111323775A (en) * 2020-01-19 2020-06-23 Shanghai Eye Control Technology Co., Ltd. Image processing method, image processing device, computer equipment and storage medium
CN111741303A (en) * 2020-06-09 2020-10-02 Oppo Guangdong Mobile Telecommunications Co., Ltd. Deep video processing method and device, storage medium and electronic equipment
CN111862183A (en) * 2020-07-02 2020-10-30 Oppo Guangdong Mobile Telecommunications Co., Ltd. Depth image processing method and system, electronic device and storage medium
CN111918066A (en) * 2020-09-08 2020-11-10 Beijing ByteDance Network Technology Co., Ltd. Video encoding method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112596843B (en) Image processing method, device, electronic equipment and computer readable storage medium
CN109379625B (en) Video processing method, video processing device, electronic equipment and computer readable medium
US11706484B2 (en) Video processing method, electronic device and computer-readable medium
CN110377264B (en) Layer synthesis method, device, electronic equipment and storage medium
CN110377263B (en) Image synthesis method, image synthesis device, electronic equipment and storage medium
CN106611435B (en) Animation processing method and device
CN112614202B (en) GUI rendering display method, terminal, server, electronic equipment and storage medium
EP3886444A1 (en) Video processing method and apparatus, and electronic device and computer-readable medium
EP3568833B1 (en) Methods for dynamic image color remapping using alpha blending
CN110363831B (en) Layer composition method and device, electronic equipment and storage medium
US7616220B2 (en) Spatio-temporal generation of motion blur
CN113126937A (en) Display terminal adjusting method and display terminal
CN107707965B (en) Bullet screen generation method and device
WO2022218042A1 (en) Video processing method and apparatus, and video player, electronic device and readable medium
US11810524B2 (en) Virtual reality display device and control method thereof
JP3819873B2 (en) 3D image display apparatus and program
CN109091866B (en) Display control method and device, computer readable medium and electronic equipment
CN115775204A (en) Image super-resolution method, device, server and storage medium
CN109859328B (en) Scene switching method, device, equipment and medium
CN113810755B (en) Panoramic video preview method and device, electronic equipment and storage medium
CN116966546A (en) Image processing method, apparatus, medium, device, and program product
CN113934500A (en) Rendering method, rendering device, storage medium and electronic equipment
JP2018005226A (en) System and method for overlaying multi-source media in vram (video random access memory)
CN113691866B (en) Video processing method, device, electronic equipment and medium
WO2016163020A1 (en) Frame interpolation device, frame interpolation method and frame interpolation program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant