CN117611712A - Image processing method, device, electronic equipment and storage medium - Google Patents

Publication number: CN117611712A
Authority: CN (China)
Prior art keywords: channel data, data, channel, image, jpeg
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202311492984.5A
Other languages: Chinese (zh)
Inventors: 艾尼, 汪洋
Current assignee: Beijing Dajia Internet Information Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Beijing Dajia Internet Information Technology Co Ltd
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority: CN202311492984.5A (the priority date is an assumption and is not a legal conclusion)
Publication: CN117611712A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/005 - General purpose rendering architectures

Abstract

The application provides an image processing method and apparatus, an electronic device, and a storage medium, relating to the technical field of computer vision. The method includes: loading color YUV channel data and transparency A channel data of an image to be displayed into a memory; reading the YUV channel data and the A channel data of the image to be displayed from the memory by a graphics processing unit (GPU), and generating RGBA channel data based on the read YUV channel data and A channel data; and rendering and displaying the image to be displayed based on the RGBA channel data. By loading the YUV channel data and the A channel data into the memory instead of loading RGBA data directly, and generating the RGBA data required for rendering on the GPU, memory overhead can be greatly reduced while the display effect is preserved.

Description

Image processing method, device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer vision, and in particular, to an image processing method, an image processing device, an electronic device, and a storage medium.
Background
In some scenarios, an ambient effect for an animation or for the full screen is typically achieved with an image sequence; the longer the animation, the more images the corresponding sequence contains.
In the related art, PNG images are generally used because they are friendly to design tools and support transparency. However, the RGBA data contained in a PNG image occupies considerable storage space, so presenting an animation with an image sequence containing a large number of pictures leads to high memory overhead.
Disclosure of Invention
The application provides an image processing method, an image processing apparatus, an electronic device, and a storage medium, which can alleviate the problem of high memory overhead in the related art. The technical solutions are as follows:
in one aspect, the present application provides an image processing method, including:
loading color YUV channel data and transparency A channel data of an image to be displayed into a memory;
reading, by a graphics processing unit (GPU), the YUV channel data and the A channel data of the image to be displayed from the memory, and generating RGBA channel data based on the read YUV channel data and A channel data;
and rendering and displaying the image to be displayed based on the RGBA channel data.
In some embodiments, the generating RGBA channel data based on the read YUV channel data and the A channel data includes:
generating RGB channel data based on the read YUV channel data;
and generating the RGBA channel data based on the read A channel data and the RGB channel data.
In some embodiments, the generating the RGBA channel data based on the read A channel data and the RGB channel data includes:
performing inverse compression transformation on the read A channel data based on a target compression ratio to obtain A channel data of each pixel point in the image to be displayed;
the read A channel data are obtained by compressing the A channel data of each pixel point of the image to be displayed based on the target compression ratio;
and generating RGBA channel data of each pixel point based on the A channel data and the RGB channel data of each pixel point.
In some embodiments, the loading the color YUV channel data and the transparency a channel data of the image to be displayed into the memory includes:
receiving, from a server, first JPEG data and second JPEG data of the image to be displayed, wherein the first JPEG data carries the YUV channel data and the second JPEG data carries the A channel data;
and loading the first JPEG data and the second JPEG data into the memory.
In some embodiments, the reading, by the graphics processing unit (GPU), the YUV channel data and the A channel data of the image to be displayed from the memory includes:
reading the first JPEG data and the second JPEG data of the image to be displayed from the memory through the GPU;
wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel, and the second JPEG data is single-channel data comprising a Y channel;
acquiring the YUV channel data from a Y channel, a U channel and a V channel of the first JPEG data;
and acquiring the A channel data from the Y channel of the second JPEG data.
In another aspect, the present application provides an image processing method, including:
RGBA channel data of an image to be displayed are obtained;
generating first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel;
generating second JPEG data carrying the A channel data based on transparency A channel data in the RGBA channel data, wherein the second JPEG data is single-channel data comprising a Y channel;
and sending the first JPEG data and the second JPEG data of the image to be displayed to a terminal, so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory.
In another aspect, the present application provides an image processing apparatus, including:
the loading module is configured to load the color YUV channel data and the transparency A channel data of the image to be displayed into the memory;
a reading module configured to read, by a graphics processor GPU, the YUV channel data and the a channel data of the image to be displayed from the memory, and generate RGBA channel data based on the read YUV channel data and the a channel data;
and the rendering module is configured to render and display the image to be displayed based on the RGBA channel data.
In some embodiments, the reading module comprises:
a first reading unit configured to generate RGB channel data based on the read YUV channel data;
and a generating unit configured to generate the RGBA channel data based on the read A channel data and the RGB channel data.
In some embodiments, the generating unit is configured to:
performing inverse compression transformation on the read A channel data based on a target compression ratio to obtain A channel data of each pixel point in the image to be displayed;
the read A channel data are obtained by compressing the A channel data of each pixel point of the image to be displayed based on the target compression ratio;
and generating RGBA channel data of each pixel point based on the A channel data and the RGB channel data of each pixel point.
In some embodiments, the loading module comprises:
a receiving unit configured to receive, from a server, first JPEG data and second JPEG data of the image to be displayed, the first JPEG data carrying the YUV channel data and the second JPEG data carrying the A channel data;
and the loading unit is configured to load the first JPEG data and the second JPEG data into the memory.
In some embodiments, the rendering module comprises:
the second reading unit is configured to read the first JPEG data and the second JPEG data of the image to be displayed from the memory through the GPU;
wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel, and the second JPEG data is single-channel data comprising a Y channel;
An acquisition unit configured to acquire the YUV channel data from a Y channel, a U channel, and a V channel of the first JPEG data;
the acquisition unit is further configured to acquire the A channel data from the Y channel of the second JPEG data.
In another aspect, the present application provides an image processing apparatus, including:
an acquisition module configured to acquire RGBA channel data of an image to be displayed;
a first generation module configured to generate first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, the first JPEG data being multi-channel data including a Y channel, a U channel, and a V channel;
a second generation module configured to generate second JPEG data carrying the A channel data based on the transparency A channel data in the RGBA channel data, the second JPEG data being single-channel data including a Y channel;
and the sending module is configured to send the first JPEG data and the second JPEG data of the image to be displayed to a terminal so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory.
In another aspect, an electronic device is provided that includes a memory, a processor, and a computer program stored on the memory, the processor executing the computer program to implement the image processing method described above.
In another aspect, a computer readable storage medium is provided, on which a computer program is stored, which when executed by a processor implements the above-described image processing method.
In another aspect, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the above-described image processing method.
The technical solutions provided in the embodiments of the present application bring the following beneficial effects:
according to the image processing method, color YUV channel data and transparency A channel data of an image to be displayed are loaded into a memory; reading YUV channel data and A channel data of an image to be displayed from a memory by a graphic processor GPU, and generating RGBA channel data required for rendering based on the read YUV channel data and A channel data; and rendering and displaying the image to be displayed based on the RGBA channel data. According to the method and the device, the YUV channel data and the A channel data are loaded into the memory instead of directly loading the RGBA data, and RGBA data required by rendering are generated by utilizing the GPU, so that memory overhead is greatly saved on the premise of guaranteeing display effect.
According to the image processing method of the present application, RGBA channel data of an image to be displayed are obtained; first JPEG data carrying YUV channel data are generated based on the RGB channel data in the RGBA channel data, the first JPEG data being multi-channel data comprising a Y channel, a U channel, and a V channel; second JPEG data carrying the A channel data are generated based on the transparency A channel data in the RGBA channel data, the second JPEG data being single-channel data comprising a Y channel; and the first JPEG data and the second JPEG data of the image to be displayed are sent to the terminal, so that the terminal loads the YUV channel data and the A channel data into its memory instead of loading RGBA data directly, greatly reducing the terminal's memory overhead.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are required to be used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic view of an implementation environment of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 3 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 4 is a signaling interaction schematic diagram of an image processing method according to an embodiment of the present application;
fig. 5 is a schematic diagram of an image processing procedure according to an embodiment of the present application;
fig. 6 is a schematic diagram of an image processing procedure according to the related art provided in the embodiment of the present application;
fig. 7 is a schematic diagram of an image processing procedure according to the related art provided in the embodiment of the present application;
FIG. 8 is a schematic diagram comparing various images according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the drawings in the present application. It should be understood that the embodiments described below with reference to the drawings are exemplary descriptions for explaining the technical solutions of the embodiments of the present application, and the technical solutions of the embodiments of the present application are not limited.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. The terms "comprises" and "comprising" as used in the embodiments of the present application mean that the corresponding features may be implemented as the presented features, information, data, steps, or operations, but do not exclude implementation as other features, information, data, steps, or operations supported by the state of the art.
Fig. 1 is a schematic diagram of an implementation environment of the image processing method provided in the present application. As shown in fig. 1, the implementation environment includes: a server 101 and a terminal 102. The server 101 may be a background server of an application program. The terminal 102 has the application program installed, and the terminal 102 and the server 101 can perform data interaction based on the application program. The application program has an image sequence display function, and the terminal 102 can display an image sequence including at least one frame image in a page of the application program. For example, the application program may be a live-streaming application, a video application, or any other application.
The server 101 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (Content Delivery Networks), big data, and artificial intelligence platforms. The network may include, but is not limited to, wired networks (local area networks, metropolitan area networks, and wide area networks) and wireless networks (Bluetooth, Wi-Fi, and other networks implementing wireless communication).
By way of example, the terminal 102 may be a smartphone (e.g., an Android phone or an iOS phone), a tablet computer, a notebook computer, a digital broadcast receiver, an MID (Mobile Internet Device), a PDA (Personal Digital Assistant), a desktop computer, a vehicle-mounted terminal (e.g., a vehicle-mounted navigation terminal or a vehicle-mounted computer), a smart speaker, a smart watch, or the like. The terminal and the server may be directly or indirectly connected through wired or wireless communication, but are not limited thereto; the specific connection can be determined based on actual application scenario requirements and is not limited herein.
Fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present application. The execution subject of the method may be a terminal. As shown in fig. 2, the method includes the following steps.
Step 201, loading color YUV channel data and transparency A channel data of an image to be displayed into a memory;
step 202, reading the YUV channel data and the a channel data of the image to be displayed from the memory by a graphics processor GPU, and generating RGBA channel data based on the read YUV channel data and the a channel data;
and 203, rendering and displaying the image to be displayed based on the RGBA channel data.
According to the image processing method of the present application, color YUV channel data and transparency A channel data of an image to be displayed are loaded into a memory; the YUV channel data and A channel data of the image are read from the memory by a graphics processing unit (GPU), and the RGBA channel data required for rendering are generated based on the read data; the image is then rendered and displayed based on the RGBA channel data. By loading YUV channel data and A channel data into the memory instead of loading RGBA data directly, and generating the RGBA data required for rendering on the GPU, memory overhead is greatly reduced while the display effect is preserved.
In some embodiments, the generating RGBA channel data based on the read YUV channel data and the a channel data includes:
generating RGB channel data based on the read YUV channel data;
the RGBA channel data is generated based on the read A channel data and the RGB channel data.
In the embodiment of the application, by generating the RGB channel data based on the YUV channel data, RGBA channel data required by image rendering can be obtained based on the A channel data and the RGB channel data; because the memory occupied by the YUV channel data is smaller than that occupied by the RGB channel data, the memory occupied by the YUV channel data and the A channel data is smaller than that occupied by the RGBA channel data, and therefore memory overhead is saved.
In some embodiments, the generating the RGBA channel data based on the read a-channel data and the RGB channel data includes:
performing inverse compression transformation on the read A-channel data based on the target compression ratio to obtain A-channel data of each pixel point in the image to be displayed;
the read A channel data are obtained by compressing the A channel data of each pixel point of the image to be displayed based on the target compression ratio;
RGBA channel data of each pixel point is generated based on the A channel data and the RGB channel data of the pixel point.
In the embodiment of the present application, since the A channel data stored in the memory has been compressed according to the target compression ratio, performing inverse compression on the read A channel data according to the target compression ratio restores the A channel data of each pixel point, from which the RGBA channel data of each pixel point can be obtained. Because the display effect of an image is not sensitive to the precision of the A (transparency) channel data, compressing the transparency data further saves memory without affecting the display effect or losing color.
In some embodiments, the loading the color YUV channel data and the transparency a channel data of the image to be displayed into the memory includes:
receiving first JPEG data and second JPEG data of the image to be displayed, wherein the first JPEG data carries the YUV channel data, and the second JPEG data carries the A channel data;
and loading the first JPEG data and the second JPEG data into the memory.
In the embodiment of the application, the YUV channel data and the A channel data are stored by adopting the JPEG format, and the memory space occupied by the JPEG data is smaller than that occupied by the PNG format data, so that the memory is saved.
For example, for an image of the same size, the memory occupied by the YUV channel data and the a channel data in the JPEG format using the encoding format of YUV-i420 is 62.5% of the memory occupied by the RGBA channel data in the PNG format.
In some embodiments, the reading, by a graphics processor GPU, the YUV channel data and the a channel data of the image to be displayed from the memory includes:
reading the first JPEG data and the second JPEG data of the image to be displayed from the memory through the GPU;
wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel, and the second JPEG data is single-channel data comprising a Y channel;
acquiring the YUV channel data from the Y channel, the U channel and the V channel of the first JPEG data;
the A-channel data is acquired from the Y-channel of the second JPEG data.
In the embodiment of the application, the second JPEG data is single-channel data including only a Y channel. By storing the A channel data in the Y channel of the second JPEG data, the size of the second JPEG data is reduced as much as possible while the A channel data is stored completely; this meets the display requirement, saves memory overhead, and improves the practicability of image processing.
Fig. 3 is a schematic flow chart of an image processing method according to an embodiment of the present application. The method may be executed by a server. As shown in fig. 3, the method includes the following steps.
Step 301, acquiring RGBA channel data of an image to be displayed;
step 302, generating first JPEG data carrying YUV channel data based on RGB channel data in RGBA channel data, wherein the first JPEG data is multi-channel data comprising Y channel, U channel and V channel;
step 303, generating second JPEG data carrying the a-channel data based on the transparency a-channel data in the RGBA-channel data, the second JPEG data being single channel data including a Y-channel;
step 304, the first JPEG data and the second JPEG data of the image to be displayed are sent to a terminal, so that the terminal loads the YUV channel data and the a channel data of the image to be displayed into a memory.
According to the image processing method, RGBA channel data of an image to be displayed are obtained; generating first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel; generating second JPEG data carrying the A-channel data based on the transparency A-channel data in the RGBA-channel data, the second JPEG data being single-channel data including a Y-channel; the first JPEG data and the second JPEG data of the image to be displayed are sent to the terminal, so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory instead of directly loading RGBA data, and the memory overhead of the terminal is greatly saved.
Fig. 2 and Fig. 3 show the basic flow of the image processing method; the flow is further described below based on Fig. 4.
Referring to fig. 4, fig. 4 is a flowchart illustrating an image processing method implemented by interaction between a terminal and a server, according to an exemplary embodiment, the method comprising the steps of:
step 401, a server acquires RGBA channel data of an image to be displayed.
The RGBA channel data includes data for the R (Red), G (Green), and B (Blue) color channels and for the A (Alpha) transparency channel. RGBA channel data is the data required when rendering and displaying the image to be displayed.
In this step, the server may acquire PNG data including RGBA channel data of the image to be displayed, that is, RGBA channel data stored in PNG format.
Step 402, the server generates first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, the first JPEG data being multi-channel data including Y channel, U channel and V channel.
Wherein, in the YUV channel data, Y (Luminance or Luma) represents the brightness channel, while U and V represent the chrominance channels (Chrominance or Chroma), which describe the color and saturation of each pixel.
In this step, the server may convert RGB channel data into YUV channel data based on a color space conversion algorithm between RGB and YUV, and store the YUV channel data in a JPEG format to obtain first JPEG data. For example, the first JPEG data may be stored with a sampling ratio of 4:2:0, i.e., the first JPEG data is encoded with YUV-i 420.
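As an illustrative sketch only (the patent does not give its conversion formulas), the full-range BT.601 color space conversion commonly used for JPEG maps an RGB pixel to YUV as follows; the function name and per-pixel formulation here are assumptions:

```python
def rgb_to_yuv(r, g, b):
    """Convert one full-range RGB pixel to JPEG-style (BT.601) YUV.

    Y carries brightness; U (Cb) and V (Cr) carry chrominance,
    centered at 128 so that a gray pixel maps to U = V = 128.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    v = 0.5 * r - 0.418688 * g - 0.081312 * b + 128
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(y), clamp(u), clamp(v)
```

A server-side encoder would apply such a conversion per pixel before subsampling U and V to the 4:2:0 layout mentioned above.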
Step 403, the server generates second JPEG data carrying the a-channel data based on the transparency a-channel data in the RGBA-channel data, the second JPEG data being single channel data including a Y-channel.
The server may store the data of the a channel according to the JPEG format, and may specifically store the data of the a channel into the Y channel of the JPEG format, to obtain second JPEG data only including the Y channel, that is, second JPEG data only storing the data of the a channel in the Y channel.
In one possible manner, step 403 may include: the server can store the A channel data according to the JPEG format to obtain second JPEG data.
In another possible mode, the A channel data can be compressed first and the compressed data stored; accordingly, step 403 may include: the server compresses the A channel data according to the target compression ratio and stores the result in the JPEG format to obtain the second JPEG data. For example, the target compression may halve both the width and the height, that is, reduce the width and the height of the A channel data by 50% each. Of course, other compression ratios may also be used; this application is not limited in this regard.
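Purely for illustration, halving the width and height of the A channel can be sketched as a 2x2 box average over the transparency plane; the patent does not specify the downsampling filter, so the averaging choice and the function name are assumptions:

```python
def downsample_alpha(alpha):
    """Halve the width and height of an A (transparency) plane by
    averaging each 2x2 block of pixels (integer division).

    `alpha` is a list of rows, each a list of 0-255 values, with
    even width and height.
    """
    h, w = len(alpha), len(alpha[0])
    return [
        [
            (alpha[y][x] + alpha[y][x + 1]
             + alpha[y + 1][x] + alpha[y + 1][x + 1]) // 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]
```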
As shown in fig. 5, the server may split the RGBA channel data of a PNG image into first JPEG data containing the YUV channel data and second JPEG data containing the transparency data only in its Y channel. Since the accuracy requirement on transparency is not high, the width and height of the transparency channel data can each be reduced to half, i.e., to 0.5w × 0.5h. If the first JPEG data is encoded according to YUV-i420, its YUV channel data occupies w × h × 1.5, so the total memory occupied by the first JPEG data and the second JPEG data is w × h × 1.75, whereas the RGBA channel data in PNG format occupies w × h × 4. Therefore, in the present application, the memory occupied by the first JPEG data and the second JPEG data can be reduced to 43.75% of that of the RGBA channel data in PNG format in the related art.
In addition, if the A channel data is not compressed, the memory occupied by the first JPEG data and the second JPEG data is w × h × 2.5, which can still be reduced to 62.5% of that of the RGBA channel data in PNG format in the related art.
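The two ratios above follow from per-pixel byte counts; the arithmetic can be checked with a short sketch (assuming 1 byte per sample and YUV-i420 subsampling, where the U and V planes each hold a quarter as many samples as the Y plane):

```python
def memory_bytes(w, h, half_alpha=True):
    """Approximate in-memory size of the split representation:
    YUV-i420 (w*h bytes for Y plus w*h/4 each for U and V) plus an
    A plane, optionally stored at half width and half height.
    """
    yuv = w * h * 1.5                          # i420: Y + U/4 + V/4
    alpha = (w / 2) * (h / 2) if half_alpha else w * h
    return yuv + alpha

w, h = 1920, 1080
rgba = w * h * 4                               # uncompressed RGBA, 4 bytes/pixel
print(memory_bytes(w, h, True) / rgba)         # 0.4375
print(memory_bytes(w, h, False) / rgba)        # 0.625
```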
Step 404, the server sends the first JPEG data and the second JPEG data of the image to be displayed to the terminal, so that the terminal loads the YUV channel data and the a channel data of the image to be displayed into the memory.
Step 405, a terminal receives first JPEG data and second JPEG data of the image to be displayed, which are sent by a server, and loads the first JPEG data and the second JPEG data into the memory.
Wherein the first JPEG data carries the YUV channel data, and the second JPEG data carries the A channel data.
It should be noted that, step 405 is an implementation manner of "the terminal loads the color YUV channel data and the transparency a channel data of the image to be displayed into the memory". The YUV channel data and the a channel data are respectively stored in a JPEG format, and then the terminal loads the YUV channel data and the a channel data of the image to be displayed into the memory, that is, the terminal loads the first JPEG data including the YUV channel data and the second JPEG data including the a channel data into the memory.
Step 406, the terminal reads the YUV channel data and the a channel data of the image to be displayed from the memory through the GPU, and generates RGBA channel data based on the read YUV channel data and the a channel data.
The terminal can read the first JPEG data and the second JPEG data of the image to be displayed from the memory through the GPU (Graphics Processing Unit), and obtain the YUV channel data and the A channel data based on the first JPEG data and the second JPEG data.
In one possible implementation manner, in the step 406, the reading, by the graphics processor GPU, the YUV channel data and the a channel data of the image to be displayed from the memory may include the following steps:
the terminal reads the first JPEG data and the second JPEG data of the image to be displayed from the memory through the GPU;
wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel, and the second JPEG data is single-channel data comprising a Y channel;
the terminal acquires the YUV channel data from the Y channel, the U channel and the V channel of the first JPEG data;
the terminal acquires the A channel data from the Y channel of the second JPEG data.
The first JPEG data is data comprising 3 channels, namely a Y channel, a U channel and a V channel, so that the terminal can directly extract the YUV channel data from the first JPEG data.
Wherein, since the server stores the A-channel data into the Y-channel in the second JPEG data, the terminal can extract the A-channel data from the Y-channel.
In one possible implementation, the generating RGBA channel data based on the read YUV channel data and the a channel data includes the steps of:
The terminal generates RGB channel data based on the read YUV channel data;
the terminal generates the RGBA channel data based on the read A channel data and the RGB channel data.
The terminal can convert the YUV channel data into RGB channel data based on a YUV-to-RGB color space conversion algorithm, and then obtain the RGBA channel data based on the RGB channel data and the A channel data.
In one possible implementation, the terminal may generate RGB channel data pixel by pixel. Correspondingly, generating RGB channel data based on the read YUV channel data includes the following steps: for each pixel point in the image to be displayed, the terminal constructs an original vector based on the YUV channel data of that pixel point; the terminal then converts the original vector of each pixel point into a target vector through a target conversion matrix, where the target vector represents the RGB channel data of that pixel point.
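The per-pixel vector-matrix conversion described above can be sketched as follows. The BT.601 full-range coefficients are an assumption; the patent does not fix a particular target conversion matrix:

```python
import numpy as np

# Assumed target conversion matrix: BT.601 full-range YUV -> RGB.
TARGET_MATRIX = np.array([
    [1.0,  0.0,       1.402],     # R = Y + 1.402*(V-128)
    [1.0, -0.344136, -0.714136],  # G = Y - 0.344*(U-128) - 0.714*(V-128)
    [1.0,  1.772,     0.0],       # B = Y + 1.772*(U-128)
])

def pixel_yuv_to_rgb(y, u, v):
    """Build the original vector for one pixel and map it, via the target
    conversion matrix, to the target vector holding its RGB channel data."""
    original = np.array([y, u - 128.0, v - 128.0])
    target = TARGET_MATRIX @ original
    return np.clip(target, 0.0, 255.0)
```

In practice this matrix multiply runs in a GPU fragment shader over all pixels at once, which is what lets the conversion stay out of main memory.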
In one possible implementation, if the server compressed the A channel data, the data read from the second JPEG data is compressed data and needs to be decompressed. Correspondingly, generating the RGBA channel data based on the read A channel data and the RGB channel data includes the following steps:
The terminal performs inverse compression transformation on the read A-channel data based on a target compression ratio to obtain A-channel data of each pixel point in the image to be displayed;
the read A channel data are obtained by compressing the A channel data of each pixel point of the image to be displayed based on the target compression ratio;
the terminal generates RGBA channel data of each pixel point based on the A channel data and the RGB channel data of each pixel point.
In this step, the terminal may perform an inverse compression transformation on the read A channel data according to the target compression ratio, that is, the inverse of the compression performed with that ratio. For example, if the target compression ratio halves both the width and the height, the read A channel data is a quarter of the original size, and during the inverse compression transformation it can be restored to the original size based on the target compression ratio.
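Under the example compression ratio above (width and height both halved), the inverse compression transformation amounts to upsampling the A plane back to the original size. A sketch, with nearest-neighbour upsampling assumed as the inverse transform (the patent leaves the exact transform unspecified):

```python
import numpy as np

def restore_alpha(a_compressed, ratio=2):
    """Inverse compression transform: expand the downsampled A plane by
    `ratio` along both height and width using nearest-neighbour repetition,
    recovering one alpha value per pixel of the full-size image."""
    return np.repeat(np.repeat(a_compressed, ratio, axis=0), ratio, axis=1)
```

A smoother interpolation (e.g. bilinear, as GPU texture samplers provide for free) would serve equally well here; only the output size matters for recombining with the RGB data.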
And step 407, rendering and displaying the image to be displayed by the terminal based on the RGBA channel data.
In this step, the terminal can render the image to be displayed through the GPU, rendering and displaying it based on the RGB channel data and the A channel data of each pixel point in the image. Because the GPU rendering process can use video memory, main memory overhead is reduced.
In one possible manner, the image to be displayed may be any frame image in the image sequence. For example, the image sequence may be an animation including continuous multi-frame images, and the method of the present application may be adopted for each frame image in the image sequence, where the continuous multi-frame images are displayed by the GPU frame by frame based on YUV channel data and a channel data of each frame image, so as to implement a playing process of the animation.
For another example, the image sequence may be a full-screen special effect including multiple frames of images, for example, including 4 frames of images corresponding to the upper left, the upper right, the lower left and the lower right of the screen, and each frame of image may be displayed by using the method of the present application to realize full-screen display of the special effect.
As shown in fig. 6, in the related art, the method of the present application is not used, but the PNG format RGBA channel data is directly loaded into the memory, that is, the terminal directly loads the PNG format data sent by the server into the memory, so that the PNG format RGBA channel data needs to be stored in the memory, and the memory consumption is w×h×4 (bytes).
As shown in fig. 7, even without considering transparency information, the YUV channel data in JPEG format can be used directly; that is, the server sends the JPEG-format YUV channel data to the terminal, and at the decoding stage the terminal converts it into RGBA channel data or RGB channel data for rendering, with a corresponding memory consumption of w×h×4 bytes or w×h×3 bytes. Therefore, the memory overhead after decoding and reading into memory is still large.
As shown in fig. 8, from left to right are: the image corresponding to the RGBA channel data in PNG format, the image corresponding to the YUV channel data in JPEG format, and the image corresponding to the A channel data in JPEG format; the memory they occupy is, in the same order: w×h×4 bytes, w×h×1.5 bytes, and w×h×0.25 bytes. Accordingly, the memory occupied by the JPEG-format YUV channel data plus the JPEG-format A channel data adopted in the present application is w×h×1.75 bytes. Compared with the w×h×4 bytes occupied by the PNG-format RGBA channel data in the related art, the memory overhead of the present method is reduced to 43.75% of the related art.
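The memory figures above follow directly from per-plane byte counts, assuming 8 bits per sample, 4:2:0 chroma subsampling for the YUV data, and the quarter-resolution A plane:

```python
def memory_bytes(w, h):
    """Per-image decoded memory footprint, in bytes."""
    png_rgba   = w * h * 4     # 4 channels, 1 byte each
    jpeg_yuv   = w * h * 1.5   # 4:2:0: Y at w*h, U and V at w*h/4 each
    jpeg_alpha = w * h * 0.25  # A plane at half width and half height
    return png_rgba, jpeg_yuv + jpeg_alpha

rgba, yuv_plus_a = memory_bytes(1920, 1080)
saving = yuv_plus_a / rgba  # 1.75 / 4 = 0.4375, i.e. 43.75%
```

For a 1920×1080 frame this is roughly 8.3 MB for PNG RGBA versus about 3.6 MB for the YUV-plus-A scheme, which compounds quickly for multi-frame animation sequences.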
This is especially relevant in special-effects development scenes, where transparency information (the A channel data) is often required, and animations or full-screen atmosphere effects are often implemented with image sequences. The JPEG-format YUV channel data shown in fig. 8 carries no transparency information; by adding the single-channel second JPEG data carrying transparency information, the present application can greatly reduce memory overhead without affecting normal display requirements.
According to the image processing method, color YUV channel data and transparency A channel data of an image to be displayed are loaded into a memory; reading YUV channel data and A channel data of an image to be displayed from a memory by a graphic processor GPU, and generating RGBA channel data required for rendering based on the read YUV channel data and A channel data; and rendering and displaying the image to be displayed based on the RGBA channel data. According to the method and the device, the YUV channel data and the A channel data are loaded into the memory instead of directly loading the RGBA data, and RGBA data required by rendering are generated by utilizing the GPU, so that memory overhead is greatly saved on the premise of guaranteeing display effect.
Fig. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. As shown in fig. 9, the apparatus includes:
the loading module 901 is configured to load color YUV channel data and transparency A channel data of an image to be displayed into a memory;
a reading module 902 configured to read, by a graphics processor GPU, the YUV channel data and the A channel data of the image to be displayed from the memory, and generate RGBA channel data based on the read YUV channel data and A channel data;
the rendering module 903 is configured to render and display the image to be displayed based on the RGBA channel data.
In some embodiments, the reading module 902 includes:
a first reading unit configured to generate RGB channel data based on the read YUV channel data;
and a generating unit configured to generate the RGBA channel data based on the read a channel data and the RGB channel data.
In some embodiments, the generating unit is configured to:
performing inverse compression transformation on the read A-channel data based on a target compression ratio to obtain A-channel data of each pixel point in the image to be displayed;
The read A channel data are obtained by compressing the A channel data of each pixel point of the image to be displayed based on the target compression ratio;
and generating RGBA channel data of each pixel point based on the A channel data and the RGB channel data of each pixel point.
In some embodiments, the loading module 901 includes:
a receiving unit configured to receive the first JPEG data and the second JPEG data of the image to be displayed sent by a server, the first JPEG data carrying the YUV channel data and the second JPEG data carrying the A channel data;
and the loading unit is configured to load the first JPEG data and the second JPEG data into the memory.
In some embodiments, the reading module 902 further includes:
the second reading unit is configured to read the first JPEG data and the second JPEG data of the image to be displayed from the memory through the GPU;
wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel, and the second JPEG data is single-channel data comprising a Y channel;
an acquisition unit configured to acquire the YUV channel data from a Y channel, a U channel, and a V channel of the first JPEG data;
The acquisition unit is further configured to acquire the A channel data from a Y channel of the second JPEG data.
According to the image processing device, color YUV channel data and transparency A channel data of an image to be displayed are loaded into a memory; reading YUV channel data and A channel data of an image to be displayed from a memory by a graphic processor GPU, and generating RGBA channel data required for rendering based on the read YUV channel data and A channel data; and rendering and displaying the image to be displayed based on the RGBA channel data. According to the method and the device, the YUV channel data and the A channel data are loaded into the memory instead of directly loading the RGBA data, and RGBA data required by rendering are generated by utilizing the GPU, so that memory overhead is greatly saved on the premise of guaranteeing display effect.
Fig. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. As shown in fig. 10, the apparatus includes:
an acquisition module 1001 configured to acquire RGBA channel data of an image to be displayed;
a first generation module 1002 configured to generate first JPEG data carrying YUV channel data, which is multi-channel data including Y-channel, U-channel, and V-channel, based on RGB channel data in the RGBA channel data;
A second generation module 1003 configured to generate second JPEG data carrying the A channel data based on the transparency A channel data in the RGBA channel data, the second JPEG data being single-channel data including a Y channel;
the sending module 1004 is configured to send the first JPEG data and the second JPEG data of the image to be displayed to a terminal, so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory.
The image processing device provided by the embodiment of the application obtains RGBA channel data of an image to be displayed; generating first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel; generating second JPEG data carrying the A-channel data based on transparency A-channel data in the RGBA-channel data, wherein the second JPEG data is single-channel data comprising a Y channel; and sending the first JPEG data and the second JPEG data of the image to be displayed to a terminal, so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory instead of directly loading RGBA data, and the memory overhead of the terminal is greatly saved.
The apparatus of the embodiments of the present application may perform the method provided by the embodiments of the present application, and implementation principles of the method are similar, and actions performed by each module in the apparatus of each embodiment of the present application correspond to steps in the method of each embodiment of the present application, and detailed functional descriptions of each module of the apparatus may be referred to in the corresponding method shown in the foregoing, which is not repeated herein.
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 11, the electronic device includes a memory, a processor, and a computer program stored on the memory; the processor executes the computer program to implement the steps of the image processing method. Compared with the related art, the following can be achieved:
according to the image processing method, color YUV channel data and transparency A channel data of an image to be displayed are loaded into a memory; reading YUV channel data and A channel data of an image to be displayed from a memory by a graphic processor GPU, and generating RGBA channel data required for rendering based on the read YUV channel data and A channel data; and rendering and displaying the image to be displayed based on the RGBA channel data. According to the method and the device, the YUV channel data and the A channel data are loaded into the memory instead of directly loading the RGBA data, and RGBA data required by rendering are generated by utilizing the GPU, so that memory overhead is greatly saved on the premise of guaranteeing display effect.
According to the image processing method, RGBA channel data of an image to be displayed are obtained; generating first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel; generating second JPEG data carrying the A-channel data based on transparency A-channel data in the RGBA-channel data, wherein the second JPEG data is single-channel data comprising a Y channel; and sending the first JPEG data and the second JPEG data of the image to be displayed to a terminal, so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory instead of directly loading RGBA data, and the memory overhead of the terminal is greatly saved.
In an alternative embodiment, an electronic device is provided, as shown in fig. 11, the electronic device 1100 shown in fig. 11 includes: a processor 1101 and a memory 1103. The processor 1101 is coupled to a memory 1103, such as via a bus 1102. Optionally, the electronic device 1100 may further include a transceiver 1104, where the transceiver 1104 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data, etc. It should be noted that, in practical applications, the transceiver 1104 is not limited to one, and the structure of the electronic device 1100 is not limited to the embodiments of the present application.
The processor 1101 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor 1101 may also be a combination that performs computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
The bus 1102 may include a path for transferring information between the above components. The bus 1102 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 1102 may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 11, but this does not mean there is only one bus or only one type of bus.
The memory 1103 may be a ROM (Read-Only Memory) or other type of static storage device capable of storing static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device capable of storing information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store a computer program and that can be read by a computer, without limitation.
The memory 1103 is used for storing a computer program for executing the embodiments of the present application, and is controlled to be executed by the processor 1101. The processor 1101 is configured to execute a computer program stored in the memory 1103 to implement the steps shown in the foregoing method embodiments.
Electronic devices include, but are not limited to, a server, a terminal, a cloud computing center device, and the like.
Embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, may implement the steps and corresponding content of the foregoing method embodiments.
The embodiments of the present application also provide a computer program product, which includes a computer program, where the computer program can implement the steps of the foregoing method embodiments and corresponding content when executed by a processor.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. The terms "comprises" and "comprising" as used in the embodiments of the present application mean that the corresponding features may be implemented as the presented features, information, data, steps, and operations, but do not exclude implementation as other features, information, data, steps, or operations supported by the state of the art.
The terms "first," "second," "third," "fourth," "1," "2," and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the present application described herein may be implemented in other sequences than those illustrated or otherwise described.
It should be understood that, although the flowcharts of the embodiments of the present application indicate the respective operation steps by arrows, the order of implementation of these steps is not limited to the order indicated by the arrows. In some implementations of embodiments of the present application, the implementation steps in the flowcharts may be performed in other orders as desired, unless explicitly stated herein. Furthermore, some or all of the steps in the flowcharts may include multiple sub-steps or multiple stages based on the actual implementation scenario. Some or all of these sub-steps or phases may be performed at the same time, or each of these sub-steps or phases may be performed at different times, respectively. In the case of different execution time, the execution sequence of the sub-steps or stages may be flexibly configured according to the requirement, which is not limited in the embodiment of the present application.
The foregoing is merely an optional implementation of the application. It should be noted that, for those skilled in the art, other similar implementations adopted based on the technical ideas of the application, without departing from those ideas, also belong to the protection scope of the embodiments of the application.

Claims (10)

1. An image processing method, the method comprising:
the method comprises the steps of loading color YUV channel data and transparency A channel data of an image to be displayed into a memory;
reading the YUV channel data and the A channel data of the image to be displayed from the memory by a graphic processor GPU, and generating RGBA channel data based on the read YUV channel data and the A channel data;
and rendering and displaying the image to be displayed based on the RGBA channel data.
2. The method of claim 1, wherein the generating RGBA channel data based on the read YUV channel data and the a channel data comprises:
generating RGB channel data based on the read YUV channel data;
the RGBA channel data is generated based on the read a channel data and the RGB channel data.
3. The method of claim 2, wherein the generating the RGBA channel data based on the read a channel data and the RGB channel data comprises:
performing inverse compression transformation on the read A-channel data based on a target compression ratio to obtain A-channel data of each pixel point in the image to be displayed;
the read A channel data are obtained by compressing the A channel data of each pixel point of the image to be displayed based on the target compression ratio;
and generating RGBA channel data of each pixel point based on the A channel data and the RGB channel data of each pixel point.
4. The method according to claim 1, wherein loading the color YUV channel data and the transparency a channel data of the image to be displayed into the memory comprises:
receiving first JPEG data and second JPEG data of the image to be displayed sent by a server, wherein the first JPEG data carries the YUV channel data and the second JPEG data carries the A channel data;
and loading the first JPEG data and the second JPEG data into the memory.
5. The method according to claim 1 or 4, wherein the reading, by a graphics processor GPU, of the YUV channel data and the A channel data of the image to be displayed from the memory comprises:
Reading the first JPEG data and the second JPEG data of the image to be displayed from the memory through the GPU;
wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel, and the second JPEG data is single-channel data comprising a Y channel;
acquiring the YUV channel data from a Y channel, a U channel and a V channel of the first JPEG data;
and acquiring the A channel data from the Y channel of the second JPEG data.
6. An image processing method, the method comprising:
RGBA channel data of an image to be displayed are obtained;
generating first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, wherein the first JPEG data is multi-channel data comprising a Y channel, a U channel and a V channel;
generating second JPEG data carrying the A-channel data based on transparency A-channel data in the RGBA-channel data, wherein the second JPEG data is single-channel data comprising a Y channel;
and sending the first JPEG data and the second JPEG data of the image to be displayed to a terminal, so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory.
7. An image processing apparatus, characterized in that the apparatus comprises:
the loading module is configured to load the color YUV channel data and the transparency A channel data of the image to be displayed into the memory;
a reading module configured to read, by a graphics processor GPU, the YUV channel data and the A channel data of the image to be displayed from the memory, and generate RGBA channel data based on the read YUV channel data and the A channel data;
and the rendering module is configured to render and display the image to be displayed based on the RGBA channel data.
8. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module configured to acquire RGBA channel data of an image to be displayed;
a first generation module configured to generate first JPEG data carrying YUV channel data based on RGB channel data in the RGBA channel data, the first JPEG data being multi-channel data including a Y channel, a U channel, and a V channel;
a second generation module configured to generate second JPEG data carrying the A channel data based on the transparency A channel data in the RGBA channel data, the second JPEG data being single-channel data including a Y channel;
And the sending module is configured to send the first JPEG data and the second JPEG data of the image to be displayed to a terminal so that the terminal loads the YUV channel data and the A channel data of the image to be displayed into a memory.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory, characterized in that the processor executes the computer program to implement the image processing method of any one of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the image processing method of any one of claims 1 to 6.
CN202311492984.5A 2023-11-09 2023-11-09 Image processing method, device, electronic equipment and storage medium Pending CN117611712A (en)

Publications (1)

Publication Number: CN117611712A; Publication Date: 2024-02-27



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination