CN108881920B - Method, terminal and server for transmitting video information - Google Patents

Method, terminal and server for transmitting video information

Info

Publication number
CN108881920B
Authority
CN
China
Prior art keywords
data
server
channel data
alpha
channel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710330725.0A
Other languages
Chinese (zh)
Other versions
CN108881920A (en)
Inventor
杨帆
荆彦青
魏学峰
曹文升
耿天平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201710330725.0A priority Critical patent/CN108881920B/en
Priority to PCT/CN2018/085445 priority patent/WO2018205878A1/en
Publication of CN108881920A publication Critical patent/CN108881920A/en
Application granted granted Critical
Publication of CN108881920B publication Critical patent/CN108881920B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40 - Network security protocols
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 - Network streaming of media packets
    • H04L65/75 - Media network packet handling
    • H04L65/762 - Media network packet handling at the source
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 - Network streaming of media packets
    • H04L65/75 - Media network packet handling
    • H04L65/764 - Media network packet handling at the destination
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46 - Embedding additional information in the video signal during the compression process
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 - Details of colour television systems
    • H04N9/64 - Circuits for processing colour signals

Abstract

The embodiment of the invention discloses a method for transmitting video data. The data received by a server comprises YUV channel data and Alpha channel information; the YUV channel data is decoded to obtain RGB format data, and the RGB format data is processed with the Alpha channel information, so that the background image of the displayed target image can be adjusted, the foreground image of the target image blends better with the overall background shown on the display, and the user obtains a better sense of immersion. The method comprises the following steps: a terminal acquires image data of a target object, where the image data comprises RGB format data and Alpha channel information; the terminal encodes the RGB format data to obtain YUV channel data; the terminal appends the Alpha channel information to the YUV channel data to obtain encoded data; and the terminal sends the encoded data to the server, where the encoded data is used by the server to decode and display the image of the target object.

Description

Method, terminal and server for transmitting video information
Technical Field
The present invention relates to the field of computers, and in particular, to a method, a terminal, and a server for transmitting video information.
Background
In the existing video transmission technology, a terminal transcodes and compresses the RGBA color data captured by a camera, where RGBA is a color space representing Red, Green, Blue, and Alpha. During this processing, the Alpha channel information is lost: only the RGB channels are transcoded and compressed, being converted into YUV data. The YUV data is then compressed for network broadcast during a live video broadcast. After a user receives the compressed file transmitted over the network and decodes and transcodes the data, only the RGB data is obtained for display.
However, since the Alpha channel information is not preserved during transcoding, the background in the anchor's image can only be displayed as captured, and subsequent background removal, Alpha blending, and template processing cannot be performed.
Disclosure of Invention
The embodiments of the invention provide a method, a terminal, and a server for transmitting video information, which preserve the Alpha channel information by appending and recombining it. The data received by the server comprises YUV channel data and Alpha channel information; the YUV channel data is decoded to obtain RGB format data, and the RGB format data is processed with the Alpha channel information, so that the background image of the displayed target image can be adjusted, the foreground image of the target image blends better with the overall background shown on the display, and the user obtains a better sense of immersion.
Preserving the Alpha channel information in the encoded data also facilitates subsequent processing after the image is decoded and played.
A first aspect of an embodiment of the present invention provides a method for transmitting video data, where the method may include:
the method comprises the steps that a terminal obtains image data of a target object, wherein the image data comprises RGB format data and Alpha channel information;
the terminal carries out coding processing on the RGB format data to obtain YUV channel data;
the terminal adds the Alpha channel information into the YUV channel data to obtain coded data;
the terminal transmits the encoded data to a server, the encoded data being used for the server to decode and display the image of the target object.
A second aspect of the embodiments of the present invention provides a method for transmitting video data, which may include:
the method comprises the steps that a server receives coded data sent by a terminal, wherein the coded data comprise YUV channel data and Alpha channel information;
the server decodes the YUV channel data to obtain RGB format data;
the server displays an image of a target object according to the RGB format data and the Alpha information.
A third aspect of the embodiments of the present invention provides a terminal, which may include:
an acquisition module, used for acquiring image data of a target object, wherein the image data comprises RGB format data and Alpha channel information;
the encoding module is used for encoding the RGB format data to obtain YUV channel data;
the adding module is used for adding the Alpha channel information into the YUV channel data to obtain coded data;
and the sending module is used for sending the coded data to a server, and the coded data is used for decoding and displaying the image of the target object by the server.
A fourth aspect of the embodiments of the present invention provides a server, which may include:
the receiving module is used for receiving the coded data sent by the terminal, and the coded data comprises YUV channel data and Alpha channel information;
the decoding module is used for decoding the YUV channel data to obtain RGB format data;
and the display module is used for displaying the image of the target object according to the RGB format data and the Alpha information.
According to the technical scheme, the embodiment of the invention has the following advantages:
In the embodiment of the invention, a terminal acquires image data of a target object, where the image data comprises RGB format data and Alpha channel information; the terminal encodes the RGB format data to obtain YUV channel data; the terminal appends the Alpha channel information to the YUV channel data to obtain encoded data; and the terminal sends the encoded data to a server, the encoded data being used by the server to decode and display the image of the target object. The server receives the encoded data sent by the terminal, where the encoded data comprises YUV channel data and Alpha channel information; the server decodes the YUV channel data to obtain RGB format data; and the server displays an image of the target object according to the RGB format data and the Alpha information. Because the encoded data sent by the terminal includes not only the YUV channel data but also the Alpha channel information, the server can decode the YUV channel data to obtain RGB format data and then use the Alpha channel information to perform background removal, Alpha blending, template processing, and the like on the RGB format data, so that the background image of the target image can be adjusted, the foreground image of the target image blends better with the overall background shown on the display, and the user obtains a better sense of immersion.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for describing the embodiments and the prior art are briefly introduced below. Obviously, the drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these drawings without creative efforts.
FIG. 1 is a diagram illustrating a prior art display of a target image;
FIG. 2 is a schematic structural diagram of an embodiment of the present invention;
fig. 3 is a schematic diagram of an embodiment of a method for transmitting video information according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a memory for YUV channel data according to an embodiment of the present invention;
fig. 4(a) is a schematic memory diagram of appending and recombining Alpha channel information at the upper end position of YUV channel data in the embodiment of the present invention;
fig. 4(b) is a schematic memory diagram of appending and recombining Alpha channel information at the lower end position of YUV channel data in the embodiment of the present invention;
fig. 4(c) is a schematic memory diagram of appending and recombining Alpha channel information at the left position of YUV channel data according to an embodiment of the present invention;
fig. 4(d) is a schematic memory diagram of appending and recombining Alpha channel information at the right position of YUV channel data according to an embodiment of the present invention;
FIG. 5(a) is a diagram of an original image according to an embodiment of the present invention;
fig. 5(b) is a schematic diagram of RGB transcoding an original image according to an embodiment of the present invention;
fig. 5(c) is a schematic diagram of adding Alpha data at the upper end of YUV channel data according to the embodiment of the present invention;
fig. 5(d) is a schematic diagram of adding Alpha data at the lower end of YUV channel data according to the embodiment of the present invention;
fig. 5(e) is a schematic diagram of adding Alpha data at the left position of YUV channel data according to the embodiment of the present invention;
FIG. 5(f) is a diagram illustrating the addition of Alpha data at the right side of YUV channel data according to the embodiment of the present invention;
FIG. 5(g) is an example diagram of restoring the appended Alpha channel information in an embodiment of the present invention;
FIG. 5(h) is a diagram illustrating the display of the target image on another terminal in the prior art;
FIG. 5(i) is a schematic diagram of the target image displayed on another terminal in an embodiment of the present invention;
FIG. 6 is a schematic illustration of an experience of an embodiment of the present invention;
FIG. 7 is a timing diagram of one embodiment of the present invention;
FIG. 8 is a diagram of a terminal according to an embodiment of the invention;
FIG. 9 is a diagram of one embodiment of a server, in accordance with the present invention;
FIG. 10 is a diagram of a terminal according to an embodiment of the invention;
fig. 11 is a schematic diagram of an embodiment of a server according to the present invention.
Detailed Description
The embodiment of the invention provides a method, a terminal and a server for transmitting video information, which are used for preserving and transmitting Alpha channel information. By processing the RGB format data with the Alpha channel information, the background image of the displayed target image can be adjusted, so that the foreground image of the target image blends better with the overall background shown on the display and the user obtains a better sense of immersion.
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
The abbreviations and key terms referred to in the embodiments of the present application are described briefly as follows:
alpha channel: the Alpha channel is a gray channel in color, which is used to record transparency information in an image, i.e., the Alpha channel is an 8-bit gray channel, which records transparency information in an image with 256 levels of gray, defining transparent, opaque, and translucent areas, where black indicates transparent, white indicates opaque, and gray indicates translucent.
QQTalk: QQTalk (QT) is a team voice communication tool that supports multi-party voice calls. It is aimed mainly at game players who need team coordination, and is also suitable for office and home users who need multi-party voice communication. The client is lightweight and flexible, simple to operate, performs well, occupies very little bandwidth, can traverse firewalls, and provides clear, high-quality voice service.
YUV: one color coding method (Phase Alternation Line, PAL) adopted by the european television system is the color space adopted by PAL and the Sequential Couleur Avec Memoire (SECAM) analog color television system. In modern color television systems, a three-tube color camera or a color CCD camera is usually used for image capture, then the obtained color image signals are subjected to color separation and respective amplification and correction to obtain RGB, and then a luminance signal Y and two color difference signals B-Y (i.e., U) and R-Y (i.e., V) are obtained through a matrix conversion circuit, and finally a transmitting end respectively encodes the luminance signal and the color difference signals and transmits the encoded signals through the same channel. This color representation is called YUV color space representation. The importance of using the YUV color space is that its luminance signal Y and chrominance signal U, V are separate.
RGBA: RGBA is a color space representing Red, Green, Blue, and Alpha.
RGB: the RGB color scheme is a color standard in the industry, and various colors are obtained by changing three color channels of red (R), green (G) and blue (B) and superimposing the three color channels on each other, where RGB represents colors of the three channels of red, green and blue, and the color standard almost includes all colors that can be perceived by human vision, and is one of the most widely used color systems at present.
In the video transmission technology of the existing QT product, the color data captured by the terminal through the camera is transcoded and compressed as RGBA. The Alpha channel information is lost in this processing: only the RGB channels are transcoded and compressed, being converted into YUV data. The YUV data is then compressed for network broadcast during a live video broadcast. After the user receives the compressed file transmitted over the network and decodes and transcodes the data, only the RGB data is obtained for display. Here, RGBA is a color space representing Red, Green, Blue, and Alpha.
However, since the Alpha channel information is not preserved during transcoding, the background in the anchor's image can only be displayed as captured, and subsequent background removal, Alpha blending, and template processing cannot be performed. Fig. 1 is a schematic diagram of a conventionally displayed target image. It can be seen that the background behind the anchor is a patterned backdrop that does not blend with the overall background shown on the display, which looks visually jarring to the user.
In the technical solution of the present invention, fig. 2 is a schematic structural diagram of an application of the embodiment of the present invention. A camera in each player's environment is connected to a personal computer (PC)/client, and the player's intelligent terminal is connected to the server through a network. By appending and recombining the Alpha channel information, the Alpha channel information is preserved, which facilitates subsequent processing when the image is decoded and played, such as background removal, Alpha blending, and template processing. Second, the Alpha channel information of each frame does not need to be stored separately as a picture sequence or compressed into a separate video. Finally, the Alpha channel information of each frame can be obtained directly when that frame is decoded, without extra lookup or synchronization processing.
The following further describes the embodiments of the present invention by way of an example. Fig. 3 is a schematic diagram of an embodiment of a method for transmitting video information in an embodiment of the present invention, including:
301. the method comprises the steps that a terminal obtains image data of a target object, wherein the image data comprises RGB format data and Alpha channel information;
In the embodiment of the invention, the terminal acquires the image data of the target object, where the image data comprises RGB format data and Alpha channel information. First, after the game is started, the player selects a room to start playing, and at the same time the camera on the terminal is initialized; the terminal then acquires the image data of the user, where the image data is RGBA format data, RGBA being channel data representing Red, Green, Blue, and Alpha.
In practical applications, a camera is connected to the terminal, and the terminal can acquire the image data of each player through the camera. The image data is RGBA format data, that is, RGB format data plus Alpha channel information. The terminal may be any terminal device such as a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, or a vehicle-mounted computer.
302. The terminal carries out encoding processing on the RGB format data to obtain YUV channel data;
In the embodiment of the invention, after the terminal acquires the image data of the target object, where the image data comprises RGB format data and Alpha channel information, the terminal encodes the RGB format data to obtain YUV channel data.
It should be understood that the terminal acquires the image data of the target object and displays the target object locally according to that image data; at the same time, the RGB format data in the image data can be encoded. Because the data provided by the camera is in RGBA format, the terminal processes each data channel separately. The terminal transcodes the RGB channel image information in the RGBA data into YUV format data such as NV21, NV12, or YUV420 according to the encoding requirement.
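As a minimal sketch of this step, the following assumes an H x W x 3 NumPy RGB frame and uses full-range BT.601-style coefficients with simple 2x2 chroma averaging to produce planar YUV420 (I420-style) data; the actual encoder, target format (NV21, NV12, or YUV420), and coefficient set are implementation choices rather than something fixed by the embodiment:

```python
import numpy as np

def rgb_to_yuv420(rgb: np.ndarray):
    """Convert an HxWx3 uint8 RGB frame into planar YUV420 (I420-style) planes.

    Sketch only: uses full-range BT.601-style coefficients and simple 2x2
    averaging for chroma subsampling; H and W are assumed to be even.
    """
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 128.0 + 0.564 * (b - y)   # scale the B-Y color difference into 0..255
    v = 128.0 + 0.713 * (r - y)   # scale the R-Y color difference into 0..255
    # 4:2:0 subsampling: average every 2x2 block of the chroma planes.
    h, w = y.shape
    u_sub = u.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    v_sub = v.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    to_u8 = lambda p: np.clip(np.rint(p), 0, 255).astype(np.uint8)
    return to_u8(y), to_u8(u_sub), to_u8(v_sub)
```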
303. The terminal adds the Alpha channel information into the YUV channel data to obtain coded data;
in the embodiment of the invention, after the terminal codes RGB format data to obtain YUV channel data, the terminal adds Alpha channel information to the YUV channel data to obtain coded data. Specifically, the method may include the following steps:
(1) the terminal adds the Alpha channel information to the upper end position of the YUV channel data, so that a background image displayed on the server is displayed at the upper end position of the foreground image, wherein the background image corresponds to the Alpha channel information, and the foreground image corresponds to the YUV channel data.
(2) The terminal adds the Alpha channel information to the lower end position of the YUV channel data, so that a background image displayed on the server is displayed at the lower end position of a foreground image, wherein the background image corresponds to the Alpha information, and the foreground image corresponds to the YUV channel data.
(3) The terminal adds the Alpha channel information to the left position of the YUV channel data, so that a background image displayed on the server is displayed at the left position of the foreground image, wherein the background image corresponds to the Alpha information, and the foreground image corresponds to the YUV channel data.
(4) And the terminal adds the Alpha channel information to the right position of the YUV channel data, so that a background image displayed on the server is displayed at the right position of the foreground image, wherein the background image corresponds to the Alpha information, and the foreground image corresponds to the YUV channel data.
That is, for the Alpha channel information in the RGBA data, the terminal may select an appending position according to the transcoding format and the encoder format described above and add the information to the YUV channel data; fig. 4 is a schematic memory diagram of the YUV channel data. There are many ways of appending the data to obtain the encoded data; the following are described only by way of example: as shown in fig. 4(a), a schematic memory diagram of appending and recombining the Alpha channel information at the upper end position of the YUV channel data; as shown in fig. 4(b), a schematic memory diagram of appending and recombining the Alpha channel information at the lower end position of the YUV channel data; as shown in fig. 4(c), a schematic memory diagram of appending and recombining the Alpha channel information at the left position of the YUV channel data; as shown in fig. 4(d), a schematic memory diagram of appending and recombining the Alpha channel information at the right position of the YUV channel data.
It should be noted that the Alpha channel information may be divided into 3 parts, which are respectively added at the upper end position of the Y channel data, the upper end position of the U channel data, and the upper end position of the V channel data; or respectively added at the lower end position of the Y channel data, the lower end position of the U channel data, and the lower end position of the V channel data; or respectively added at the left position of the Y channel data, the left position of the U channel data, and the left position of the V channel data; or respectively added at the right position of the Y channel data, the right position of the U channel data, and the right position of the V channel data.
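A minimal memory-layout sketch of this per-plane appending, assuming the planar YUV420 output of the previous sketch and an Alpha plane split into roughly equal thirds; the exact split sizes, the chosen side (top/bottom/left/right), and any alignment padding are design choices the embodiment leaves open:

```python
import numpy as np

def append_alpha_to_yuv(y: np.ndarray, u: np.ndarray, v: np.ndarray,
                        alpha: np.ndarray) -> bytes:
    """Append the Alpha plane, split into three chunks, after the Y, U and V planes.

    Sketch only: the Alpha bytes are split into roughly equal thirds and each
    third is appended to the end of one plane buffer, then the three extended
    planes are concatenated into a single buffer handed to the encoder. The
    encoder parameters must be adjusted for the enlarged plane sizes.
    """
    a_parts = np.array_split(alpha.astype(np.uint8).ravel(), 3)
    planes = [y.astype(np.uint8).ravel(),
              u.astype(np.uint8).ravel(),
              v.astype(np.uint8).ravel()]
    extended = [np.concatenate([plane, part]) for plane, part in zip(planes, a_parts)]
    return b"".join(p.tobytes() for p in extended)
```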
The addition of the Alpha channel information to a YUV image can be explained with reference to a specific RGBA image. Fig. 5(a) is a schematic diagram of an original image. First, the original image is RGB-transcoded, as shown in fig. 5(b), which is a schematic diagram of the original image after RGB transcoding. Then the terminal can add the Alpha information to the transcoded image to complete the final YUV recombination; the four data recombinations mentioned above respectively correspond to the following schematic diagrams: fig. 5(c) is a schematic diagram of adding the Alpha data at the upper end position of the YUV channel data; fig. 5(d) is a schematic diagram of adding the Alpha data at the lower end position of the YUV channel data; fig. 5(e) is a schematic diagram of adding the Alpha data at the left position of the YUV channel data; and fig. 5(f) is a schematic diagram of adding the Alpha data at the right position of the YUV channel data.
It should be understood that, here, the terminal may perform subsequent conventional compression after obtaining the YUV channel image data, but the width and height of the YUV channel data may be correspondingly expanded due to the storage of Alpha information, and at this time, only the corresponding parameters of the encoding need to be adjusted.
304. The terminal sends the coded data to the server;
in an embodiment of the present invention, step 304 may include step a and step b, as follows:
a. the terminal transmits the encoded data to the server, and the encoded data is used for decoding and displaying the image of the target object by the server.
b. And the server receives the coded data sent by the terminal, wherein the coded data comprises YUV channel data and Alpha channel information.
305. The server decodes the YUV channel data to obtain RGB format data;
in the embodiment of the invention, because the coded data received by the server comprises YUV channel data and Alpha channel information, the server decodes the YUV channel data to obtain RGB format data.
In this process, the server decodes and transcodes the YUV channel data. After acquiring the encoded data, the server can extract the Alpha channel information and then fill the extracted Alpha channel information into the transcoded RGB format data; at this point, the complete RGBA recovery is finished. An example of restoring the appended Alpha channel information is shown in fig. 5(g).
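Under the same assumptions as the appending sketch above, a minimal recovery sketch for this step; the yuv420_to_rgb helper named here is hypothetical and stands in for whatever decoder/transcoder the receiving side actually uses:

```python
import numpy as np

def recover_rgba(encoded: bytes, width: int, height: int, yuv420_to_rgb) -> np.ndarray:
    """Strip the appended Alpha bytes, decode the YUV planes, and rebuild RGBA.

    Sketch only: assumes the layout produced by append_alpha_to_yuv() above
    (each plane followed by one third of the Alpha plane) and that the original
    frame dimensions are known. yuv420_to_rgb is a hypothetical callable that
    turns the Y/U/V planes back into an HxWx3 uint8 RGB image.
    """
    buf = np.frombuffer(encoded, dtype=np.uint8)
    y_size = width * height
    c_size = (width // 2) * (height // 2)
    q, r = divmod(width * height, 3)                 # same thirds as np.array_split
    a_sizes = [q + 1 if i < r else q for i in range(3)]

    planes, alpha_chunks, offset = [], [], 0
    for plane_size, a_size in zip([y_size, c_size, c_size], a_sizes):
        planes.append(buf[offset:offset + plane_size])
        offset += plane_size
        alpha_chunks.append(buf[offset:offset + a_size])
        offset += a_size

    y = planes[0].reshape(height, width)
    u = planes[1].reshape(height // 2, width // 2)
    v = planes[2].reshape(height // 2, width // 2)
    alpha = np.concatenate(alpha_chunks).reshape(height, width)
    rgb = yuv420_to_rgb(y, u, v)                     # decode/transcode YUV back to RGB
    return np.dstack([rgb, alpha])                   # fill Alpha back in: RGBA restored
```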
306. The server displays an image of the target object on the display according to the RGB format data and Alpha information.
In the embodiment of the invention, after the server decodes the YUV channel data to obtain RGB format data, the server displays the image of the target object on the display according to the RGB format data and the Alpha information. Specifically, this may include: the server fills the RGB format data according to the Alpha information to obtain display data, and displays the image of the target object according to the display data. Fig. 5(h) is a schematic diagram of the target image displayed on the display screen of another terminal in the prior art, and fig. 5(i) is a schematic diagram of the target image displayed after the technical solution of the present invention is applied.
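As an illustration of what filling the RGB data according to the Alpha information enables on the display side, a minimal alpha-blending sketch that composites the recovered foreground over an arbitrary background image; the specific background removal, Alpha blending, or template processing the product performs is not prescribed here:

```python
import numpy as np

def composite_over_background(rgba: np.ndarray, background_rgb: np.ndarray) -> np.ndarray:
    """'Over' alpha blending of the recovered RGBA foreground onto a new background.

    Sketch only: rgba is an HxWx4 uint8 frame (e.g. from recover_rgba above) and
    background_rgb is an HxWx3 uint8 image chosen by the display side; Alpha 0
    means fully transparent and 255 means fully opaque, matching the Alpha
    definition used in this description.
    """
    fg = rgba[..., :3].astype(np.float32)
    alpha = rgba[..., 3:4].astype(np.float32) / 255.0
    bg = background_rgb.astype(np.float32)
    out = fg * alpha + bg * (1.0 - alpha)
    return np.clip(out, 0, 255).astype(np.uint8)
```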
The prior art cannot preserve the Alpha channel information, whereas this solution preserves it by appending and recombining the Alpha channel information, which facilitates subsequent processing after the image is decoded and played, such as background removal, Alpha blending, and template processing. Second, the Alpha of each frame does not need to be stored separately as a picture sequence or compressed into a separate video. Finally, the Alpha information of each frame can be obtained directly when that frame is decoded, without extra lookup or synchronization processing.
In the embodiment of the invention, a terminal acquires image data of a target object, where the image data comprises RGB format data and Alpha channel information; the terminal encodes the RGB format data to obtain YUV channel data; the terminal appends the Alpha channel information to the YUV channel data to obtain encoded data; and the terminal sends the encoded data to a server, the encoded data being used by the server to decode and display the image of the target object. The server receives the encoded data sent by the terminal, where the encoded data comprises YUV channel data and Alpha channel information; the server decodes the YUV channel data to obtain RGB format data; and the server displays an image of the target object according to the RGB format data and the Alpha information. Because the encoded data sent by the terminal includes not only the YUV channel data but also the Alpha channel information, the server can decode the YUV channel data to obtain RGB format data and then use the Alpha channel information to perform background removal, Alpha blending, template processing, and the like on the RGB format data, so that the background image of the target image can be adjusted, the foreground image of the target image blends better with the overall background shown on the display, and the user obtains a better sense of immersion.
The technical solution of the present invention can be further explained in the following practical application scenarios, as shown below:
as shown in fig. 6, which is an experience schematic diagram of the technical solution of the present invention, a basic experience flow of the technical solution of the present invention is as follows:
starting a game client → connecting a 3D motion sensing camera → calibrating a character → entering a game scene → storing Alpha channel data of a local camera and then transcoding and compressing the data → sending the data to a player playing the same game → decoding and transcoding the data → mixing and displaying the Alpha channel data.
For the product form shown in fig. 6, the following points should be noted:
Of the three characters displayed in fig. 6, the leftmost character is the local player's image and the rightmost character is a fellow player connected over the network. The network-side player and the anchor can be presented with complete Alpha information, just like the local player, so that the server can conveniently perform background removal and Alpha blending, and the character display has a sense of immersion. The local player's image in the figure is acquired directly from a camera, while the image of the network-side player is transmitted via the server and then decoded, transcoded, and displayed.
Fig. 7 is a timing diagram of an embodiment of the present invention, which should be understood in conjunction with the player images shown in fig. 6. Taking the leftmost character as an example: the camera of the client corresponding to that character captures the character's image data, and the client performs image processing on the RGB format data in that image data. Specifically, the client transcodes and encodes the RGB format data to obtain YUV channel data, appends the Alpha channel information to the YUV channel data, and sends the result to the game server. The game server receives the data, which includes the YUV channel data and the Alpha channel information, decodes the YUV channel data to obtain RGB format data, obtains the Alpha channel information, and fills the RGB format data according to the Alpha channel information to form the final image for display on the clients of the players in the same game.
The above description is made on the method for transmitting video information in the embodiment of the present application, and a terminal and a server in the embodiment of the present application are described below, as shown in fig. 8, which is a schematic diagram of an embodiment of a terminal in the embodiment of the present invention, and the method includes:
an obtaining module 801, configured to obtain image data of a target object, where the image data includes RGB format data and Alpha channel information;
the encoding module 802 is configured to perform encoding processing on RGB format data to obtain YUV channel data;
an adding module 803, configured to add Alpha channel information to the YUV channel data to obtain encoded data;
and a sending module 804, configured to send the encoded data to the server, where the encoded data is used for the server to decode and display the image of the target object.
Alternatively, in some embodiments of the present invention,
the adding module 803 is specifically configured to add the Alpha channel information to the upper end position of the YUV channel data, so that a background image displayed on the server is displayed at the upper end position of the foreground image, where the background image corresponds to the Alpha channel information and the foreground image corresponds to the YUV channel data.
Alternatively, in some embodiments of the present invention,
the adding module 803 is specifically configured to add the Alpha channel information to the lower end position of the YUV channel data, so that a background image displayed on the server is displayed at the lower end position of the foreground image, where the background image corresponds to the Alpha information and the foreground image corresponds to the YUV channel data.
Alternatively, in some embodiments of the present invention,
the adding module 803 is specifically configured to add the Alpha channel information to the left position of the YUV channel data, so that a background image displayed on the server is displayed at the left position of the foreground image, where the background image corresponds to the Alpha information, and the foreground image corresponds to the YUV channel data.
Alternatively, in some embodiments of the present invention,
the adding module 803 is specifically configured to add the Alpha channel information to the right position of the YUV channel data, so that a background image displayed on the server is displayed at the right position of the foreground image, where the background image corresponds to the Alpha information and the foreground image corresponds to the YUV channel data.
As shown in fig. 9, which is a schematic diagram of an embodiment of a server in an embodiment of the present invention, the schematic diagram includes:
a receiving module 901, configured to receive encoded data sent by a terminal, where the encoded data includes YUV channel data and Alpha channel information;
a decoding module 902, configured to decode the YUV channel data to obtain RGB format data;
and a display module 903 for displaying the image of the target object according to the RGB format data and Alpha information.
Alternatively, in some embodiments of the present invention,
a display module 903, specifically configured to fill the RGB format data according to the Alpha information, and obtain display data; an image of the target object is displayed according to the display data.
As shown in fig. 10, which is a schematic diagram of an embodiment of a terminal in an embodiment of the present invention, the schematic diagram includes:
For convenience of description, a mobile phone is taken as an example of the terminal, and only the parts related to the embodiment of the present invention are shown; for specific technical details that are not disclosed here, refer to the method part of the embodiments of the present invention.
Fig. 10 is a block diagram showing a partial structure of a cellular phone related to a terminal provided by an embodiment of the present invention. Referring to fig. 10, the cellular phone includes: radio Frequency (RF) circuit 1010, memory 1020, input unit 1030, display unit 1040, sensor 1050, audio circuit 1060, wireless fidelity (WiFi) module 1070, processor 1080, and power source 1090. Those skilled in the art will appreciate that the handset configuration shown in fig. 10 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 10:
RF circuit 1010 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, after downlink information from a base station is received, it is delivered to the processor 1080 for processing, and uplink data is sent to the base station. In general, RF circuit 1010 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 1010 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 1020 can be used for storing software programs and modules, and the processor 1080 executes various functional applications and data processing of the mobile phone by operating the software programs and modules stored in the memory 1020. The memory 1020 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1020 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 1030 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1030 may include a touch panel 1031 and other input devices 1032. The touch panel 1031, also referred to as a touch screen, may collect touch operations by a user (e.g., operations by a user on or near the touch panel 1031 using any suitable object or accessory such as a finger, a stylus, etc.) and drive corresponding connection devices according to a preset program. Optionally, the touch panel 1031 may include two parts, namely a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1080, and can receive and execute commands sent by the processor 1080. In addition, the touch panel 1031 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1030 may include other input devices 1032 in addition to the touch panel 1031. In particular, other input devices 1032 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, a joystick, or the like.
The display unit 1040 may be used to display information input by a user or information provided to the user and various menus of the cellular phone. The Display unit 1040 may include a Display panel 1041, and optionally, the Display panel 1041 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 1031 can cover the display panel 1041, and when the touch panel 1031 detects a touch operation on or near the touch panel 1031, the touch operation is transmitted to the processor 1080 to determine the type of the touch event, and then the processor 1080 provides a corresponding visual output on the display panel 1041 according to the type of the touch event. Although in fig. 10, the touch panel 1031 and the display panel 1041 are two separate components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1031 and the display panel 1041 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1050, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1041 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1041 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio circuitry 1060, speaker 1061, and microphone 1062 may provide an audio interface between the user and the handset. The audio circuit 1060 can transmit the electrical signal converted from the received audio data to the speaker 1061, where it is converted into a sound signal and output; on the other hand, the microphone 1062 converts the collected sound signal into an electrical signal, which is received by the audio circuit 1060 and converted into audio data; the audio data is then output to the processor 1080 for processing and subsequently sent to, for example, another mobile phone via the RF circuit 1010, or output to the memory 1020 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help the user to send and receive e-mail, browse web pages, access streaming media, etc. through the WiFi module 1070, which provides wireless broadband internet access for the user. Although fig. 10 shows the WiFi module 1070, it is understood that it does not belong to the essential constitution of the handset, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 1080 is a control center of the mobile phone, connects various parts of the whole mobile phone by using various interfaces and lines, and executes various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1020 and calling data stored in the memory 1020, thereby integrally monitoring the mobile phone. Optionally, processor 1080 may include one or more processing units; preferably, the processor 1080 may integrate an application processor, which handles primarily the operating system, user interfaces, applications, etc., and a modem processor, which handles primarily the wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 1080.
The handset also includes a power source 1090 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 1080 via a power management system to manage charging, discharging, and power consumption via the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In the embodiment of the present invention, the processor 1080 included in the terminal is configured to execute the steps executed by the encoding module 802 and the adding module 803 in the embodiment shown in fig. 8, specifically, perform encoding processing on RGB format data to obtain YUV channel data; adding Alpha channel information into YUV channel data to obtain coded data;
the RF circuit 1010 is configured to perform the steps performed by the obtaining module 801 and the sending module 804 in the embodiment shown in fig. 8, specifically, to obtain image data of the target object, where the image data includes RGB format data and Alpha channel information; and sending the coded data to the server, wherein the coded data is used for decoding and displaying the image of the target object by the server.
Alternatively, in some embodiments of the present invention,
the processor 1080 is specifically configured to add Alpha channel information to an upper end position of the YUV channel data, so that a background image displayed on the server is displayed at an upper end position of the foreground image, where the background image corresponds to the Alpha channel information and the foreground image corresponds to the YUV channel data.
Alternatively, in some embodiments of the present invention,
the processor 1080 is specifically configured to add the Alpha channel information to the lower end position of the YUV channel data, so that a background image displayed on the server is displayed at the lower end position of the foreground image, where the background image corresponds to the Alpha information and the foreground image corresponds to the YUV channel data.
Alternatively, in some embodiments of the present invention,
the processor 1080 is specifically configured to add the Alpha channel information to the left position of the YUV channel data, so that the background image displayed on the server is displayed at the left position of the foreground image, where the background image corresponds to the Alpha information, and the foreground image corresponds to the YUV channel data.
Alternatively, in some embodiments of the present invention,
the processor 1080 is specifically configured to add the Alpha channel information to the right position of the YUV channel data, so that a background image displayed on the server is displayed at the right position of the foreground image, where the background image corresponds to the Alpha information and the foreground image corresponds to the YUV channel data.
As shown in fig. 11, which is a schematic diagram of an embodiment of a server in an embodiment of the present invention, including:
the server 1100 may vary widely in configuration or performance and may include one or more Central Processing Units (CPUs) 1122 (e.g., one or more processors) and memory 1132, one or more storage media 1130 (e.g., one or more mass storage devices) storing applications 1142 or data 1144. Memory 1132 and storage media 1130 may be, among other things, transient storage or persistent storage. The program stored on the storage medium 1130 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Still further, the central processor 1122 may be provided in communication with the storage medium 1130 to execute a series of instruction operations in the storage medium 1130 on the server 1100.
The server 1100 may also include one or more power supplies 1126, one or more wired or wireless network interfaces 1150, one or more input-output interfaces 1158, and/or one or more operating systems 1141, such as Windows Server, Mac OS XTM, UnixTM, LinuxTM, FreeBSDTM, and so forth.
In the embodiment of the present invention, the central processor 1122 is configured to execute the steps executed by the decoding module 902 and the display module 903 in the embodiment shown in fig. 9, specifically, decode YUV channel data to obtain RGB format data; displaying an image of the target object according to the RGB format data and the Alpha information;
the input/output interface 1158 is configured to execute the steps executed by the receiving module 901 in the embodiment shown in fig. 9, specifically, to receive encoded data sent by a terminal, where the encoded data includes YUV channel data and Alpha channel information;
alternatively, in some embodiments of the invention,
the central processor 1122 is specifically configured to fill the RGB format data according to the Alpha information, and acquire display data; an image of the target object is displayed according to the display data.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention essentially, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A method for transmitting video data, comprising:
the method comprises the steps that a terminal obtains image data of a target object, wherein the image data comprises RGB format data and Alpha channel information;
the terminal carries out encoding processing on the RGB format data to obtain YUV channel data;
the terminal adds the Alpha channel information to the YUV channel data to obtain coded data;
the terminal sends the coded data to a server so that the server can extract the Alpha channel information from the coded data and decode the YUV channel data to obtain RGB format data, wherein the coded data is used for decoding and displaying the image of the target object by the server;
wherein, the terminal adds the Alpha channel information to the YUV channel data, including:
dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the upper end position of the Y channel data, the upper end position of the U channel data and the upper end position of the V channel data, so that a background image displayed on the server is displayed at the upper end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the lower end position of the Y channel data, the lower end position of the U channel data and the lower end position of the V channel data, so that a background image displayed on the server is displayed at the lower end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and respectively adding the parts to the left position of the Y channel data, the left position of the U channel data and the left position of the V channel data, so that the background image displayed on the server is displayed at the left position of the foreground image;
or dividing the Alpha channel information into 3 parts, and adding the 3 parts to the right position of the Y channel data, the right position of the U channel data and the right position of the V channel data respectively, so that the background image displayed on the server is displayed at the right position of the foreground image.
2. The method according to claim 1, wherein the background image corresponds to the Alpha channel information and the foreground image corresponds to the YUV channel data.
3. A method for transmitting video data, comprising:
the method comprises the steps that a server receives coded data sent by a terminal, wherein the coded data comprise YUV channel data and Alpha channel information; the coded data is obtained by adding the Alpha channel information to the YUV channel data by the terminal;
the server extracts the Alpha channel information from the coded data;
the server decodes the YUV channel data to obtain RGB format data;
the server displays an image of a target object on a display according to the RGB format data and the Alpha information;
wherein the terminal appending the Alpha channel information to the YUV channel data includes:
dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the upper end position of the Y channel data, the upper end position of the U channel data and the upper end position of the V channel data, so that a background image displayed on the server is displayed at the upper end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the lower end position of the Y channel data, the lower end position of the U channel data and the lower end position of the V channel data, so that a background image displayed on the server is displayed at the lower end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and adding the 3 parts to the left position of the Y channel data, the left position of the U channel data and the left position of the V channel data respectively, so that the background image displayed on the server is displayed at the left position of the foreground image;
or dividing the Alpha channel information into 3 parts, and adding the 3 parts to the right position of the Y channel data, the right position of the U channel data and the right position of the V channel data respectively, so that the background image displayed on the server is displayed at the right position of the foreground image.
4. The method according to claim 3, wherein the server displays the image of the target object on the display according to the RGB format data and the Alpha channel information, comprising:
the server fills the RGB format data according to the Alpha channel information to obtain display data (an illustrative sketch is given after this claim);
and the server displays the image of the target object on the display according to the display data.
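Correspondingly, the server-side steps of claims 3 and 4 can be sketched as below, again only as an illustration. It assumes the plane layout produced by the terminal-side sketch above, the BT.601 inverse conversion, and alpha blending over a flat background colour as the "filling" step; the function name decode_and_compose is a placeholder.

```python
# Minimal, non-normative sketch of the server side: strip the appended Alpha
# rows off the lower end of the Y, U and V planes, convert the YUV channel
# data back to RGB format data, then "fill" the RGB data with a background
# according to the Alpha channel information before display.
import numpy as np

def decode_and_compose(planes: dict, height: int,
                       background=(0, 0, 0)) -> np.ndarray:
    """planes: {"Y", "U", "V"} arrays with Alpha rows appended at the bottom;
    height: number of rows of the original (foreground) image."""
    # Extract the 3 Alpha parts from the lower end of each plane.
    y, a0 = planes["Y"][:height], planes["Y"][height:]
    u, a1 = planes["U"][:height], planes["U"][height:]
    v, a2 = planes["V"][:height], planes["V"][height:]
    alpha = np.vstack([a0, a1, a2]).astype(np.float32) / 255.0  # H x W in [0, 1]

    yf = y.astype(np.float32)
    uf = u.astype(np.float32) - 128.0
    vf = v.astype(np.float32) - 128.0
    # BT.601 full-range YUV -> RGB (inverse of the terminal-side conversion)
    r = yf + 1.402 * vf
    g = yf - 0.344 * uf - 0.714 * vf
    b = yf + 1.772 * uf
    rgb = np.clip(np.stack([r, g, b], axis=-1), 0, 255)

    # "Fill" the RGB format data according to the Alpha channel information:
    # alpha-blend the decoded foreground over the chosen background colour.
    bg = np.array(background, dtype=np.float32)
    display = alpha[..., None] * rgb + (1.0 - alpha[..., None]) * bg
    return display.astype(np.uint8)            # display data for the screen

# Usage with the planes produced by the terminal-side sketch above:
# img = decode_and_compose(enc, height=6, background=(0, 0, 255))
```

Because the three Alpha parts were split row-wise in order, stacking the stripped rows back together reconstructs the original Alpha plane, and the blend weights then determine how strongly the chosen background shows through around the foreground image.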
5. A terminal, comprising:
an acquisition module, configured to acquire image data of a target object, wherein the image data comprises RGB format data and Alpha channel information;
an encoding module, configured to encode the RGB format data to obtain YUV channel data;
an adding module, configured to add the Alpha channel information to the YUV channel data to obtain encoded data;
a sending module, configured to send the encoded data to a server, so that the server extracts the Alpha channel information from the encoded data and decodes the YUV channel data to obtain the RGB format data, wherein the encoded data is used by the server to decode and display an image of the target object;
the adding module is specifically configured to:
dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the upper end position of the Y channel data, the upper end position of the U channel data and the upper end position of the V channel data, so that a background image displayed on the server is displayed at the upper end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the lower end position of the Y channel data, the lower end position of the U channel data and the lower end position of the V channel data, so that a background image displayed on the server is displayed at the lower end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and adding the 3 parts to the left position of the Y channel data, the left position of the U channel data and the left position of the V channel data respectively, so that the background image displayed on the server is displayed at the left position of the foreground image;
or dividing the Alpha channel information into 3 parts, and adding the 3 parts to the right position of the Y channel data, the right position of the U channel data and the right position of the V channel data respectively, so that the background image displayed on the server is displayed at the right position of the foreground image.
6. The terminal according to claim 5, wherein the background image corresponds to the Alpha channel information, and the foreground image corresponds to the YUV channel data.
7. A server, comprising:
a receiving module, configured to receive encoded data sent by a terminal, wherein the encoded data comprises YUV channel data and Alpha channel information, and the encoded data is obtained by the terminal adding the Alpha channel information to the YUV channel data;
a decoding module, configured to extract the Alpha channel information from the encoded data and decode the YUV channel data to obtain RGB format data;
a display module, configured to display an image of a target object according to the RGB format data and the Alpha channel information;
wherein the terminal appending the Alpha channel information to the YUV channel data includes:
dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the upper end position of the Y channel data, the upper end position of the U channel data and the upper end position of the V channel data, so that a background image displayed on the server is displayed at the upper end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and respectively adding the 3 parts to the lower end position of the Y channel data, the lower end position of the U channel data and the lower end position of the V channel data, so that a background image displayed on the server is displayed at the lower end position of a foreground image;
or dividing the Alpha channel information into 3 parts, and adding the 3 parts to the left position of the Y channel data, the left position of the U channel data and the left position of the V channel data respectively, so that the background image displayed on the server is displayed at the left position of the foreground image;
or dividing the Alpha channel information into 3 parts, and adding the 3 parts to the right position of the Y channel data, the right position of the U channel data and the right position of the V channel data respectively, so that the background image displayed on the server is displayed at the right position of the foreground image.
8. The server according to claim 7, wherein:
the display module is specifically configured to fill the RGB format data according to the Alpha channel information to obtain display data, and to display the image of the target object according to the display data.
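The four placement variants ("upper end", "lower end", "left", "right") recited in claims 1, 3, 5 and 7 differ only in the axis along which the Alpha channel information is split and in whether each part is attached before or after its carrier plane. The helper below is an illustrative, non-normative sketch of that choice; the name append_alpha, the NumPy representation and the full-resolution plane layout are assumptions of this sketch.

```python
# Illustrative helper covering the four placement variants: the Alpha channel
# information is divided into 3 parts and one part is attached to each of the
# Y, U and V planes at the chosen side.
import numpy as np

def append_alpha(y, u, v, alpha, position="lower"):
    """Return new (Y, U, V) planes with one third of `alpha` attached to each.

    position: "upper" | "lower" | "left" | "right"
    For "left"/"right" the Alpha plane is split column-wise so that each part
    has the same number of rows as its carrier plane.
    """
    axis = 0 if position in ("upper", "lower") else 1
    parts = np.array_split(alpha, 3, axis=axis)
    stack = np.vstack if axis == 0 else np.hstack

    out = []
    for plane, part in zip((y, u, v), parts):
        if position in ("upper", "left"):
            out.append(stack([part, plane]))   # Alpha part before the plane
        else:
            out.append(stack([plane, part]))   # Alpha part after the plane
    return tuple(out)

# Usage: attach the Alpha parts to the right-hand side of 4x6 planes.
y = u = v = np.full((4, 6), 128, dtype=np.uint8)
alpha = np.full((4, 6), 255, dtype=np.uint8)
y2, u2, v2 = append_alpha(y, u, v, alpha, position="right")
print(y2.shape, u2.shape, v2.shape)  # each plane gains 2 columns here
```

In each case the Alpha parts are attached on the side that, per the claims, corresponds to where the background image is displayed relative to the foreground image.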
CN201710330725.0A 2017-05-11 2017-05-11 Method, terminal and server for transmitting video information Active CN108881920B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710330725.0A CN108881920B (en) 2017-05-11 2017-05-11 Method, terminal and server for transmitting video information
PCT/CN2018/085445 WO2018205878A1 (en) 2017-05-11 2018-05-03 Method for transmitting video information, terminal, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710330725.0A CN108881920B (en) 2017-05-11 2017-05-11 Method, terminal and server for transmitting video information

Publications (2)

Publication Number Publication Date
CN108881920A CN108881920A (en) 2018-11-23
CN108881920B true CN108881920B (en) 2022-06-10

Family

ID=64105231

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710330725.0A Active CN108881920B (en) 2017-05-11 2017-05-11 Method, terminal and server for transmitting video information

Country Status (2)

Country Link
CN (1) CN108881920B (en)
WO (1) WO2018205878A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021237569A1 (en) * 2020-05-28 2021-12-02 深圳市大疆创新科技有限公司 Encoding method, decoding method, apparatus and system
CN112402958B (en) * 2020-10-27 2022-05-13 腾讯科技(深圳)有限公司 Image processing method, device and storage medium
CN112601096B (en) * 2020-12-02 2023-11-14 深圳力维智联技术有限公司 Video decoding method, device, equipment and readable storage medium
CN114173157B (en) * 2021-12-10 2022-12-16 广州博冠信息科技有限公司 Video stream transmission method and device, electronic equipment and storage medium
CN114928730B (en) * 2022-06-23 2023-08-22 湖南国科微电子股份有限公司 Image processing method and image processing apparatus
CN115297312A (en) * 2022-08-04 2022-11-04 深圳市瑞云科技有限公司 Method for transmitting image transparency information
CN115834898B (en) * 2023-02-23 2023-05-12 成都索贝数码科技股份有限公司 Transmission method for carrying alpha channel value during HDMI transmission

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000165683A (en) * 1998-09-21 2000-06-16 Fuji Photo Film Co Ltd Image ciphering device, image ciphering method, recording medium in which image ciphering process is recorded and recording medium in which image file is recorded
CN101742317A (en) * 2009-12-31 2010-06-16 北京中科大洋科技发展股份有限公司 Video compressing and encoding method with alpha transparent channel
CN102289829A (en) * 2011-07-12 2011-12-21 北京朗玛数联科技有限公司 Method and device for storing and restoring image and system for processing image
US8655086B1 (en) * 2011-11-01 2014-02-18 Zynga, Inc. Image compression with alpha channel data
CN104365098A (en) * 2012-06-01 2015-02-18 阿尔卡特朗讯公司 Method and apparatus for encoding a video stream
WO2016192060A1 (en) * 2015-06-03 2016-12-08 Intel Corporation Low power video composition using a stream out buffer
CN106375759A (en) * 2016-08-31 2017-02-01 深圳超多维科技有限公司 Video image data coding method and device, and video image data decoding method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102036059A (en) * 2009-09-25 2011-04-27 腾讯科技(深圳)有限公司 Method, device and system for compressing and decompressing transparent images
US20150074735A1 (en) * 2013-09-06 2015-03-12 Seespace Ltd. Method and Apparatus for Rendering Video Content Including Secondary Digital Content
CN105306941B (en) * 2015-11-12 2019-05-24 成都图影视讯科技有限公司 A kind of method for video coding


Also Published As

Publication number Publication date
WO2018205878A1 (en) 2018-11-15
CN108881920A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
CN108881920B (en) Method, terminal and server for transmitting video information
CN109218731B (en) Screen projection method, device and system of mobile equipment
CN106412691B (en) Video image intercepting method and device
CN110636375B (en) Video stream processing method and device, terminal equipment and computer readable storage medium
TW201903713A (en) Image processing method, device, computer readable storage medium and mobile terminal
CN111544886A (en) Picture display method and related device
CN107731199B (en) Screen color temperature adjusting method, terminal and computer readable storage medium
CN107846554B (en) Image processing method, terminal and computer readable storage medium
CN110908765A (en) Interface display method and device, terminal equipment and storage medium
CN106844580B (en) Thumbnail generation method and device and mobile terminal
CN109168013B (en) Method, device and equipment for extracting frame and computer readable storage medium
CN106993136B (en) Mobile terminal and multi-camera-based image noise reduction method and device thereof
CN112312144B (en) Live broadcast method, device, equipment and storage medium
CN110917610A (en) Method for assisting game display effect, mobile terminal and computer-readable storage medium
CN108460769B (en) image processing method and terminal equipment
KR100630203B1 (en) Method for displaying of picture in wireless terminal
US10515613B2 (en) Method of processing images and apparatus
CN109474833B (en) Network live broadcast method, related device and system
CN111935509A (en) Multimedia data playing method, related device, equipment and storage medium
CN107729736B (en) Face recognition method and related product
CN107292833B (en) Image processing method and device and mobile terminal
KR100650251B1 (en) Handset having image processing function and method therefor
CN106375787B (en) Video playing method and device
US11109008B2 (en) Data transmission method, data loading device, data parsing device and data transmission system
CN113630621B (en) Video processing method, related device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant