CN111556317A - Coding method, device and coding and decoding system - Google Patents

Coding method, device and coding and decoding system Download PDF

Info

Publication number
CN111556317A
Authority
CN
China
Prior art keywords
color
macro block
coding
frame image
current frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010327178.2A
Other languages
Chinese (zh)
Inventor
张文强
范志刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Wanxiang Electronics Technology Co Ltd
Original Assignee
Xian Wanxiang Electronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Wanxiang Electronics Technology Co Ltd filed Critical Xian Wanxiang Electronics Technology Co Ltd
Priority to CN202010327178.2A priority Critical patent/CN111556317A/en
Publication of CN111556317A publication Critical patent/CN111556317A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/176Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component

Abstract

The present disclosure provides a coding method, a coding device and a coding and decoding system, which relate to the field of computer coding. The coding method comprises: acquiring a current frame image, wherein the current frame image is divided into a plurality of macro blocks, each macro block comprises M × N pixels, and M and N are positive integers; extracting the background color of the current frame image and generating background color information; judging whether the values of the pixel points in each macro block are consistent with the background color information; identifying a first type macro block according to the judgment result and a preset rule; and coding the first type macro block according to a first coding mode to obtain first coded data. The embodiments provided by the disclosure can adapt to the characteristics of desktop images and can effectively reduce bandwidth.

Description

Coding method, device and coding and decoding system
Technical Field
The present disclosure relates to the field of computer coding, and in particular, to a coding method, a coding apparatus, and a coding and decoding system.
Background
The cloud desktop technology, or desktop virtualization, refers to a technology in which a user remotely accesses a server, and the server provides the desktop image of the current server host to a remote client in a virtual manner. With this technology, the user's experience is the same as operating a local computer. The server captures the desktop image sequence by screen copy, compresses the images, and transmits them over the network to a cloud terminal, which decodes them to obtain the desktop image content.
Unlike natural images, desktop images have their own features. The change in gray scale and color of a natural image is generally continuous. Desktop images are complex in composition, usually containing text and graphics information, and often the variation of pixel values is discontinuous.
General image coding standards are designed around the strong spatial-domain and temporal-domain correlation of natural images, but such coding modes cannot match the characteristics of desktop images very well.
Disclosure of Invention
The embodiment of the disclosure provides an encoding method, an encoding device and an encoding and decoding system, which can adapt to the characteristics of desktop images and can effectively reduce bandwidth. The technical scheme is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided an encoding method, including:
acquiring a current frame image, wherein the current frame image is divided into a plurality of macro blocks, each macro block comprises M × N pixels, and M and N are positive integers;
extracting the background color of the current frame image and generating background color information;
judging whether the value of the pixel point in each macro block is consistent with the background color information;
identifying a first type macro block according to a judgment result and a preset rule;
and coding the first type macro block according to a first coding mode to obtain first coded data.
In one embodiment, extracting a background color of the current frame image and generating the background color information includes:
identifying all pure color macro blocks in the current frame image, wherein the pure color macro blocks comprise at least one type of pure color macro block;
counting the number of pure color macro blocks of each type, and determining the color of the one or more types with the largest count as the background color;
the background color information is generated based on the background color.
In one embodiment, identifying all solid color macroblocks in the current frame image comprises:
extracting an edge area of the current frame image;
all solid color macroblocks in the edge region are identified.
In one embodiment, identifying all solid color macroblocks in the current frame image comprises:
acquiring a current macro block;
scanning pixel points in the current macro block row by row or column by column;
judging whether the colors of all pixel points in the current macro block are completely the same;
if the colors of all the pixel points are completely the same, the current macro block is determined to be a pure color macro block, and the color of the pixel points is the color of the pure color macro block.
In one embodiment, the identifying the first type macro block according to the judgment result and the preset rule comprises: acquiring the color of each pure color macro block;
judging whether the color of each pure color macro block is the same as the background color;
and determining the pure color macro block with the same color as the background color as the first type macro block.
In one embodiment, encoding the first type of macroblock in the first encoding mode includes:
adding mark information to the first type macro block; the flag information is used to flag a background color that is the same as the color of the first type macro block.
In one embodiment, the method further comprises:
and coding the second type macro blocks except the first type macro blocks in the current frame image according to a second coding mode to obtain second coded data.
According to a second aspect of the embodiments of the present disclosure, there is provided an encoding apparatus, the apparatus including:
an obtaining module, configured to obtain a current frame image, where the current frame image is divided into multiple macroblocks, each macroblock includes M × N pixels, and M and N are positive integers;
the extraction module is used for extracting the background color of the current frame image and generating background color information;
the judging module is used for judging whether the value of the pixel point in each macro block is consistent with the background color information;
the identification module is used for identifying the first type macro block according to the judgment result and a preset rule;
and the first coding module is used for coding the first type macro block according to a first coding mode to obtain first coded data.
In one embodiment, the apparatus further comprises:
and the second coding module is used for coding the second type macro blocks except the first type macro blocks in the current frame image according to a second coding mode to obtain second coded data.
According to a third aspect of the embodiments of the present disclosure, there is provided a coding and decoding system, the system comprising the coding apparatus described in the second aspect and a decoding apparatus, wherein the decoding apparatus is configured to: receive first coded data from the first coding module and second coded data from the second coding module;
the first encoded data is decoded in accordance with a first decoding method corresponding to the first encoding method, and the second encoded data is decoded in accordance with a second decoding method corresponding to the second encoding method.
The embodiment provided by the disclosure can encode the desktop image according to the background color, and the encoding method can be well adapted to the characteristics of the desktop image and can effectively reduce the bandwidth.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic illustration of a real interface provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an application environment provided by an embodiment of the present disclosure;
fig. 3 is a flowchart of an encoding method provided by the embodiment of the present disclosure;
fig. 4 is a structural diagram of an encoding apparatus provided in an embodiment of the present disclosure;
fig. 5 is a structural diagram of a coding and decoding system according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Some portions of the following description are explicitly or implicitly presented in terms of algorithms and functional or symbolic representations of operations on data within a computer memory. These algorithmic descriptions and functional or symbolic representations are the means used by those skilled in the data processing arts to more effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. These steps are those requiring physical manipulations of physical quantities such as electrical, magnetic, or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated.
Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "encoding," "generating," "extracting," "sending," "obtaining," "identifying," or the like, refer to the action and processes of a computer system, or similar electronic device, that manipulates and transforms data represented as physical quantities within the computer system into other data similarly represented as physical quantities within the computer system or other information storage, transmission or display devices.
The specification also discloses apparatus for performing the method operations. Such apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer or other device selectively activated or reconfigured by a computer program stored in the computer. The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose machines may be used with programs in accordance with the teachings herein. Alternatively, more specific apparatus configurations for performing the required method steps may be suitable. The structure of a conventional general-purpose computer will be described in the following description.
Further, the present specification also implicitly discloses computer programs, as it will be apparent to the skilled person that the steps of the methods described herein can be implemented by computer code. The computer program is not intended to be limited to any particular programming language and implementation thereof. It will be appreciated that a variety of programming languages and code therefor may be used to implement the teachings of the disclosure contained herein. Further, the computer program is not intended to be limited to any particular control flow. There are many other kinds of computer programs that may use different control flows without departing from the spirit or scope of the present invention.
Also, one or more steps of a computer program may be executed in parallel rather than sequentially. Such a computer program may be stored on any computer readable medium. The computer readable medium may include a storage device such as a magnetic or optical disk, memory chip or other storage device suitable for interfacing with a general purpose computer, and the like. The computer readable medium may also include a hard-wired medium such as in an internet system, or a wireless medium. When the computer program is loaded into and executed by such a general-purpose computer, the computer program effectively creates an apparatus for implementing the steps of the preferred method.
The invention may also be implemented as hardware modules. More specifically, in a hardware sense, a module is a functional hardware unit designed for use with other components or modules. For example, a module may be implemented using discrete electronic components, or it may form part of an overall electronic circuit, such as an Application Specific Integrated Circuit (ASIC). Many other possibilities exist. Those skilled in the art will appreciate that the system may also be implemented as a combination of hardware and software modules.
Fig. 1 is a schematic view of a real interface provided by an embodiment of the present disclosure. The transmission of desktop images is mostly applied to office scenes, in which a user often operates various office software or browses web pages; for example, the user may operate a document file for a long time, and the generated desktop image then contains a large background area with a single color. Fig. 1 is a schematic view of the display interface when viewing a PDF document. Based on this feature of desktop images, the present disclosure provides a corresponding image coding scheme.
Fig. 2 is a schematic diagram of an application environment provided by an embodiment of the present disclosure. As shown in Fig. 2, a video signal is encoded at the encoding end and then transmitted to the decoding end through a network transmission channel. As can be understood by those skilled in the art, the encoding end is located at the server side, and the decoding end is located on the receiving device. In a cloud desktop scene, the receiving device can be a personal computer, a mobile phone and the like; in a desktop virtualization scene, the receiving device can be a zero terminal. The number of receiving devices may be one or more, which is not limited by the present disclosure.
The main technical steps of the encoding end comprise:
step 101, acquiring a current frame image;
in this step, a current frame image is acquired by an image acquisition device.
Step 102, dividing a current frame image into a plurality of macro blocks;
specifically, the current frame image is divided into a plurality of macroblocks according to a preset macroblock division mode, wherein the size of each macroblock is M × N.
Step 103, extracting the background color of the current frame image and generating background color information;
specifically, the extracting the background color of the current frame image includes:
s20, finding out pure color macro blocks in all macro blocks in the current frame image;
and S21, confirming one or more colors which are most appeared in all the pure color macro blocks as background colors.
Preferably, the background color of the current frame image may also be extracted by:
S30, identifying the edge area of the current frame image;
S31, finding out all pure color macro blocks from the edge area;
and S32, determining the one or more colors that appear most frequently among all the pure color macro blocks as the background colors.
Specifically, a pure color macro block can be identified as follows:
S40, scanning the pixel points in the current macro block row by row or column by column;
s41, recording the color of the first pixel point;
s42, if a point with a color different from that of the recorded first pixel point is scanned, ending the scanning of the current macro block, and confirming that the current macro block is a non-pure color macro block; and if all the pixel points are scanned, and no pixel point with the color different from that of the first pixel point is found, determining that the current macro block is a pure color macro block, and finishing scanning the current macro block.
The generation of the background color information refers to recording information of one or more currently determined background colors in background color information bits in the coded data packet header. The background color information may be represented by RGB color values of a background color.
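As an illustrative sketch only, the background color information bits could be laid out as a one-byte color count followed by three RGB bytes per background color; this byte layout is an assumption made here for illustration and is not specified by the method:

```python
import struct

def pack_background_header(background_colors):
    """Record the currently determined background color(s) in the header of the
    coded data packet. Assumed layout: 1-byte color count, then 3 RGB bytes per
    background color (each color given as an (r, g, b) tuple of 0-255 values)."""
    header = struct.pack("B", len(background_colors))
    for r, g, b in background_colors:
        header += struct.pack("BBB", r, g, b)
    return header
```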
For example, assume that all the color RGB values present in the pure color macro blocks include 0, 255, 108 and 160, where 0 appears 1500 times, 255 appears 52 times, 108 appears 25 times, and 160 appears 30 times; then the color with RGB value 0 can be determined as the background color. As another example, assume that 0 appears 1500 times, 255 appears 1420 times, 108 appears 50 times and 160 appears 42 times; then the two colors with RGB values 0 and 255 can both be determined as background colors.
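A minimal Python sketch of S40-S42 and of the background color selection described above, assuming each macro block is an M × N × 3 NumPy array of RGB values; the max_colors parameter (how many of the most frequent colors to keep) is an assumption left open by the method, which only requires that the one or more most frequent colors be chosen:

```python
import numpy as np
from collections import Counter

def is_solid_color(block):
    """S40-S42: record the first pixel point's color and stop at the first
    mismatch. Returns (True, color) for a pure color macro block."""
    first = block[0, 0]                        # S41: color of the first pixel point
    for row in block:                          # S40: scan row by row
        for pixel in row:
            if not np.array_equal(pixel, first):
                return False, None             # S42: non-pure-color macro block
    return True, tuple(int(c) for c in first)

def extract_background_colors(blocks, max_colors=1):
    """Count the colors of all pure color macro blocks (blocks is an iterable of
    M x N x 3 arrays) and keep the most frequent one(s) as background colors."""
    counts = Counter()
    for block in blocks:
        solid, color = is_solid_color(block)
        if solid:
            counts[color] += 1
    return [color for color, _ in counts.most_common(max_colors)]
```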
Step 104, identifying all the macro blocks with the pixel points being background colors, and determining the identified macro blocks as target macro blocks;
in this step, each macroblock needs to be detected one by one.
Specifically, the detection method for each macro block is as follows:
S50, scanning each pixel point row by row or column by column;
S51, judging whether the pixel value of the scanned pixel point is the same as the background color;
S52, if a pixel point whose color differs from the background color appears, ending the scanning of the current macro block and confirming that the current macro block is not a target macro block; and if all the pixel points have been scanned and all have the background color, determining that the current macro block is a target macro block.
Step 105, coding all target macro blocks according to the background color;
specifically, the encoding all the target macroblocks according to the background color includes:
and taking the identified macro block as a target macro block, and adding mark information to the target macro block.
Specifically, the marking information is used for marking that the color of all pixel points in the current target macro block is the same as the background color.
And if a plurality of background colors exist, the marking information is used for marking that the color of all pixel points in the current target macro block is the same as the specific background color.
Further, the method also comprises the following steps:
step 106, coding other macro blocks except the target macro block according to the existing mode;
specifically, the existing method includes any method capable of encoding a single macroblock, such as JPEG.
And step 107, sending the coded current frame image to a decoding end.
The main technical steps of the decoding end comprise:
step 201, receiving a coded image sent by a coding end;
step 202, decoding the received coded image;
specifically, the decoding the received encoded image includes:
and respectively decoding the coded data of the target macro block and other macro blocks.
Specifically, the decoding the target macroblock includes:
s70, extracting background color data from the current frame data;
s71, determining the background color corresponding to the corresponding target macro block according to the mark information corresponding to each target macro block;
and S72, generating each pixel point in the target macro block according to the corresponding background color, wherein the color of each pixel point is the corresponding background color.
The decoding the encoded data of the other macroblocks includes:
and decoding the other macroblocks by adopting a decoding algorithm corresponding to the coding algorithm of the other macroblocks.
For example, if the other macroblocks are encoded by JPEG, each of them is decoded according to the JPEG decoding algorithm.
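On the decoding side, S70-S72 reduce to filling each target macro block with the background color referenced by its mark information. A sketch under the same assumptions as above; decode_second stands in for whatever decoder matches the second coding mode (e.g. a JPEG decoder), and reassembling the macro blocks into the frame raster is omitted:

```python
def decode_first_type(mark, background_colors, m=16, n=16):
    """S70-S72: rebuild a target macro block by filling every pixel point with
    the referenced background color."""
    color = background_colors[mark["color_index"]]
    return np.full((m, n, 3), color, dtype=np.uint8)

def decode_frame(header, first_coded, second_coded, decode_second, num_blocks, m=16, n=16):
    """Decode both kinds of coded data back into a list of macro blocks."""
    background_colors = header["background_colors"]
    blocks = [None] * num_blocks
    for mark in first_coded:
        blocks[mark["block"]] = decode_first_type(mark, background_colors, m, n)
    for item in second_coded:
        blocks[item["block"]] = decode_second(item)
    return blocks
```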
The embodiment of the disclosure provides an encoding method, an encoding device and an encoding and decoding system, which can adapt to the characteristics of desktop images and can effectively reduce bandwidth.
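For completeness, the encoding steps 101-107 can be put together in one illustrative pass, reusing the helper sketches above; the 16 × 16 block size, the edge padding, and the raw-byte stand-in for the second coding mode are assumptions made only to keep the sketch self-contained and runnable:

```python
def split_into_macroblocks(frame, m=16, n=16):
    """Step 102: split an H x W x 3 frame into M x N macro blocks in raster
    order, padding the frame edges so its size is a multiple of the block size."""
    h, w, _ = frame.shape
    padded = np.pad(frame, ((0, (-h) % m), (0, (-w) % n), (0, 0)), mode="edge")
    return [padded[r:r + m, c:c + n]
            for r in range(0, padded.shape[0], m)
            for c in range(0, padded.shape[1], n)]

def encode_frame(frame, m=16, n=16):
    """Steps 101-106 in sequence: divide, extract background colors, mark the
    first type macro blocks, and hand every other macro block to a second
    coding mode (kept as raw bytes here only so the sketch stays runnable)."""
    blocks = split_into_macroblocks(frame, m, n)
    background_colors = extract_background_colors(blocks)
    first_coded, second_coded = [], []
    for idx, block in enumerate(blocks):
        matched = match_background(block, background_colors)
        if matched is not None:
            first_coded.append(encode_first_type(idx, matched, background_colors))
        else:
            second_coded.append({"block": idx, "data": block.tobytes()})
    header = {"background_colors": background_colors}  # background color information
    return header, first_coded, second_coded
```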
Fig. 3 is an encoding method provided in an embodiment of the present disclosure, where the encoding method shown in fig. 3 includes the following steps:
step 301, obtaining a current frame image, where the current frame image is divided into a plurality of macroblocks, each macroblock includes M × N pixels, and M and N are positive integers;
step 302, extracting the background color of the current frame image and generating background color information;
optionally, step 302 includes:
step 3021, identifying all pure color macro blocks in the current frame image, wherein the pure color macro blocks comprise at least one type of pure color macro block;
step 3022, counting the number of pure color macro blocks of each type, and determining the color of the one or more types with the largest count as the background color;
and step 3023, generating background color information based on the background color.
Optionally, step 3021 specifically includes:
step 30211, extracting an edge area of the current frame image;
step 30212, identify all solid color macroblocks in the edge area.
Optionally, step 3021 specifically includes:
step 3021a, acquiring a current macro block;
step 3021b, scanning the pixel points in the current macro block row by row or column by column;
step 3021c, judging whether the colors of all the pixel points in the current macro block are completely the same;
and step 3021d, if the colors of all the pixel points are completely the same, determining that the current macro block is a pure color macro block, and the color of the pixel points is the color of the pure color macro block.
Step 303, judging whether the value of the pixel point in each macro block is consistent with the background color information;
step 304, identifying a first type macro block according to the judgment result and a preset rule;
optionally, step 304 includes:
step 3041, obtaining the color of each pure color macro block;
step 3042, determining whether the color of each pure color macro block is the same as the background color;
step 3043, determine the pure color macro block with the same color as the background color as the first type macro block.
Step 305, the first type macro block is coded according to a first coding mode to obtain first coded data.
Specifically, marking information is added to the first type macro block; the flag information is used to flag a background color that is the same as the color of the first type macro block.
In one embodiment, the method further comprises:
and step 306, coding the second type macro block except the first type macro block in the current frame image according to a second coding mode to obtain second coded data.
Fig. 4 is an encoding apparatus provided in an embodiment of the present disclosure, where the encoding apparatus 40 shown in fig. 4 includes:
an obtaining module 401, configured to obtain a current frame image, where the current frame image is divided into a plurality of macroblocks, each macroblock includes M × N pixels, and M and N are positive integers;
an extracting module 402, configured to extract a background color of the current frame image and generate background color information;
optionally, the extracting module 402 includes:
the identifying sub-module 4021 is configured to identify all pure color macro blocks in the current frame image, where the pure color macro blocks include at least one type of pure color macro block;
the counting submodule 4022 is configured to count the number of pure color macro blocks of each type, and determine the color of the one or more types with the largest count as the background color;
the generating sub-module 4023 is configured to generate background color information based on the background color.
Optionally, the identifier module 4021 specifically includes:
an extracting unit 40211, configured to extract an edge region of the current frame image;
the identifying unit 40212 is configured to identify all the pure color macroblocks in the edge region.
Optionally, the identifier module 4021 specifically includes:
an obtaining unit 4021a, configured to obtain a current macroblock;
the scanning unit 4021b is configured to scan pixel points in a current macroblock line by line or column by column;
the determining unit 4021c is configured to determine whether colors of all pixel points in the current macroblock are completely the same;
the determining unit 4021d is configured to determine, if the colors of all the pixel points are completely the same, that the current macroblock is a pure color macroblock and that the color of the pixel points is the color of the pure color macroblock.
A judging module 403, configured to judge whether a value of a pixel in each macroblock is consistent with background color information;
an identifying module 404, configured to identify a first type macro block according to the determination result and a preset rule;
optionally, the identification module 404 includes:
an obtaining sub-module 4041, configured to obtain a color of each pure color macroblock;
the determining submodule 4042 is configured to determine whether the color of each pure-color macroblock is the same as the background color;
the determining sub-module 4043 is configured to determine the pure color macroblock with the same color as the background color as the first type macroblock.
The first encoding module 405 is configured to encode the first type macroblock according to a first encoding method to obtain first encoded data.
Specifically, marking information is added to the first type macro block; the flag information is used to flag a background color that is the same as the color of the first type macro block.
In one embodiment, the apparatus further comprises:
the second encoding module 406 encodes the second type macro block except the first type macro block in the current frame image according to a second encoding mode to obtain second encoded data.
Fig. 5 is a coding/decoding system according to an embodiment of the disclosure, where the coding/decoding system 50 shown in fig. 5 includes: the encoding apparatus 501 and the decoding apparatus 502 in the above embodiments, the decoding apparatus 502 is configured to:
receiving first coded data of a first coding module and second coded data of a second coding module;
the first encoded data is decoded in accordance with a first decoding method corresponding to the first encoding method, and the second encoded data is decoded in accordance with a second decoding method corresponding to the second encoding method.
Based on the encoding method described in the embodiment corresponding to Fig. 3, an embodiment of the present disclosure further provides a computer-readable storage medium; for example, the non-transitory computer-readable storage medium may be a Read Only Memory (ROM), a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. The storage medium stores computer instructions for executing the encoding method described in the embodiment corresponding to Fig. 3, which is not described herein again.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. An image encoding method, characterized in that the method comprises:
acquiring a current frame image, wherein the current frame image is divided into a plurality of macro blocks, each macro block comprises M × N pixels, and M and N are positive integers;
extracting the background color of the current frame image and generating background color information;
judging whether the value of the pixel point in each macro block is consistent with the background color information;
identifying a first type macro block according to a judgment result and a preset rule;
and coding the first type macro block according to a first coding mode to obtain first coded data.
2. The method of claim 1, wherein the extracting the background color of the current frame image and generating the background color information comprises:
identifying all pure color macro blocks in the current frame image, wherein the pure color macro blocks comprise at least one type of pure color macro block;
counting the number of pure color macro blocks of each type, and determining the color of the one or more types with the largest count as the background color;
generating background color information based on the background color.
3. The method of claim 2, wherein the identifying all solid color macroblocks in the current frame image comprises:
extracting an edge area of the current frame image;
all solid color macroblocks in the edge region are identified.
4. The method of claim 2, wherein the identifying all solid color macroblocks in the current frame image comprises:
acquiring a current macro block;
scanning pixel points in the current macro block row by row or column by column;
judging whether the colors of all pixel points in the current macro block are completely the same;
and if the colors of all the pixel points are completely the same, determining that the current macro block is a pure color macro block, and the color of the pixel points is the color of the pure color macro block.
5. The method of claim 1, wherein the identifying the first type of macro block according to the judgment result and the preset rule comprises: acquiring the color of each pure color macro block;
judging whether the color of each pure color macro block is the same as the background color;
and determining the pure color macro block with the same color as the background color as the first type macro block.
6. The method according to any of claims 1-5, wherein said encoding said first type of macroblock in a first encoding mode comprises:
adding mark information to the first type macro block; the flag information is used to flag a background color that is the same as the color of the first type macroblock.
7. The method of claim 1, further comprising:
and coding the second type macro blocks except the first type macro blocks in the current frame image according to a second coding mode to obtain second coded data.
8. An encoding apparatus, characterized in that the apparatus comprises:
an obtaining module, configured to obtain a current frame image, where the current frame image is divided into multiple macroblocks, each macroblock includes M × N pixels, and M and N are positive integers;
the extraction module is used for extracting the background color of the current frame image and generating background color information;
the judging module is used for judging whether the value of the pixel point in each macro block is consistent with the background color information;
the identification module is used for identifying the first type macro block according to the judgment result and a preset rule;
and the first coding module is used for coding the first type macro block according to a first coding mode to obtain first coded data.
9. The apparatus of claim 8, further comprising:
and the second coding module is used for coding the second type macro blocks except the first type macro blocks in the current frame image according to a second coding mode to obtain second coded data.
10. A coding/decoding system, comprising the encoding apparatus of claim 8 or 9 and a decoding apparatus, wherein the decoding apparatus is configured to: receive first coded data from the first coding module and second coded data from the second coding module;
the first encoded data is decoded in accordance with a first decoding method corresponding to the first encoding method, and the second encoded data is decoded in accordance with a second decoding method corresponding to the second encoding method.
CN202010327178.2A 2020-04-23 2020-04-23 Coding method, device and coding and decoding system Pending CN111556317A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010327178.2A CN111556317A (en) 2020-04-23 2020-04-23 Coding method, device and coding and decoding system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010327178.2A CN111556317A (en) 2020-04-23 2020-04-23 Coding method, device and coding and decoding system

Publications (1)

Publication Number Publication Date
CN111556317A true CN111556317A (en) 2020-08-18

Family

ID=72003916

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010327178.2A Pending CN111556317A (en) 2020-04-23 2020-04-23 Coding method, device and coding and decoding system

Country Status (1)

Country Link
CN (1) CN111556317A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013062690A (en) * 2011-09-13 2013-04-04 Fujitsu Ltd Image compression method, image compression device and system
CN106780306A (en) * 2016-12-09 2017-05-31 腾讯音乐娱乐(深圳)有限公司 One kind reconstruct original text generation method and device
CN107483934A (en) * 2017-08-17 2017-12-15 西安万像电子科技有限公司 Decoding method, device and system
CN110446041A (en) * 2018-05-02 2019-11-12 中兴通讯股份有限公司 A kind of video coding-decoding method, device, system and storage medium
CN110545432A (en) * 2018-05-28 2019-12-06 深信服科技股份有限公司 image encoding and decoding methods, related devices and storage medium
CN110545417A (en) * 2018-05-28 2019-12-06 深信服科技股份有限公司 image coding and decoding method for desktop scene and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination