CN112351282A - Image data transmission method and device, nonvolatile storage medium and processor
- Publication number
- CN112351282A (application number CN202011173524.2A)
- Authority
- CN
- China
- Prior art keywords
- image data
- transmitted
- layers
- original image
- code stream
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
(all under H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals, within H04N—Pictorial communication, e.g. television)
- H04N19/176—adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
- H04N19/124—Quantisation
- H04N19/167—Position within a video image, e.g. region of interest [ROI]
- H04N19/625—transform coding using discrete cosine transform [DCT]
- H04N19/91—Entropy coding, e.g. variable length coding [VLC] or arithmetic coding
Abstract
The invention discloses an image data transmission method and device, a nonvolatile storage medium and a processor. The method comprises: acquiring original image data of a current frame, wherein the original image data comprises a plurality of macroblocks; acquiring a position to be focused in the original image data; determining the number of layers to be transmitted of each of the macroblocks based on the position to be focused; and generating a code stream to be transmitted according to the number of layers to be transmitted of each macroblock, and transmitting the code stream to a decoding end. The invention solves the technical problem that the definition of the user's region of interest in the image data cannot be ensured when transmitting image data with the progressive encoding and decoding scheme in the related art.
Description
Technical Field
The present invention relates to the field of image processing, and in particular, to a method and an apparatus for transmitting image data, a non-volatile storage medium, and a processor.
Background
In the related art, the progressive encoding and decoding method for image data (especially computer-synthesized images) layers the macroblocks of the full frame during encoding and determines, according to the current bandwidth condition and the change condition of the macroblocks in the current frame, from which layer each macroblock starts transmission and how many layers are transmitted in total.
However, the above processing method can only ensure that the picture does not freeze under low-bandwidth conditions, which is acceptable for watching videos with low definition requirements, because blurred regions appear randomly in the picture during the progression from blurry to clear. For scenes in which the key information the user needs to view occupies only a small display area (such as office scenes), if the user's region of interest is blurred along with the full frame, the key information in that region cannot be clearly identified, which greatly affects the user experience.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a transmission method and device of image data, a nonvolatile storage medium and a processor, which are used for at least solving the technical problem that the definition of a user interest area in the image data cannot be ensured in the process of transmitting the image data by a progressive coding and decoding scheme in the related technology.
According to an aspect of an embodiment of the present invention, there is provided a transmission method of image data, including: acquiring original image data of a current frame, wherein the original image data comprises: a plurality of macroblocks; acquiring a position to be focused in the original image data; determining the number of layers to be transmitted of each macro block in the plurality of macro blocks based on the position to be concerned; and generating a code stream to be transmitted according to the number of layers to be transmitted of each macro block in the plurality of macro blocks, and transmitting the code stream to be transmitted to a decoding end.
Optionally, the obtaining of the position to be focused in the original image data includes: detecting whether a mouse cursor and/or a keyboard cursor continuously moves in a continuously played multi-frame historical image; when the mouse cursor continuously moves and the keyboard cursor does not move, determining the position to be focused on based on the moving range of the mouse cursor; and when the mouse cursor is not moved and the keyboard cursor is continuously moved, determining the position to be focused on based on the moving range of the keyboard cursor.
Optionally, determining the number of layers to be transmitted of each macroblock of the plurality of macroblocks based on the position to be focused includes: acquiring a first distance value between each macro block in the plurality of macro blocks and the macro block where the position to be noted is located; dividing the plurality of macroblocks into a plurality of regions according to the first distance values; and determining the number of layers to be transmitted of each macro block based on the area where each macro block is located.
Optionally, determining the number of layers to be transmitted of each macro block based on the area where each macro block is located includes: determining a second distance value between the area where each macro block is located and the position to be noted; allocating a corresponding bandwidth to be used for each area according to the size of the second distance value; and determining the number of layers to be transmitted of each macro block based on the bandwidth to be used.
Optionally, before acquiring the position to be focused in the original image data, the method further includes: comparing the original image data with reference frame image data to determine changed macro blocks; transforming and quantizing the changed macro blocks to obtain processing results; and storing the processing result to a data buffer area.
Optionally, the generating a code stream to be transmitted according to the number of layers to be transmitted of each macro block in the plurality of macro blocks includes: acquiring data to be transmitted from the data buffer according to the number of layers to be transmitted of each macro block in the plurality of macro blocks; and coding the data to be transmitted to generate the code stream to be transmitted.
According to another aspect of the embodiments of the present invention, there is also provided an image data transmission method, including: receiving a code stream to be reconstructed from a coding end, wherein the code stream to be reconstructed is generated based on original image data of a current frame acquired by the coding end, the original image data comprises a plurality of macro blocks, and the number of layers to be transmitted of each macro block in the macro blocks is determined based on a position to be concerned in the original image data; and reconstructing the code stream to be reconstructed to generate image data to be played.
According to another aspect of the embodiments of the present invention, there is also provided an image data transmission apparatus including: an acquisition module, configured to acquire original image data of a current frame, where the original image data includes: a plurality of macroblocks; the acquisition module is used for acquiring a position to be focused in the original image data; a determining module, configured to determine, based on the to-be-focused position, a number of layers to be transmitted of each macroblock in the multiple macroblocks; and the processing module is used for generating a code stream to be transmitted according to the number of layers to be transmitted of each macro block in the plurality of macro blocks and transmitting the code stream to be transmitted to a decoding end.
According to another aspect of the embodiments of the present invention, there is also provided an image data transmission apparatus including: the device comprises a receiving module, a judging module and a processing module, wherein the receiving module is used for receiving a code stream to be reconstructed from a coding end, the code stream to be reconstructed is generated based on original image data of a current frame acquired by the coding end, the original image data comprises a plurality of macro blocks, and the number of layers to be transmitted of each macro block in the macro blocks is determined based on a position to be concerned in the original image data; and the processing module is used for reconstructing the code stream to be reconstructed and generating image data to be played.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor and to execute any one of the above-mentioned image data transmission methods.
According to another aspect of the embodiments of the present invention, there is also provided a processor for executing a program, wherein the program is configured to execute any one of the above image data transmission methods when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above-mentioned image data transmission methods.
In the embodiment of the present invention, original image data of a current frame is acquired, where the original image data includes a plurality of macroblocks; a position to be focused in the original image data is acquired; the number of layers to be transmitted of each of the macroblocks is determined based on the position to be focused; a code stream to be transmitted is generated according to the number of layers to be transmitted of each macroblock, and the code stream is transmitted to a decoding end. This achieves the purpose of ensuring the definition of the user's region of interest when transmitting image data in a progressive encoding and decoding scenario, realizes the technical effect of improving the processing efficiency of progressive encoding and decoding of image data, and solves the technical problem that the definition of the user's region of interest in the image data cannot be ensured when transmitting image data with the progressive encoding and decoding scheme in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a flowchart of a transmission method of image data according to an embodiment of the present invention;
FIG. 2 is a block diagram illustrating components of an alternative codec system according to an embodiment of the present application;
FIG. 3 is a flow chart of an alternative method of image data transmission according to an embodiment of the present invention;
fig. 4 is a flowchart of another image data transmission method according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an image data transmission apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of another image data transmission apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, an embodiment of a method for transmitting image data is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different from that described herein.
Fig. 1 is a flowchart of a transmission method of image data according to an embodiment of the present invention, as shown in fig. 1, the method including the steps of:
step S102, acquiring original image data of a current frame, wherein the original image data includes: a plurality of macroblocks;
step S104, acquiring a position to be focused in the original image data;
step S106, determining the number of layers to be transmitted of each macro block in the plurality of macro blocks based on the position to be noted;
step S108, generating a code stream to be transmitted according to the number of layers to be transmitted of each macro block in the plurality of macro blocks, and transmitting the code stream to be transmitted to a decoding end.
In the embodiment of the present invention, original image data of a current frame is acquired, where the original image data includes a plurality of macroblocks; a position to be focused in the original image data is acquired; the number of layers to be transmitted of each of the macroblocks is determined based on the position to be focused; a code stream to be transmitted is generated according to the number of layers to be transmitted of each macroblock, and the code stream is transmitted to a decoding end. This achieves the purpose of ensuring the definition of the user's region of interest when transmitting image data in a progressive encoding and decoding scenario, realizes the technical effect of improving the processing efficiency of progressive encoding and decoding of image data, and solves the technical problem that the definition of the user's region of interest in the image data cannot be ensured when transmitting image data with the progressive encoding and decoding scheme in the related art.
Optionally, the original image data may be video image data, and the original image data includes: a plurality of macroblocks.
On the basis of the current progressive layered-transmission scheme, the concept of a user interest area is introduced: the position to be focused in the original image data is acquired, and the number of layers to be transmitted of each of the macroblocks is determined based on the position to be focused. Under the same bandwidth limitation, the layer data of macroblocks with large weight at the position to be focused is transmitted preferentially, and the macroblocks at the position to be focused are transmitted as completely as possible; the regions are prioritized according to their distance from the position to be focused: the closer a macroblock is to the position to be focused, the higher its priority and the larger the bandwidth allocated to it, while the farther it is from the position to be focused, the lower its priority and the less bandwidth is allocated.
In the embodiment of the application, a code stream to be transmitted is generated according to the number of layers to be transmitted of each macroblock, and the code stream is transmitted to the decoding end according to the bandwidth condition. This retains the advantages of the existing scheme while improving the definition of the user's region of interest when bandwidth is limited, and is particularly suitable for office scenes with high definition requirements for the attention area.
As an alternative embodiment, the original image data (e.g., a video picture) captured before encoding can be taken as an uncoded, lossless reference picture for comparison with frames in the blur-to-clear progression; in the reference picture, a small control window is shown and its time stamp can be clearly distinguished. Since most of the full frame changes, the degree of blurring during the progression is substantially uniform across the full frame; a person's face displayed in the picture, for example, still appears acceptably clear, which is acceptable if the current user is simply watching the video online. However, when the original image data contains display information that is small but critical (e.g., a control tool button or time display information), this critical information cannot be clearly displayed; for example, the time stamp cannot be clearly shown. If the current user needs to perform operations such as image acquisition and image clipping according to the time stamp, the user's definition requirement for the attention area can hardly be met in this situation.
Through the embodiment of the application, the position to be focused in the original image data is obtained. For example, when the user's mouse is detected to be located in a certain control area of a frame of original image data, a preset-radius area centered on the mouse is determined to be the area the user is most interested in, i.e., the position to be focused, and the transmission data layers of the position to be focused have the highest priority, so that the time stamp and the buttons around the position to be focused are displayed clearly, while areas farther from the position to be focused are displayed in a blurred manner.
It should be noted that many similar situations occur in real office scenes. Since the user's focus of attention during normal work is usually at the cursor position, the scheme of the present application takes the position of the mouse or keyboard cursor as the center of the user's region of interest, which matches the actual situation of office scenes. Most operating systems support querying the cursor position, so the computing resources otherwise consumed by image recognition to locate the cursor can be saved.
In an alternative embodiment, acquiring the position to be focused in the raw image data includes:
step S202, detecting whether a mouse cursor and/or a keyboard cursor continuously moves in a continuously played multi-frame historical image;
step S204, when the mouse cursor continuously moves and the keyboard cursor does not move, determining the position to be focused on based on the moving range of the mouse cursor; and when the mouse cursor is not moved and the keyboard cursor is continuously moved, determining the position to be focused on based on the moving range of the keyboard cursor.
Optionally, if both the mouse cursor and the keyboard cursor have moved, the position to be focused is determined according to the cursor that moved last; alternatively, it can be preset that the position to be focused is determined according to the movement of the mouse cursor, or according to the movement of the keyboard cursor.
Optionally, if neither the mouse cursor nor the keyboard cursor is moved, it indicates that the position to be focused does not need to be determined currently.
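For concreteness, the following is a minimal Python sketch of this cursor-based selection logic. The CursorSample type, the choice of preferring the mouse when both cursors moved, and all other names are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class CursorSample:
    mouse: Tuple[int, int]     # mouse cursor (x, y) observed in one frame
    keyboard: Tuple[int, int]  # keyboard/typing cursor (x, y) in the same frame

def position_to_focus(history: List[CursorSample]) -> Optional[Tuple[int, int]]:
    """Pick the position to be focused from the last T frames of cursor samples."""
    if len(history) < 2:
        return None
    mouse_moved = any(a.mouse != b.mouse for a, b in zip(history, history[1:]))
    kbd_moved = any(a.keyboard != b.keyboard for a, b in zip(history, history[1:]))
    if mouse_moved and not kbd_moved:
        return history[-1].mouse      # mouse moved, keyboard still
    if kbd_moved and not mouse_moved:
        return history[-1].keyboard   # keyboard moved, mouse still
    if mouse_moved and kbd_moved:
        return history[-1].mouse      # assumption: prefer the mouse when both moved
    return None                       # neither moved: no focus position needed
```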
As an alternative embodiment, the implementation of progressive encoding is described first. During encoding, each macroblock is actually encoded separately, and the encoding of each macroblock can be divided into a transform-and-quantization stage and an entropy-encoding stage; in the present embodiment, the transform-and-quantization stage is implemented by the DCT transform and quantization module 403, and the entropy-encoding stage is implemented by the entropy encoding module 409.
A macroblock has a size of 16 × 16 pixels, i.e., 256 pixels in total, and is divided into four 8 × 8 sub-blocks before being transformed and quantized. After transformation and quantization, the macroblock data is divided into 16 layers (groups): layer 1 contains the DC component, and the closer a layer is to layer 16, the more picture detail its data represents. Most of the higher-layer data becomes 0 after quantization, and these zeros produce an obvious compression effect in the final encoding, which is the purpose of quantization. "Progressive" means that when a macroblock changes, its transmission starts again from the lowest layer.
If a macroblock is the same as in the reference frame, it is first determined how many layers were transmitted when the reference frame was encoded. If all layers were transmitted, the macroblock does not need to be processed in this frame; if only M layers were transmitted due to bandwidth limitation when the reference frame was encoded, with N layers after them not yet transmitted, only those remaining N layers need to be entropy-encoded and transmitted for this frame. The effect on decoding is that the previous frame is not fully clear, because only the first M layers of data were received, while the next frame becomes clearer, because the N layers are received on top of the M layers; this realizes the progressive effect.
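The sketch below illustrates this layering and the progressive selection of the remaining layers. The patent only states that a transformed and quantized macroblock is split into 16 layers with the DC component in the first layer; the zigzag grouping of four coefficients per sub-block per layer used here is an assumption for illustration.

```python
def zigzag_indices(n: int = 8):
    """Standard 8x8 zigzag scan order, lowest to highest frequency."""
    return sorted(((r, c) for r in range(n) for c in range(n)),
                  key=lambda rc: (rc[0] + rc[1],
                                  rc[0] if (rc[0] + rc[1]) % 2 else rc[1]))

def macroblock_layers(quantized_subblocks):
    """Split a macroblock's four quantized 8x8 sub-blocks into 16 layers:
    layer 1 holds the lowest-frequency coefficients (including DC) of every
    sub-block, layer 16 the highest-frequency ones."""
    order = zigzag_indices()
    layers = []
    for k in range(16):                        # 16 layers per macroblock
        layer = []
        for block in quantized_subblocks:      # 4 sub-blocks, array-like 8x8
            layer.append([int(block[r][c]) for r, c in order[4 * k:4 * (k + 1)]])
        layers.append(layer)
    return layers

def remaining_layers(layers, already_sent: int):
    """Progressive transmission: only the N layers after the M already sent."""
    return layers[already_sent:]
```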
As an alternative embodiment, fig. 2 is a component schematic diagram of an alternative codec system according to an embodiment of the present application, where the codec system includes the following modules, and the encoding end mainly includes the following modules 401 to 410:
An image acquisition module 401, configured to acquire a computer image; a change detection module 402, configured to determine whether each macroblock in the frame has changed compared with a reference frame, where the reference frame refers to the most recently encoded frame (usually the previous frame), which serves as a reference when the current frame is encoded; a DCT transform and quantization module 403, configured to implement the transform-and-quantization stage of macroblock coding; a per-layer data buffer 404, configured to store the transformed and quantized results of all macroblocks of the full frame; a code stream size prediction module 405, configured to predict the number of bits of each layer of a macroblock after final encoding; for example, taking the nth layer, if the data in the nth layer of the macroblock is not all 0, the number of bits of that layer after final encoding can be roughly estimated.
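A rough per-layer bit estimate of the kind module 405 could use is sketched below; the cost model (a fixed cost per non-zero coefficient plus a small cost per run boundary) is purely an illustrative assumption, since the patent only says that the bit number can be estimated approximately when a layer contains non-zero data.

```python
def predict_layer_bits(layer, bits_per_nonzero: float = 6.0, run_bits: float = 2.0):
    """Estimate the encoded size (in bits) of one macroblock layer."""
    flat = [c for block in layer for c in block]
    nonzero = sum(1 for c in flat if c != 0)
    if nonzero == 0:
        return 0.0                        # all-zero layers cost almost nothing
    runs = sum(1 for a, b in zip(flat, flat[1:]) if (a == 0) != (b == 0)) + 1
    return nonzero * bits_per_nonzero + runs * run_bits
```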
A network bandwidth real-time monitoring module 406, configured to provide current bandwidth information and send it, together with the bit number information of each layer of each macroblock, to the to-be-transmitted layer number determining module 407, which determines the number of layers for each macroblock; the to-be-transmitted layer number determining module 407 is further configured to acquire interest point information or interest area information sent by the interest area determining module 408. The interest area determining module 408 detects whether the screen keeps following cursor movement: if the mouse has moved within the last T frames (a dynamic value that can be set arbitrarily, e.g., 30), the current region of interest of the human eye is considered to be at the mouse cursor rather than at the keyboard typing cursor; otherwise, if the mouse has not moved and the keyboard cursor has changed within the last T frames, the interest point is considered to be located at the keyboard cursor.
Alternatively, the interest point of a user's region of interest in an office scenario typically appears at one of two types of cursor: the mouse cursor and the typing (keyboard) cursor. The above method can distinguish which cursor the user's interest point is actually at; hereinafter, "cursor" refers to the user interest point cursor specifically located by the interest area determining module 408, and the distinction between the mouse cursor and the keyboard cursor is not repeated.
In another optional embodiment, determining the number of layers to be transmitted for each of the plurality of macroblocks based on the position to be focused includes:
step S302, obtaining a first distance value between each macro block in the plurality of macro blocks and the macro block where the position to be noted is located;
step S304, dividing the plurality of macroblocks into a plurality of regions according to the first distance values;
step S306, determining the number of layers to be transmitted for each of the macroblocks based on the area where each of the macroblocks is located.
After the interest area determining module 408 determines the user's interest point, a first distance value between each macroblock and the macroblock where the position to be focused is located is calculated, i.e., the squared differences of the horizontal and vertical coordinates of the target macroblock and the macroblock containing the interest point are summed; the smaller the value, the closer the target macroblock is to the interest point. After all distance values are calculated, the macroblocks are divided into a plurality of regions, for example 5 regions, according to the first distance values, and the number of layers to be transmitted of each macroblock is determined based on the region in which it is located.
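A sketch of this distance computation and region split follows; the equal-width distance buckets are an assumption, as the patent fixes neither the bucketing rule nor the number of regions beyond the example of 5.

```python
def first_distance(mb_xy, focus_mb_xy):
    """Squared distance between a macroblock and the macroblock that
    contains the position to be focused (sum of squared coordinate differences)."""
    dx, dy = mb_xy[0] - focus_mb_xy[0], mb_xy[1] - focus_mb_xy[1]
    return dx * dx + dy * dy

def partition_into_regions(macroblock_coords, focus_mb_xy, num_regions: int = 5):
    """Group macroblocks into regions by their distance to the focus macroblock;
    regions[0] is the region closest to the position to be focused."""
    dists = {mb: first_distance(mb, focus_mb_xy) for mb in macroblock_coords}
    max_d = max(dists.values()) or 1
    regions = [[] for _ in range(num_regions)]
    for mb, d in dists.items():
        regions[min(num_regions - 1, int(num_regions * d / (max_d + 1)))].append(mb)
    return regions
```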
In an optional embodiment, determining the number of layers to be transmitted of each of the macroblocks based on the area where each of the macroblocks is located includes:
step S402, determining a second distance value between the area where each macro block is located and the position to be noted;
step S404, distributing corresponding bandwidth to be used for each area according to the size of the second distance value;
step S406, determining the number of layers to be transmitted for each macroblock based on the bandwidth to be used.
As an alternative embodiment, the bandwidth allocation of each group of interest regions may be calculated as follows. Region 1 is the region with the smallest distance from the interest point; assuming the total bandwidth is P, the bandwidth allocated to this region is P/2, i.e., half of the bandwidth. The code stream size prediction module 405 records which layers all macroblocks in the region still need to transmit. If all of those layers can be transmitted without using up the P/2 bandwidth, the remaining bandwidth is passed on to region 2; otherwise, the number of layers each macroblock in region 1 transmits when the P/2 bandwidth is used up is calculated. The bandwidth used by region 1 is then subtracted from P to obtain the remaining bandwidth, and the number of layers each macroblock in region 2 can transmit is calculated in the same way; and so on for all subsequent regions. If the bandwidth is exhausted at the Nth region, the layers after that are not transmitted.
In the embodiment of the present application, a weighted bandwidth allocation mechanism is thus formed: the closer the target region is to the interest point, the larger its allocated bandwidth quota, the more layers it can transmit, and the clearer the corresponding decoded result.
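The sketch below implements one reading of this cascade: the closest region is offered half of the total budget P and hands whatever it does not use down to the next region; the round-robin order in which macroblocks inside a region consume their share is an assumption.

```python
def allocate_layers(regions, layer_bits, total_bandwidth_p):
    """Weighted bandwidth allocation across interest regions.

    regions    : list of regions (lists of macroblock coordinates), closest first.
    layer_bits : macroblock -> estimated bit cost of each still-untransmitted
                 layer, lowest layer first.
    Returns a dict macroblock -> number of additional layers to transmit.
    """
    to_send = {mb: 0 for region in regions for mb in region}
    remaining = float(total_bandwidth_p)
    for i, region in enumerate(regions):
        budget = remaining / 2 if i < len(regions) - 1 else remaining
        pending = {mb: list(layer_bits.get(mb, [])) for mb in region}
        spent, progress = 0.0, True
        while progress:
            progress = False
            for mb in region:                    # one layer per macroblock per pass
                if pending[mb] and spent + pending[mb][0] <= budget:
                    spent += pending[mb].pop(0)
                    to_send[mb] += 1
                    progress = True
        remaining -= spent
        if remaining <= 0:
            break
    return to_send
```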
As an alternative embodiment, fig. 3 is a flowchart of an alternative image data transmission method according to an embodiment of the present invention, and as shown in fig. 3, the method for processing image data at an alternative encoding end includes:
step S501: acquiring original image data of a current frame;
step S502: comparing the original image data of the current frame with the reference frame data to obtain a comparison result;
step S503: determining changed macro blocks and unchanged macro blocks according to the comparison result;
wherein the changed macroblocks require DCT transformation and quantization in step S504, while a macroblock that has not changed jumps directly to the interest-region division of step S506, because it was already DCT-transformed and quantized in a previous frame.
Step S504: performing DCT transformation and quantization processing on the changed macro blocks;
step S505: caching the transformed and quantized data;
step S506: judging which cursor is the interest point according to the dynamic state of a mouse and keyboard cursor, and dividing the full-frame macro block into interest areas according to the position of the macro block where the interest point is located;
step S507: determining the number of layers to be transmitted of each macro block;
the interest zone closest to the interest point has the largest weight for allocating bandwidth, that is, the number of layers allowed to be transmitted is the largest, and the opposite interest zone farthest from the interest point has the smallest weight, the allocated bandwidth proportion is the smallest, and the number of layers allowed to be transmitted is the smallest.
Step S508: acquiring data of each macro block from a data buffer area to perform entropy coding processing;
step S509: and forming a final code stream based on the entropy coding result and transmitting the final code stream.
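Putting the pieces together, the sketch below walks through steps S501 to S509 by composing the helper functions from the earlier sketches (position_to_focus, macroblock_layers, predict_layer_bits, partition_into_regions, allocate_layers, and run_length_encode from the entropy-coding sketch further below). The change-detection and DCT/quantization stages are injected as callables, and every name here is an illustrative assumption rather than the patent's own API.

```python
def encode_frame(frame_macroblocks, reference, data_buffer, sent_layers,
                 cursor_history, bandwidth_p, *, changed, dct_quantize):
    """One encoder pass over a frame (steps S501-S509).

    frame_macroblocks : {(mx, my): pixel block} for the current frame (S501)
    reference         : same mapping for the reference frame
    data_buffer       : {(mx, my): 16 layers} kept across frames (S505)
    sent_layers       : {(mx, my): number of layers already transmitted}
    changed, dct_quantize : injected change-detection and DCT/quantization stages
    """
    # S502-S505: transform, quantize and buffer only the changed macroblocks.
    for mb, pixels in frame_macroblocks.items():
        if changed(pixels, reference.get(mb)):
            data_buffer[mb] = macroblock_layers(dct_quantize(pixels))
            sent_layers[mb] = 0
    # S506: locate the position to be focused and divide the frame into regions.
    focus = position_to_focus(cursor_history) or (0, 0)    # fall back when idle
    focus_mb = (focus[0] // 16, focus[1] // 16)             # pixel -> macroblock
    regions = partition_into_regions(list(frame_macroblocks), focus_mb)
    # S507: budget how many extra layers each macroblock may transmit.
    layer_bits = {mb: [predict_layer_bits(layer)
                       for layer in data_buffer[mb][sent_layers.get(mb, 0):]]
                  for mb in frame_macroblocks if mb in data_buffer}
    extra = allocate_layers(regions, layer_bits, bandwidth_p)
    # S508-S509: entropy-code the selected layers into the output code stream.
    stream = []
    for mb, n in extra.items():
        if n == 0 or mb not in data_buffer:
            continue
        start = sent_layers.get(mb, 0)
        for layer in data_buffer[mb][start:start + n]:
            stream.append((mb, run_length_encode([c for blk in layer for c in blk])))
        sent_layers[mb] = start + n
    return stream
```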
In an optional embodiment, before acquiring the position to be focused in the raw image data, the method further includes:
step S602, comparing the original image data with reference frame image data to determine changed macro blocks;
step S604, transform and quantization process are carried out to the changed macro block to obtain a processing result;
step S606, store the above processing result in the data buffer.
In an optional embodiment, the generating a code stream to be transmitted according to the number of layers to be transmitted of each macroblock in the plurality of macroblocks includes:
step S702, acquiring data to be transmitted from the data buffer according to the number of layers to be transmitted of each macro block in the plurality of macro blocks;
step S704, performing encoding processing on the data to be transmitted, and generating the code stream to be transmitted.
As shown in fig. 2, the entropy coding module 409 is configured to entropy-encode the layers of data determined by the to-be-transmitted layer number determining module 407; run-length coding and Huffman coding may be used in the present application. The transmission module 410 is configured to transmit the code stream generated by the entropy coding module 409 to the decoding end.
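As a minimal illustration of the run-length stage (the Huffman stage would then code the resulting pairs), the sketch below emits (zero-run, value) pairs for one layer's quantized coefficients; the exact symbol format is an assumption, since the patent only names run-length and Huffman coding.

```python
def run_length_encode(coefficients):
    """Zero-run-length encode one layer: each non-zero value is emitted as a
    (number_of_preceding_zeros, value) pair, with a trailing end marker."""
    pairs, zeros = [], 0
    for c in coefficients:
        if c == 0:
            zeros += 1
        else:
            pairs.append((zeros, c))
            zeros = 0
    pairs.append((zeros, 0))   # end-of-layer marker carrying the trailing zeros
    return pairs

# run_length_encode([5, 0, 0, -1, 0, 0, 0, 2]) -> [(0, 5), (2, -1), (3, 2), (0, 0)]
```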
The embodiment of the application provides a region-of-interest-based progressive image encoding and decoding method. The encoding end reasonably judges the interest point the user is focusing on according to the position of the user's keyboard and mouse cursors, and then divides the user's interest regions; the key interest region is matched with more bandwidth and allowed to transmit more layers, while other, non-key interest regions transmit fewer layers. This ensures both the continuity of image transmission under limited bandwidth and the display quality of the user's interest region, and saves the computation load traditionally incurred by calculating interest regions with complex statistical methods and image-processing algorithms, making the scheme particularly suitable for office scenes.
According to an embodiment of the present invention, another embodiment of a method for transmitting image data is provided. It should be noted that the steps shown in the flowchart of the figure may be executed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from that described here.
Fig. 4 is a flowchart of another image data transmission method according to an embodiment of the present invention, as shown in fig. 4, the method including the steps of:
step S802, receiving a code stream to be reconstructed from a coding end, wherein the code stream to be reconstructed is generated based on original image data of a current frame collected by the coding end, the original image data comprises a plurality of macro blocks, and the number of layers to be transmitted of each macro block in the macro blocks is determined based on a position to be concerned in the original image data;
step S804, reconstructing the code stream to be reconstructed, and generating image data to be played.
In the embodiment of the invention, a code stream to be reconstructed is received from the encoding end, where the code stream is generated based on original image data of a current frame acquired by the encoding end, the original image data comprises a plurality of macroblocks, and the number of layers to be transmitted of each macroblock is determined based on the position to be focused in the original image data; the code stream to be reconstructed is then reconstructed to generate image data to be played. This achieves the purpose of ensuring the definition of the user's region of interest when transmitting image data in a progressive encoding and decoding scenario, realizes the technical effect of improving the processing efficiency of progressive encoding and decoding of image data, and solves the technical problem that the definition of the user's region of interest in the image data cannot be ensured when transmitting image data with the progressive encoding and decoding scheme in the related art.
As also shown in fig. 2, the decoding end mainly includes the following modules: a transmission module 411, configured to receive code stream data; an entropy decoding module 412, configured to perform on the code stream data the decoding corresponding to the entropy coding at the encoding end; a per-layer data buffer 413, configured to store each layer of data obtained when decoding the previous frame's code stream; if further layer data is received, it is superimposed on the previous frame's layer data to form the progressive effect, and each incoming frame's layer data is cached in the per-layer data buffer 413; an inverse quantization and inverse DCT transform module 414, configured to obtain reconstructed data, after the new layer data has been superimposed on the previous layers, through inverse quantization and inverse DCT transform; and a reconstructed frame module 415, configured to combine the reconstructed data of all macroblocks into a complete reconstructed frame, completing the decoding process.
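The decoder-side superposition can be sketched as follows. The flat quantization step and the way buffered layers map back onto an 8x8 coefficient block are illustrative assumptions; the patent only fixes the order of superposition, inverse quantization and inverse DCT (here scipy's idctn stands in for the inverse transform).

```python
import numpy as np
from scipy.fft import idctn   # inverse 2-D DCT used as the inverse transform

def reconstruct_subblock(layer_buffer, new_layers, quant_step: float = 16.0):
    """Superimpose newly received layer contributions on the buffered ones
    (per-layer data buffer 413), then inverse-quantize and inverse-transform
    one 8x8 sub-block (modules 414/415 combine such blocks into the frame)."""
    layer_buffer.update(new_layers)          # progressive superposition
    coeffs = np.zeros((8, 8))
    for contribution in layer_buffer.values():
        coeffs += contribution               # each value: an 8x8 coefficient array
    dequantized = coeffs * quant_step        # inverse quantization
    return idctn(dequantized, norm="ortho")  # reconstructed pixel block
```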
Optionally, the original image data may be video image data, and the original image data includes: a plurality of macroblocks.
On the basis of the current progressive layered-transmission scheme, the concept of a user interest area is introduced: the position to be focused in the original image data is acquired, and the number of layers to be transmitted of each of the macroblocks is determined based on the position to be focused. Under the same bandwidth limitation, the layer data of macroblocks with large weight at the position to be focused is transmitted preferentially, and the macroblocks at the position to be focused are transmitted as completely as possible; the regions are prioritized according to their distance from the position to be focused: the closer a macroblock is to the position to be focused, the higher its priority and the larger the bandwidth allocated to it, while the farther it is from the position to be focused, the lower its priority and the less bandwidth is allocated.
In the embodiment of the application, a code stream to be transmitted is generated according to the number of layers to be transmitted of each macroblock, and the code stream is transmitted to the decoding end according to the bandwidth condition. This retains the advantages of the existing scheme while improving the definition of the user's region of interest when bandwidth is limited, and is particularly suitable for office scenes with high definition requirements for the attention area.
As an alternative embodiment, the original image data (e.g., a video picture) captured before encoding can be taken as an uncoded, lossless reference picture for comparison with frames in the blur-to-clear progression; in the reference picture, a small control window is shown and its time stamp can be clearly distinguished. Since most of the full frame changes, the degree of blurring during the progression is substantially uniform across the full frame; a person's face displayed in the picture, for example, still appears acceptably clear, which is acceptable if the current user is simply watching the video online. However, when the original image data contains display information that is small but critical (e.g., a control tool button or time display information), this critical information cannot be clearly displayed; for example, the time stamp cannot be clearly shown. If the current user needs to perform operations such as image acquisition and image clipping according to the time stamp, the user's definition requirement for the attention area can hardly be met in this situation.
Through the embodiment of the application, the position to be focused in the original image data is obtained. For example, when the user's mouse is detected to be located in a certain control area of a frame of original image data, a preset-radius area centered on the mouse is determined to be the area the user is most interested in, i.e., the position to be focused, and the transmission data layers of the position to be focused have the highest priority, so that the time stamp and the buttons around the position to be focused are displayed clearly, while areas farther from the position to be focused are displayed in a blurred manner.
It should be noted that many similar situations occur in real office scenes. Since the user's focus of attention during normal work is usually at the cursor position, the scheme of the present application takes the position of the mouse or keyboard cursor as the center of the user's region of interest, which matches the actual situation of office scenes. Most operating systems support querying the cursor position, so the computing resources otherwise consumed by image recognition to locate the cursor can be saved.
Example 2
According to an embodiment of the present invention, there is further provided an embodiment of an apparatus for implementing the method for transmitting image data, and fig. 5 is a schematic structural diagram of an apparatus for transmitting image data according to an embodiment of the present invention, and as shown in fig. 5, the apparatus for transmitting image data includes: an acquisition module 500, an acquisition module 502, a determination module 504, and a processing module 506, wherein:
an acquiring module 500, configured to acquire original image data of a current frame, where the original image data includes: a plurality of macroblocks; an obtaining module 502, configured to obtain a position to be focused in the original image data; a determining module 504, configured to determine, based on the to-be-focused position, a to-be-transmitted layer number of each macroblock in the multiple macroblocks; the processing module 506 is configured to generate a code stream to be transmitted according to the number of layers to be transmitted of each macro block in the plurality of macro blocks, and transmit the code stream to be transmitted to a decoding end.
It should be noted here that the above-mentioned acquisition module 500, acquisition module 502, determination module 504 and processing module 506 correspond to steps S102 to S108 in embodiment 1, and the above-mentioned modules are the same as the examples and application scenarios realized by the corresponding steps, but are not limited to the disclosure in embodiment 1. It should be noted that the modules described above may be implemented in a computer terminal as part of an apparatus.
According to an embodiment of the present invention, there is further provided an embodiment of an apparatus for implementing the method for transmitting image data, fig. 6 is a schematic structural diagram of an apparatus for transmitting image data according to an embodiment of the present invention, and as shown in fig. 6, the apparatus for transmitting image data includes: a receiving module 600 and a processing module 602, wherein:
a receiving module 600, configured to receive a code stream to be reconstructed from an encoding end, where the code stream to be reconstructed is generated based on original image data of a current frame acquired by the encoding end, the original image data includes a plurality of macro blocks, and a number of layers to be transmitted of each macro block in the plurality of macro blocks is determined based on a position to be focused in the original image data; the processing module 602 is configured to reconstruct the code stream to be reconstructed, and generate image data to be played.
It should be noted that the above modules may be implemented by software or hardware, for example, for the latter, the following may be implemented: the modules can be located in the same processor; alternatively, the modules may be located in different processors in any combination.
It should be noted here that the receiving module 600 and the processing module 602 correspond to steps S802 to S804 in embodiment 1, and the modules are the same as the examples and application scenarios realized by the corresponding steps, but are not limited to the disclosure in embodiment 1. It should be noted that the modules described above may be implemented in a computer terminal as part of an apparatus.
It should be noted that, reference may be made to the relevant description in embodiment 1 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The above-mentioned image data transmission device may further include a processor and a memory, and the above-mentioned acquisition module 500, the acquisition module 502, the determination module 504 and the processing module 506, the receiving module 600 and the processing module 602, etc. are all stored in the memory as program units, and the processor executes the above-mentioned program units stored in the memory to implement corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory; one or more kernels may be provided. The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
According to the embodiment of the application, an embodiment of a nonvolatile storage medium is also provided. Optionally, in this embodiment, the nonvolatile storage medium includes a stored program, and the apparatus in which the nonvolatile storage medium is located is controlled to execute any one of the above image data transmission methods when the program runs.
Optionally, in this embodiment, the nonvolatile storage medium may be located in any one of a group of computer terminals in a computer network, or in any one of a group of mobile terminals, and the nonvolatile storage medium includes a stored program.
Optionally, when the program is executed, the apparatus in which the non-volatile storage medium is located is controlled to perform the following functions: acquiring original image data of a current frame, wherein the original image data comprises: a plurality of macroblocks; acquiring a position to be focused in the original image data; determining the number of layers to be transmitted of each macro block in the plurality of macro blocks based on the position to be concerned; and generating a code stream to be transmitted according to the number of layers to be transmitted of each macro block in the plurality of macro blocks, and transmitting the code stream to be transmitted to a decoding end.
Optionally, when the program is executed, the apparatus in which the non-volatile storage medium is located is controlled to perform the following functions: detecting whether a mouse cursor and/or a keyboard cursor continuously moves in a continuously played multi-frame historical image; when the mouse cursor continuously moves and the keyboard cursor does not move, determining the position to be focused on based on the moving range of the mouse cursor; and when the mouse cursor is not moved and the keyboard cursor is continuously moved, determining the position to be focused on based on the moving range of the keyboard cursor.
Optionally, when the program is executed, the apparatus in which the non-volatile storage medium is located is controlled to perform the following functions: acquiring a first distance value between each macro block in the plurality of macro blocks and the macro block where the position to be noted is located; dividing the plurality of macroblocks into a plurality of regions according to the first distance values; and determining the number of layers to be transmitted of each macro block based on the area where each macro block is located.
Optionally, when the program is executed, the apparatus in which the non-volatile storage medium is located is controlled to perform the following functions: determining a second distance value between the area where each macro block is located and the position to be noted; allocating a corresponding bandwidth to be used for each area according to the size of the second distance value; and determining the number of layers to be transmitted of each macro block based on the bandwidth to be used.
Optionally, when the program is executed, the apparatus in which the non-volatile storage medium is located is controlled to perform the following functions: comparing the original image data with reference frame image data to determine changed macro blocks; transforming and quantizing the changed macro blocks to obtain processing results; and storing the processing result to a data buffer area.
Optionally, when the program is executed, the apparatus in which the non-volatile storage medium is located is controlled to perform the following functions: acquiring data to be transmitted from the data buffer according to the number of layers to be transmitted of each macro block in the plurality of macro blocks; and coding the data to be transmitted to generate the code stream to be transmitted.
Optionally, when the program is executed, the apparatus in which the non-volatile storage medium is located is controlled to perform the following functions: receiving a code stream to be reconstructed from a coding end, wherein the code stream to be reconstructed is generated based on original image data of a current frame acquired by the coding end, the original image data comprises a plurality of macro blocks, and the number of layers to be transmitted of each macro block in the macro blocks is determined based on a position to be concerned in the original image data; and reconstructing the code stream to be reconstructed to generate image data to be played.
According to the embodiment of the application, the embodiment of the processor is also provided. Optionally, in this embodiment, the processor is configured to execute a program, where the program executes the method for transmitting the image data.
According to an embodiment of the present application, there is also provided an electronic apparatus including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above image data transmission methods.
According to an embodiment of the present application, there is also provided a computer program product embodiment, which, when executed on a data processing device, is adapted to execute a program that initializes the steps of the transmission method of image data of any of the above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division of the units is only a logical functional division, and there may be other divisions in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable non-volatile storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a non-volatile storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned non-volatile storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention.
Claims (12)
1. A method for transmitting image data, comprising:
acquiring original image data of a current frame, wherein the original image data comprises: a plurality of macroblocks;
acquiring a position to be focused on in the original image data;
determining the number of layers to be transmitted of each macroblock in the plurality of macroblocks based on the position to be focused on;
and generating a code stream to be transmitted according to the number of layers to be transmitted of each macroblock in the plurality of macroblocks, and transmitting the code stream to be transmitted to a decoding end.
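Taken together, the four steps of claim 1 amount to an encoder-side pipeline along the lines of the sketch below, which simply chains the illustrative helpers from the description above (process_changed_macroblocks, layers_per_macroblock, build_code_stream and the assumed macroblock size MB, all of which are assumptions of those sketches); it is a composition of those sketches, not the claimed implementation itself.

```python
def encode_and_send(frame, ref_frame, focus_pos, send):
    """Encoder-side flow: take the current frame, locate the position to be
    focused on, decide layers per macroblock, then build and send the stream."""
    buffer = process_changed_macroblocks(frame, ref_frame)
    # centers of the changed macroblocks, in the same order as the buffer keys
    mb_centers = [((c + 0.5) * MB, (r + 0.5) * MB) for r, c in buffer]
    layers = layers_per_macroblock(mb_centers, focus_pos)
    layers_per_mb = dict(zip(buffer.keys(), layers))
    code_stream = build_code_stream(buffer, layers_per_mb)
    send(code_stream)                      # transmit to the decoding end
    return code_stream
```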
2. The transmission method according to claim 1, wherein acquiring the position to be focused on in the original image data comprises:
detecting whether a mouse cursor and/or a keyboard cursor moves continuously in continuously played multiple frames of historical images;
when the mouse cursor moves continuously and the keyboard cursor does not move, determining the position to be focused on based on the movement range of the mouse cursor;
and when the mouse cursor does not move and the keyboard cursor moves continuously, determining the position to be focused on based on the movement range of the keyboard cursor.
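One rough way to prototype the cursor test in this claim is shown below, tracking per-frame cursor positions over the recent history; the bounding-box-center rule for turning a movement range into a single focus position is an assumption of this sketch, not the claimed implementation.

```python
def focus_from_cursors(mouse_history, keyboard_history):
    """Pick the position to be focused on from cursor movement across the
    continuously played multi-frame history; each history is a list of
    (x, y) cursor positions, one entry per frame."""
    def moved(history):
        return len(set(history)) > 1          # more than one distinct position

    def movement_center(history):
        xs = [x for x, _ in history]
        ys = [y for _, y in history]
        # center of the cursor's movement range (its bounding box)
        return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)

    if moved(mouse_history) and not moved(keyboard_history):
        return movement_center(mouse_history)
    if moved(keyboard_history) and not moved(mouse_history):
        return movement_center(keyboard_history)
    return None   # no clear focus position in this history window
```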
3. The transmission method according to claim 1, wherein determining the number of layers to be transmitted of each macroblock in the plurality of macroblocks based on the position to be focused on comprises:
acquiring a first distance value between each macroblock in the plurality of macroblocks and the macroblock where the position to be focused on is located;
dividing the plurality of macroblocks into a plurality of regions according to the first distance values;
and determining the number of layers to be transmitted of each macroblock based on the region in which the macroblock is located.
4. The transmission method according to claim 3, wherein determining the number of layers to be transmitted of each macroblock based on the region in which the macroblock is located comprises:
determining a second distance value between the region in which each macroblock is located and the position to be focused on;
allocating a corresponding bandwidth to be used to each region according to the magnitude of the second distance value;
and determining the number of layers to be transmitted of each macroblock based on the bandwidth to be used.
5. The transmission method according to claim 1, wherein before acquiring the position to be focused on in the original image data, the method further comprises:
comparing the original image data with reference frame image data to determine changed macroblocks;
transforming and quantizing the changed macroblocks to obtain processing results;
and storing the processing results in a data buffer.
6. The transmission method according to claim 5, wherein generating the code stream to be transmitted according to the number of layers to be transmitted of each of the plurality of macroblocks comprises:
acquiring data to be transmitted from the data buffer according to the number of layers to be transmitted of each macroblock in the plurality of macroblocks;
and encoding the data to be transmitted to generate the code stream to be transmitted.
7. A method for transmitting image data, comprising:
receiving a code stream to be reconstructed from an encoding end, wherein the code stream to be reconstructed is generated based on original image data of a current frame acquired by the encoding end, the original image data comprises a plurality of macroblocks, and the number of layers to be transmitted of each macroblock in the plurality of macroblocks is determined based on a position to be focused on in the original image data;
and reconstructing the code stream to be reconstructed to generate image data to be played.
8. An apparatus for transmitting image data, comprising:
an acquisition module, configured to acquire original image data of a current frame, where the original image data includes: a plurality of macroblocks;
the acquisition module is further configured to acquire a position to be focused on in the original image data;
a determining module, configured to determine, based on the position to be focused on, the number of layers to be transmitted of each macroblock in the plurality of macroblocks;
and a processing module, configured to generate a code stream to be transmitted according to the number of layers to be transmitted of each macroblock in the plurality of macroblocks, and to transmit the code stream to be transmitted to a decoding end.
9. An apparatus for transmitting image data, comprising:
a receiving module, configured to receive a code stream to be reconstructed from an encoding end, wherein the code stream to be reconstructed is generated based on original image data of a current frame acquired by the encoding end, the original image data comprises a plurality of macroblocks, and the number of layers to be transmitted of each macroblock in the plurality of macroblocks is determined based on a position to be focused on in the original image data;
and a processing module, configured to reconstruct the code stream to be reconstructed to generate image data to be played.
10. A non-volatile storage medium storing a plurality of instructions, wherein the instructions are adapted to be loaded by a processor to perform the image data transmission method according to any one of claims 1 to 7.
11. A processor for running a program, wherein the program is arranged to perform the method of image data transmission according to any one of claims 1 to 7 when running.
12. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the method of transmitting image data according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011173524.2A CN112351282A (en) | 2020-10-28 | 2020-10-28 | Image data transmission method and device, nonvolatile storage medium and processor |
Publications (1)
Publication Number | Publication Date
---|---
CN112351282A (en) | 2021-02-09
Family ID: 74359270
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202011173524.2A (CN112351282A, Pending) | Image data transmission method and device, nonvolatile storage medium and processor | 2020-10-28 | 2020-10-28
Country Status (1)
Country | Link
---|---
CN (1) | CN112351282A (en)
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113766274A (en) * | 2021-09-23 | 2021-12-07 | 阿里云计算有限公司 | Image encoding method, image decoding method, electronic device, and storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101340321A (en) * | 2008-08-28 | 2009-01-07 | 北京中星微电子有限公司 | Solution of adaptive bandwidth in real-time monitoring system |
CN101588496A (en) * | 2008-05-20 | 2009-11-25 | 吴平 | A kind of communication system of the finely-coded local regions based on video content |
CN102855059A (en) * | 2012-08-21 | 2013-01-02 | 东莞宇龙通信科技有限公司 | Terminal and information sharing method |
CN105247855A (en) * | 2014-04-29 | 2016-01-13 | 华为技术有限公司 | Screen sharing method, device and system |
CN106131670A (en) * | 2016-07-12 | 2016-11-16 | 块互动(北京)科技有限公司 | A kind of adaptive video coding method and terminal |
US20170180758A1 (en) * | 2015-12-22 | 2017-06-22 | Vallabhajosyula S. Somayazulu | Tiled Wireless Display |
CN206283618U (en) * | 2016-07-26 | 2017-06-27 | 公安部第一研究所 | Spatial scalable coding device based on area-of-interest |
CN110035289A (en) * | 2019-04-24 | 2019-07-19 | 润电能源科学技术有限公司 | A kind of layered compression method of screen picture, system and relevant apparatus |
CN110555854A (en) * | 2011-06-22 | 2019-12-10 | 皇家飞利浦有限公司 | System and method for processing medical images |
CN110572656A (en) * | 2019-09-19 | 2019-12-13 | 北京视博云科技有限公司 | coding method, image processing method, device and system |
CN111464811A (en) * | 2020-04-09 | 2020-07-28 | 西安万像电子科技有限公司 | Image processing method, device and system |
CN111556318A (en) * | 2020-04-24 | 2020-08-18 | 西安万像电子科技有限公司 | Data transmission method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111010495B (en) | Video denoising processing method and device | |
CN112383777B (en) | Video encoding method, video encoding device, electronic equipment and storage medium | |
CN110166771B (en) | Video encoding method, video encoding device, computer equipment and storage medium | |
US20120307904A1 (en) | Partial frame utilization in video codecs | |
TW201309040A (en) | Adaptive configuration of reference frame buffer based on camera and background motion | |
US20120195376A1 (en) | Display quality in a variable resolution video coder/decoder system | |
KR20140110008A (en) | Object detection informed encoding | |
CN111182303A (en) | Encoding method and device for shared screen, computer readable medium and electronic equipment | |
CN112584119B (en) | Self-adaptive panoramic video transmission method and system based on reinforcement learning | |
US10812832B2 (en) | Efficient still image coding with video compression techniques | |
CN1914925A (en) | Image compression for transmission over mobile networks | |
US20120195364A1 (en) | Dynamic mode search order control for a video encoder | |
CN114222127A (en) | Video coding method, video decoding method and device | |
US10536726B2 (en) | Pixel patch collection for prediction in video coding system | |
CN111182310A (en) | Video processing method and device, computer readable medium and electronic equipment | |
CN112351282A (en) | Image data transmission method and device, nonvolatile storage medium and processor | |
CN112218087B (en) | Image encoding and decoding method, encoding and decoding device, encoder and decoder | |
CN113852816A (en) | Video frame processing method and device, computer equipment and readable storage medium | |
US10735773B2 (en) | Video coding techniques for high quality coding of low motion content | |
CN113366842B (en) | System and method for content layer based video compression | |
CN116847087A (en) | Video processing method and device, storage medium and electronic equipment | |
CN113810692B (en) | Method for framing changes and movements, image processing device and program product | |
CN115834906A (en) | Video encoding and decoding method and device, electronic equipment and medium | |
CN111212288B (en) | Video data encoding and decoding method and device, computer equipment and storage medium | |
CN111193926B (en) | Encoded data processing method, apparatus, computer device and storage medium |
Legal Events
Code | Title | Description
---|---|---
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210209